Commit 5cf7da2a authored by JiabinYang

refine the comments and result format for book 02.recognize_digits

Parent 951563a6
@@ -50,7 +50,7 @@ For an $N$-class classification problem with $N$ output nodes, Softmax normalize
In such a classification problem, we usually use the cross entropy loss function:
-$$ \text{crossentropy}(label, y) = -\sum_i label_i \log(y_i) $$
+$$ L_{\text{cross-entropy}}(label, y) = -\sum_i label_i \log(y_i) $$
Fig. 2 illustrates a softmax regression network, with the weights in blue, and the bias in red. `+1` indicates that the bias is $1$.
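To make the renamed loss concrete, here is a minimal NumPy sketch of the equation above; the one-hot `label` and the softmax output `y` are illustrative values, not taken from the book:

```python
import numpy as np

# Minimal sketch of the cross-entropy loss above (illustrative values):
# `label` is a one-hot ground-truth vector and `y` is a softmax output,
# so only the true class contributes to the sum.
label = np.array([0.0, 0.0, 1.0])   # one-hot ground truth: class 2
y = np.array([0.1, 0.2, 0.7])       # softmax probabilities
loss = -np.sum(label * np.log(y))   # = -log(0.7), roughly 0.357
print(loss)
```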
@@ -432,7 +432,7 @@ Now we are ready to do inference.
```python
results = inferencer.infer({'img': img})
lab = np.argsort(results) # probs and lab are the results of one batch data
print "Label of image/infer_3.png is: %d" % lab[0][0][-1]
print "Inference result of image/infer_3.png is: %d" % lab[0][0][-1]
```
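The indexing in the changed line deserves a note: `np.argsort` returns class indices ordered by ascending probability, so the last entry is the most likely digit. Below is a self-contained sketch with a made-up probability vector (the book's `results` carries an extra batch/list dimension, hence `lab[0][0][-1]` there):

```python
import numpy as np

# Why the last argsort index is the prediction (made-up probabilities):
probs = np.array([[0.01, 0.02, 0.05, 0.80, 0.02,
                   0.03, 0.02, 0.02, 0.02, 0.01]])  # one image, 10 classes
lab = np.argsort(probs)   # class indices sorted by ascending probability
print(lab[0][-1])         # -> 3, the index of the largest probability
```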
@@ -92,7 +92,7 @@ For an $N$-class classification problem with $N$ output nodes, Softmax normalize
In such a classification problem, we usually use the cross entropy loss function:
-$$ \text{crossentropy}(label, y) = -\sum_i label_i \log(y_i) $$
+$$ L_{\text{cross-entropy}}(label, y) = -\sum_i label_i \log(y_i) $$
Fig. 2 illustrates a softmax regression network, with the weights in blue, and the bias in red. `+1` indicates that the bias is $1$.
@@ -474,7 +474,7 @@ Now we are ready to do inference.
```python
results = inferencer.infer({'img': img})
lab = np.argsort(results) # probs and lab are the results of one batch data
print "Label of image/infer_3.png is: %d" % lab[0][0][-1]
print "Inference result of image/infer_3.png is: %d" % lab[0][0][-1]
```