diff --git a/02.recognize_digits/README.md b/02.recognize_digits/README.md
index 412c7ffc12690439cf156e753b429ea30487485f..b547e417fc486870be206632b33c5e4dfacf7a90 100644
--- a/02.recognize_digits/README.md
+++ b/02.recognize_digits/README.md
@@ -50,7 +50,7 @@ For an $N$-class classification problem with $N$ output nodes, Softmax normalize
In such a classification problem, we usually use the cross entropy loss function:
-$$ \text{crossentropy}(label, y) = -\sum_i label_ilog(y_i) $$
+$$ \text{cross-entropy}(label, y) = -\sum_i label_i \log(y_i) $$
Fig. 2 illustrates a softmax regression network, with the weights in blue, and the bias in red. `+1` indicates that the bias is $1$.
@@ -432,7 +432,7 @@ Now we are ready to do inference.
```python
results = inferencer.infer({'img': img})
lab = np.argsort(results) # probs and lab are the results of one batch data
-print "Label of image/infer_3.png is: %d" % lab[0][0][-1]
+print "Inference result of image/infer_3.png is: %d" % lab[0][0][-1]
```
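As a quick sanity check on the renamed formula, here is a minimal NumPy sketch of the cross-entropy above; the `cross_entropy` helper and its `eps` smoothing term are illustrative and not part of the chapter's code:

```python
import numpy as np

def cross_entropy(label, y, eps=1e-12):
    # label: one-hot ground-truth vector; y: softmax output probabilities.
    # eps guards against log(0) when a probability is numerically zero.
    return -np.sum(label * np.log(y + eps))

y = np.array([0.7, 0.2, 0.1])      # softmax output for a 3-class toy example
label = np.array([1.0, 0.0, 0.0])  # one-hot ground truth
print(cross_entropy(label, y))     # -log(0.7) ~= 0.357
```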
diff --git a/02.recognize_digits/index.html b/02.recognize_digits/index.html
index 2102b759380116b3f3e487f01a91fc476ba75330..9f0d51fddb394b9bc1909920a478e399d275b0dd 100644
--- a/02.recognize_digits/index.html
+++ b/02.recognize_digits/index.html
@@ -92,7 +92,7 @@ For an $N$-class classification problem with $N$ output nodes, Softmax normalize
In such a classification problem, we usually use the cross entropy loss function:
-$$ \text{crossentropy}(label, y) = -\sum_i label_ilog(y_i) $$
+$$ \text{cross-entropy}(label, y) = -\sum_i label_i \log(y_i) $$
Fig. 2 illustrates a softmax regression network, with the weights in blue, and the bias in red. `+1` indicates that the bias is $1$.
@@ -474,7 +474,7 @@ Now we are ready to do inference.
```python
results = inferencer.infer({'img': img})
lab = np.argsort(results) # probs and lab are the results of one batch data
-print "Label of image/infer_3.png is: %d" % lab[0][0][-1]
+print "Inference result of image/infer_3.png is: %d" % lab[0][0][-1]
```
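To see why `lab[0][0][-1]` is the predicted digit: `np.argsort` returns indices sorted in ascending order of probability, so the last index along the final axis belongs to the most probable class. A self-contained sketch with a dummy probability tensor; the `[n_outputs][batch][n_classes]` shape is an assumption based on the snippet above:

```python
import numpy as np

# Dummy stand-in for inferencer.infer({'img': img}); the real call returns
# softmax probabilities, assumed here to be shaped [n_outputs][batch][n_classes].
results = np.array([[[0.01, 0.02, 0.05, 0.80, 0.02,
                      0.02, 0.02, 0.02, 0.02, 0.02]]])
lab = np.argsort(results)  # indices sorted by ascending probability
print(lab[0][0][-1])       # index of the largest probability -> prints 3
```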