diff --git a/recognize_digits/README.en.md b/recognize_digits/README.en.md
index 335796383656b6e74200d8523e363dae1d07c67e..3c82ab1efdd2ab9fbb52928a4f8db370270e703a 100644
--- a/recognize_digits/README.en.md
+++ b/recognize_digits/README.en.md
@@ -42,7 +42,7 @@ In such a classification problem, we usually use the cross entropy loss function
 
 $$  crossentropy(label, y) = -\sum_i label_ilog(y_i)  $$
 
-Fig. 2 shows a softmax regression network, with weights in black, and bias in red. +1 indicates bias is 1.
+Fig. 2 shows a softmax regression network, with weights in blue, and bias in red. +1 indicates bias is 1.
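The two formulas quoted in this hunk's context lines can be sketched in NumPy (a minimal sketch, not the PaddlePaddle implementation; the logit values and the class index 4 are made up for illustration):

```python
import numpy as np

def softmax(z):
    # Subtract the max before exponentiating for numerical stability;
    # softmax is invariant to a constant shift of its input.
    e = np.exp(z - z.max())
    return e / e.sum()

def cross_entropy(label, y):
    # crossentropy(label, y) = -sum_i label_i * log(y_i)
    return -np.sum(label * np.log(y))

# Toy 10-class example: hypothetical logits for one digit image.
logits = np.array([0.5, 1.2, -0.3, 0.0, 2.0, -1.0, 0.1, 0.4, -0.5, 0.8])
y = softmax(logits)

# One-hot label saying the true digit is 4.
label = np.zeros(10)
label[4] = 1.0

loss = cross_entropy(label, y)  # for a one-hot label this is -log(y[4])
```

With a one-hot `label`, the sum collapses to a single term, so the loss is just the negative log-probability the model assigns to the true class.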


@@ -57,7 +57,7 @@ The Softmax regression model described above uses the simplest two-layer neural
 2. After the second hidden layer, we get $ H_2 = \phi(W_2H_1 + b_2) $.
 3. Finally, after output layer, we get $Y=softmax(W_3H_2 + b_3)$, the final classification result vector.
 
-Fig. 3. is Multilayer Perceptron network, with weights in black, and bias in red. +1 indicates bias is 1.
+Fig. 3. is Multilayer Perceptron network, with weights in blue, and bias in red. +1 indicates bias is 1.
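The forward pass listed in this hunk's context ($H_1$, $H_2$, then $Y$) can be sketched in NumPy as follows. The layer sizes, the choice of $\phi = \tanh$, and the random initialization are all illustrative assumptions, not the tutorial's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

phi = np.tanh  # hidden activation; the text leaves phi generic

# Hypothetical sizes: 784-pixel input, two hidden layers, 10 classes.
sizes = [784, 128, 64, 10]
W1, W2, W3 = (rng.standard_normal((n, m)) * 0.01
              for m, n in zip(sizes, sizes[1:]))
b1, b2, b3 = (np.zeros(n) for n in sizes[1:])

x = rng.standard_normal(784)   # one flattened input image
H1 = phi(W1 @ x + b1)          # 1. first hidden layer
H2 = phi(W2 @ H1 + b2)         # 2. second hidden layer
Y = softmax(W3 @ H2 + b3)      # 3. output layer: class probabilities
```

Each `+1` node in Fig. 3 corresponds to one of the bias vectors `b1`, `b2`, `b3` entering its layer with coefficient 1.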


diff --git a/recognize_digits/index.en.html b/recognize_digits/index.en.html
index f918bad7527bfad019ca73b39bc89798b275fe36..bec542ca357adc52da20bcc6a9eba26a2c7d580f 100644
--- a/recognize_digits/index.en.html
+++ b/recognize_digits/index.en.html
@@ -83,7 +83,7 @@ In such a classification problem, we usually use the cross entropy loss function
 
 $$  crossentropy(label, y) = -\sum_i label_ilog(y_i)  $$
 
-Fig. 2 shows a softmax regression network, with weights in black, and bias in red. +1 indicates bias is 1.
+Fig. 2 shows a softmax regression network, with weights in blue, and bias in red. +1 indicates bias is 1.


@@ -98,7 +98,7 @@ The Softmax regression model described above uses the simplest two-layer neural
 2. After the second hidden layer, we get $ H_2 = \phi(W_2H_1 + b_2) $.
 3. Finally, after output layer, we get $Y=softmax(W_3H_2 + b_3)$, the final classification result vector.
 
-Fig. 3. is Multilayer Perceptron network, with weights in black, and bias in red. +1 indicates bias is 1.
+Fig. 3. is Multilayer Perceptron network, with weights in blue, and bias in red. +1 indicates bias is 1.


diff --git a/recognize_digits/index.html b/recognize_digits/index.html
index 81b42c15a02f29432bf4e0740fe05b7580a24a74..d34d1a7cbd87df66060692c63c5a8e2339c6e3ba 100644
--- a/recognize_digits/index.html
+++ b/recognize_digits/index.html
@@ -83,7 +83,7 @@
 $$ y_i = softmax(\sum_j W_{i,j}x_j + b_i) $$
 
 $$ crossentropy(label, y) = -\sum_i label_ilog(y_i) $$
 
-图2为softmax回归的网络图,图中权重用黑线表示、偏置用红线表示、+1代表偏置参数的系数为1。
+图2为softmax回归的网络图,图中权重用蓝线表示、偏置用红线表示、+1代表偏置参数的系数为1。


@@ -99,7 +99,7 @@ Softmax回归模型采用了最简单的两层神经网络,即只有输入层
 3. 最后,再经过输出层,得到的$Y=softmax(W_3H_2 + b_3)$,即为最后的分类结果向量。
 
-图3为多层感知器的网络结构图,图中权重用黑线表示、偏置用红线表示、+1代表偏置参数的系数为1。
+图3为多层感知器的网络结构图,图中权重用蓝线表示、偏置用红线表示、+1代表偏置参数的系数为1。