Unverified commit 164b642b, authored by Lv Mengsi, committed by GitHub

update bn (#1653)

* update bn

* update
Parent 4f1efa9d
@@ -17,13 +17,15 @@ batch_norm
``input`` is the mini-batch input.
.. math::
-    \mu_{\beta} &\gets \frac{1}{m} \sum_{i=1}^{m} x_i \quad &// mini-batch-mean \\
-    \sigma_{\beta}^{2} &\gets \frac{1}{m} \sum_{i=1}^{m}(x_i - \mu_{\beta})^2 \quad &// mini-batch-variance \\
-    \hat{x_i} &\gets \frac{x_i - \mu_\beta} {\sqrt{\sigma_{\beta}^{2} + \epsilon}} \quad &// normalize \\
-    y_i &\gets \gamma \hat{x_i} + \beta \quad &// scale-and-shift
-    moving\_mean = moving\_mean * momentum + mini\_batch\_mean * (1. - momentum) \quad &// global\ mean \\
-    moving\_variance = moving\_variance * momentum + mini\_batch\_var * (1. - momentum) \quad &// global\ variance
+    \mu_{\beta} &\gets \frac{1}{m} \sum_{i=1}^{m} x_i \qquad &//\
+    \ mini-batch\ mean \\
+    \sigma_{\beta}^{2} &\gets \frac{1}{m} \sum_{i=1}^{m}(x_i - \mu_{\beta})^2 \qquad &//\
+    \ mini-batch\ variance \\
+    \hat{x_i} &\gets \frac{x_i - \mu_\beta} {\sqrt{\sigma_{\beta}^{2} + \epsilon}} \qquad &//\ normalize \\
+    y_i &\gets \gamma \hat{x_i} + \beta \qquad &//\ scale\ and\ shift
+    moving\_mean = moving\_mean * momentum + mini\_batch\_mean * (1. - momentum) \\
+    moving\_variance = moving\_variance * momentum + mini\_batch\_var * (1. - momentum)
moving_mean and moving_var are the global mean and variance accumulated during training; they are used at inference or evaluation time.
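The training-time computation described by the equations above can be sketched in plain NumPy. This is an illustrative sketch only, not the Paddle API: the function name `batch_norm_train` and its signature are assumptions made for this example.

```python
import numpy as np

def batch_norm_train(x, gamma, beta, moving_mean, moving_var,
                     momentum=0.9, epsilon=1e-5):
    """Hypothetical sketch of batch norm in training mode (not the Paddle API)."""
    mu = x.mean(axis=0)                         # mini-batch mean
    var = x.var(axis=0)                         # mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + epsilon)   # normalize
    y = gamma * x_hat + beta                    # scale and shift
    # Update the global statistics used at inference/evaluation time.
    new_mean = moving_mean * momentum + mu * (1. - momentum)
    new_var = moving_var * momentum + var * (1. - momentum)
    return y, new_mean, new_var
```

At inference time, `new_mean` and `new_var` would be used in place of the mini-batch statistics when normalizing.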