Commit fe5de04b authored by: qiaolongfei

optimize doc for MomentumOptimizer

Parent: cfc6338a
@@ -323,11 +323,11 @@ class MomentumOptimizer(Optimizer):
 & if (use\_nesterov):
-& param = param - gradient * learning\_rate + mu * velocity * learning\_rate
+&\quad param = param - gradient * learning\_rate + mu * velocity * learning\_rate
 & else:
-& param = param - learning\_rate * velocity
+&\quad param = param - learning\_rate * velocity
 Args:
 learning_rate (float|Variable): the learning rate used to update parameters. \
......
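For reference, the docstring formula above can be read directly as imperative code. The sketch below is illustrative only and is not Paddle's API: it transcribes the param updates exactly as the docstring writes them, and it assumes the usual velocity accumulation step (velocity = mu * velocity + gradient), which lies outside the hunk shown here.

def momentum_step(param, gradient, velocity, learning_rate, mu, use_nesterov=False):
    # Illustrative sketch only, not Paddle's implementation.
    # The velocity accumulation below is assumed; it is not part of the hunk above.
    velocity = mu * velocity + gradient
    if use_nesterov:
        # Transcribed from the docstring formula shown in the diff.
        param = param - gradient * learning_rate + mu * velocity * learning_rate
    else:
        param = param - learning_rate * velocity
    return param, velocity

# Example usage with plain floats (hypothetical values):
# p, v = momentum_step(param=1.0, gradient=0.5, velocity=0.0, learning_rate=0.1, mu=0.9)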