Unverified commit d71f68a0, authored by hua-zi, committed by GitHub

[English documentation fix] Update Adam.py (#53424)

Parent 72e235d0
@@ -105,7 +105,7 @@ class Adam(Optimizer):
                 loss = paddle.mean(out)
                 adam = paddle.optimizer.Adam(learning_rate=0.1,
                         parameters=linear.parameters())
-                out.backward()
+                loss.backward()
                 adam.step()
                 adam.clear_grad()
@@ -127,7 +127,7 @@ class Adam(Optimizer):
                         beta1=beta1,
                         beta2=beta2,
                         weight_decay=0.01)
-                out.backward()
+                loss.backward()
                 adam.step()
                 adam.clear_grad()
@@ -150,7 +150,7 @@ class Adam(Optimizer):
                     }],
                     weight_decay=0.01,
                     beta1=0.9)
-                out.backward()
+                loss.backward()
                 adam.step()
                 adam.clear_grad()
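The docstring examples above all follow the same pattern: compute a scalar `loss`, call `loss.backward()` to populate gradients (the fix replaces the incorrect `out.backward()`, since `loss` is the scalar actually being minimized), then `adam.step()` to update parameters and `adam.clear_grad()` to reset gradients. As a framework-independent illustration of what `adam.step()` applies per parameter, here is a minimal pure-Python sketch of a single Adam update; `adam_step` is a hypothetical standalone helper, not Paddle API, using the common default hyperparameters (beta1=0.9, beta2=0.999, epsilon=1e-8).

```python
import math

def adam_step(param, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter at (1-based) timestep t.

    Returns the updated (param, m, v), where m and v are the running
    first- and second-moment estimates of the gradient.
    """
    m = beta1 * m + (1 - beta1) * grad          # update biased first moment
    v = beta2 * v + (1 - beta2) * grad * grad   # update biased second moment
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                # bias-corrected second moment
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v

# One step on a scalar parameter 1.0 with gradient 2.0 moves it by ~lr:
p, m, v = adam_step(1.0, 2.0, 0.0, 0.0, t=1)
```

On the first step the bias corrections cancel the moment decay, so the update magnitude is approximately `lr` regardless of the gradient's scale; this is the behavior `paddle.optimizer.Adam` exhibits with `learning_rate=0.1` in the examples above.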