Commit 63911826 authored by S shippingwang, committed by ceci3

fixed typo, test=develop

Parent 4449e855
...
@@ -313,9 +313,11 @@ def cosine_decay(learning_rate, step_each_epoch, epochs):
     """
     Applies cosine decay to the learning rate.
-    when training a model, it is oftem recommended to lower the learning rate as the
+    when training a model, it is often recommended to lower the learning rate as the
     training progresses. By using this function, the learning rate will be decayed by
     following cosine decay strategy.
+    decayed_lr = learning_rate * 0.5 * (math.cos(epoch * math.pi / epochs) + 1)
     Args:
         learning_rate(Variable|float): The initial learning rate.
...
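The formula added to the docstring can be checked in plain Python. The sketch below is only an illustration of the decay curve the docstring describes; the actual `cosine_decay` layer builds this computation into the Paddle program rather than evaluating it eagerly, and the helper name here is hypothetical.

```python
import math

def cosine_decay_value(learning_rate, epoch, epochs):
    # Hypothetical eager-mode version of the docstring formula:
    # decayed_lr = learning_rate * 0.5 * (math.cos(epoch * math.pi / epochs) + 1)
    return learning_rate * 0.5 * (math.cos(epoch * math.pi / epochs) + 1)

# The schedule starts at the full learning rate and decays smoothly to 0:
# epoch 0      -> learning_rate * 0.5 * (cos(0) + 1)  = learning_rate
# epoch epochs -> learning_rate * 0.5 * (cos(pi) + 1) = 0
```

At the midpoint (`epoch = epochs / 2`) the rate is exactly half the initial value, which is the characteristic shape of a cosine schedule.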