Commit 733da7b2 authored by S shippingwang

fixed typo, test=develop

Parent 33982932
@@ -313,10 +313,12 @@ def cosine_decay(learning_rate, step_each_epoch, epochs):
     """
     Applies cosine decay to the learning rate.
-    when training a model, it is oftem recommended to lower the learning rate as the
+    when training a model, it is often recommended to lower the learning rate as the
     training progresses. By using this function, the learning rate will be decayed by
     following cosine decay strategy.
+
+    decayed_lr = learning_rate * 0.5 * (math.cos(epoch * math.pi / epochs) + 1)
     Args:
         learning_rate(Variable|float): The initial learning rate.
         step_each_epoch(int): the number of steps in an epoch.
...
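The added docstring line states the decay formula. As a quick sanity check, here is a minimal standalone sketch of that formula in plain Python; it is not the fluid implementation, and the helper name `cosine_decay_value` and the mapping from a global step to an epoch index are assumptions made for illustration only.

```python
import math

def cosine_decay_value(learning_rate, step_each_epoch, epochs, global_step):
    """Plain-Python sketch of the decayed rate from the docstring formula.

    Not Paddle's implementation; assumes steps and epochs count from 0.
    """
    # Which epoch the current step falls in (assumption for this sketch).
    cur_epoch = global_step // step_each_epoch
    # decayed_lr = learning_rate * 0.5 * (cos(epoch * pi / epochs) + 1)
    return learning_rate * 0.5 * (math.cos(cur_epoch * math.pi / epochs) + 1)

# Example: a base rate of 0.1 decays smoothly toward 0 over 120 epochs.
for epoch in (0, 30, 60, 90, 119):
    print(epoch, cosine_decay_value(0.1, 100, 120, epoch * 100))
```

The factor `0.5 * (cos(...) + 1)` maps the cosine's [-1, 1] range onto [0, 1], so the rate starts at `learning_rate` and approaches 0 as training ends.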