magicwindyyd / mindspore (forked from MindSpore / mindspore)
Commit 95ef7df7
Authored on Jul 23, 2020 by 李嘉琪
Parent: c2408090

add single quotes, modify the formula and parameters in the comment
Showing 1 changed file with 8 additions and 9 deletions

mindspore/nn/learning_rate_schedule.py (+8, -9)
@@ -59,7 +59,7 @@ class ExponentialDecayLR(LearningRateSchedule):
     For the i-th step, the formula of computing decayed_learning_rate[i] is:

     .. math::
-        decayed\_learning\_rate[i] = learning\_rate * decay\_rate^{p}}
+        decayed\_learning\_rate[i] = learning\_rate * decay\_rate^{p}

     Where :math:`p = \frac{current\_step}{decay\_steps}`, if `is_stair` is True, The formula
     is :math:`p = floor(\frac{current\_step}{decay\_steps})`.
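For reference, the corrected ExponentialDecayLR formula can be checked with a small standalone Python sketch. The function name and signature below are illustrative only, not MindSpore's API; the logic follows the docstring formula above.

    import math

    def exponential_decay_lr(learning_rate, decay_rate, current_step, decay_steps, is_stair=False):
        # decayed_learning_rate[i] = learning_rate * decay_rate ** p
        p = current_step / decay_steps
        if is_stair:
            p = math.floor(p)  # decay only once every `decay_steps` steps
        return learning_rate * decay_rate ** p

    # e.g. exponential_decay_lr(0.1, 0.9, current_step=6, decay_steps=4) == 0.1 * 0.9 ** 1.5
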
@@ -158,7 +158,7 @@ class InverseDecayLR(LearningRateSchedule):
     For the i-th step, the formula of computing decayed_learning_rate[i] is:

     .. math::
-        decayed\_learning\_rate[i] = learning\_rate / (1 + decay\_rate * p}
+        decayed\_learning\_rate[i] = learning\_rate / (1 + decay\_rate * p)

     Where :math:`p = \frac{current\_step}{decay\_steps}`, if `is_stair` is True, The formula
     is :math:`p = floor(\frac{current\_step}{decay\_steps})`.
@@ -166,7 +166,7 @@ class InverseDecayLR(LearningRateSchedule):
     Args:
         learning_rate (float): The initial value of learning rate.
         decay_rate (float): The decay rate.
-        decay_epoch (int): A value used to calculate decayed learning rate.
+        decay_steps (int): A value used to calculate decayed learning rate.
         is_stair (bool): If true, learning rate decay once every `decay_steps` times. Default: False.

     Inputs:
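The fixed InverseDecayLR formula and the renamed decay_steps argument behave as in this rough Python sketch (illustrative names only, not the MindSpore implementation):

    import math

    def inverse_decay_lr(learning_rate, decay_rate, current_step, decay_steps, is_stair=False):
        # decayed_learning_rate[i] = learning_rate / (1 + decay_rate * p)
        p = current_step / decay_steps
        if is_stair:
            p = math.floor(p)  # decay only once every `decay_steps` steps
        return learning_rate / (1 + decay_rate * p)
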
@@ -207,9 +207,8 @@ class CosineDecayLR(LearningRateSchedule):
     .. math::
         decayed\_learning\_rate[i] = min\_learning\_rate + 0.5 * (max\_learning\_rate - min\_learning\_rate) *
-        (1 + cos(\frac{current\_epoch}{decay\_epoch}\pi))
-
-    Where :math:`current\_epoch=floor(\frac{i}{step\_per\_epoch})`.
+        (1 + cos(\frac{current\_step}{decay\_steps}\pi))

     Args:
         min_lr (float): The minimum value of learning rate.
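After this change the CosineDecayLR docstring is expressed purely in steps rather than epochs. A minimal sketch of the formula (illustrative names, not MindSpore's API):

    import math

    def cosine_decay_lr(min_lr, max_lr, current_step, decay_steps):
        # min_lr + 0.5 * (max_lr - min_lr) * (1 + cos(current_step / decay_steps * pi))
        return min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(current_step / decay_steps * math.pi))
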
@@ -262,11 +261,11 @@ class PolynomialDecayLR(LearningRateSchedule):
     .. math::
         decayed\_learning\_rate[i] = (learning\_rate - end\_learning\_rate) *
-        (1 - tmp\_step / tmp\_decay\_step)^{power} + end\_learning\_rate
+        (1 - tmp\_step / tmp\_decay\_steps)^{power} + end\_learning\_rate

-    Where :math:`tmp\_step=min(global\_step, decay\_step).
+    Where :math:`tmp\_step=min(current\_step, decay\_steps).

     If `update_decay_steps` is true, update the value of `tmp_decay_step` every `decay_steps`. The formula
-    is :math:`tmp\_decay\_step = decay\_step * ceil(global\_step / decay\_steps)`
+    is :math:`tmp\_decay\_steps = decay\_steps * ceil(current\_step / decay\_steps)`

     Args:
         learning_rate (float): The initial value of learning rate.
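A rough Python sketch of the updated PolynomialDecayLR docstring formula. The function name and argument handling are assumptions made for illustration; it follows the docstring as written, not the actual operator implementation, and assumes current_step >= 1 when update_decay_steps is true.

    import math

    def polynomial_decay_lr(learning_rate, end_learning_rate, current_step, decay_steps,
                            power, update_decay_steps=False):
        tmp_step = min(current_step, decay_steps)
        tmp_decay_steps = decay_steps
        if update_decay_steps:
            # tmp_decay_steps = decay_steps * ceil(current_step / decay_steps)
            tmp_decay_steps = decay_steps * math.ceil(current_step / decay_steps)
        return (learning_rate - end_learning_rate) * (1 - tmp_step / tmp_decay_steps) ** power \
               + end_learning_rate
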
@@ -335,7 +334,7 @@ class WarmUpLR(LearningRateSchedule):
     .. math::
         warmup\_learning\_rate[i] = learning\_rate * tmp\_step / warmup\_steps

-    Where :math:`tmp\_step=min(global\_step, warmup\_steps).
+    Where :math:`tmp\_step=min(current\_step, warmup\_steps)`.

     Args:
         learning_rate (float): The initial value of learning rate.
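The WarmUpLR change is just the parameter rename plus the missing closing quote on the math role; the formula itself reduces to something like this sketch (illustrative names only):

    def warmup_lr(learning_rate, current_step, warmup_steps):
        # warmup_learning_rate[i] = learning_rate * tmp_step / warmup_steps
        tmp_step = min(current_step, warmup_steps)
        return learning_rate * tmp_step / warmup_steps
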