Commit cd9dd380 authored by mindspore-ci-bot, committed by Gitee

!3313 delete annotation of decay filter in optimizers

Merge pull request !3313 from wangnan39/delete_annotation_of_decay_filter_in_optimizers
@@ -398,8 +398,6 @@ class AdamWeightDecay(Optimizer):
         eps (float): Term added to the denominator to improve numerical stability. Default: 1e-6.
             Should be greater than 0.
         weight_decay (float): Weight decay (L2 penalty). It should be in range [0.0, 1.0]. Default: 0.0.
-        decay_filter (Function): A function to determine whether to apply weight decay on parameters. Default:
-            lambda x: 'LayerNorm' not in x.name and 'bias' not in x.name.

     Inputs:
         - **gradients** (tuple[Tensor]) - The gradients of `params`, the shape is the same as `params`.
...
@@ -228,8 +228,6 @@ class Lamb(Optimizer):
         eps (float): Term added to the denominator to improve numerical stability. Default: 1e-6.
             Should be greater than 0.
         weight_decay (float): Weight decay (L2 penalty). Default: 0.0. Should be in range [0.0, 1.0].
-        decay_filter (Function): A function to determine whether to apply weight decay on parameters. Default:
-            lambda x: 'LayerNorm' not in x.name and 'bias' not in x.name.

     Inputs:
         - **gradients** (tuple[Tensor]) - The gradients of `params`, the shape is the same as `params`.
...
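The docstring lines removed above described the default `decay_filter`, which skips weight decay for LayerNorm parameters and biases. A minimal sketch of how such a filter partitions parameters by name (the `Param` class here is a hypothetical stand-in for a framework parameter object, not MindSpore's actual `Parameter` type):

```python
from dataclasses import dataclass


@dataclass
class Param:
    """Hypothetical stand-in for a framework parameter with a name attribute."""
    name: str


def decay_filter(x):
    # The default filter from the removed annotation: apply weight decay
    # only to parameters that are neither LayerNorm weights nor biases.
    return 'LayerNorm' not in x.name and 'bias' not in x.name


params = [Param('dense.weight'), Param('dense.bias'), Param('LayerNorm.gamma')]
decayed = [p.name for p in params if decay_filter(p)]
print(decayed)  # -> ['dense.weight']
```

This mirrors the common practice of exempting normalization parameters and biases from L2 regularization, since decaying them tends to hurt rather than help.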