Commit d092a5a2, authored by: W WenmuZhou

switch learning_rate and lr

Parent 41b33c9e
......
@@ -44,9 +44,9 @@ Optimizer:
   name: Adam
   beta1: 0.9
   beta2: 0.999
-  learning_rate:
+  lr:
     # name: Cosine
-    lr: 0.001
+    learning_rate: 0.001
     # warmup_epoch: 0
   regularizer:
     name: 'L2'
......
......
@@ -29,8 +29,8 @@ Optimizer:
   name: Adam
   beta1: 0.9
   beta2: 0.999
-  learning_rate:
-    lr: 0.0005
+  lr:
+    learning_rate: 0.0005
   regularizer:
     name: 'L2'
     factor: 0.00001
......
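Both hunks invert the same nesting: the schedule block, formerly keyed `learning_rate:` with an inner `lr:` scalar, becomes `lr:` with an inner `learning_rate:` scalar. A minimal sketch of how a config consumer would read the new layout (the dict literal and the helper name are illustrative assumptions, not the project's actual loader):

```python
# Optimizer section after this commit, written as the parsed-YAML dict.
# The outer schedule block is now `lr`; the scalar base rate inside it
# is now `learning_rate` (the reverse of the pre-commit layout).
optimizer_cfg = {
    "name": "Adam",
    "beta1": 0.9,
    "beta2": 0.999,
    "lr": {
        # "name": "Cosine",     # optional LR schedule, commented out in the config
        "learning_rate": 0.001,
        # "warmup_epoch": 0,
    },
    "regularizer": {"name": "L2", "factor": 0.00001},
}


def base_learning_rate(cfg: dict) -> float:
    """Read the base learning rate from a new-layout Optimizer config.

    Hypothetical helper: shows the key path `lr.learning_rate`
    introduced by this commit.
    """
    return cfg["lr"]["learning_rate"]


print(base_learning_rate(optimizer_cfg))  # -> 0.001
```

Any code that still reads the old path `cfg["learning_rate"]["lr"]` would raise a `KeyError` against configs updated by this commit, which is why both config files are changed together here.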