Commit 2959783b authored by LielinJiang

fix lr scheduler

Parent 88a687e9
...
@@ -12,7 +12,7 @@ def build_lr_scheduler(cfg):
                 0, epoch + 1 - cfg.start_epoch) / float(cfg.decay_epochs + 1)
             return lr_l
 
-        scheduler = paddle.optimizer.lr.LambdaLR(cfg.learning_rate,
+        scheduler = paddle.optimizer.lr.LambdaDecay(cfg.learning_rate,
                                                  lr_lambda=lambda_rule)
         return scheduler
     else:
...
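The change points the builder at `paddle.optimizer.lr.LambdaDecay`, the lambda-based scheduler in Paddle 2.x that multiplies a base learning rate by the factor returned from `lr_lambda` each epoch. Below is a minimal sketch of how a scheduler built this way is typically driven; the numeric values for `learning_rate`, `start_epoch`, and `decay_epochs` are made up for illustration and are not taken from the repository's config.

```python
# Minimal sketch (not part of the commit) of using LambdaDecay with a
# linear decay rule. All numeric values here are hypothetical.
import paddle


def lambda_rule(epoch, start_epoch=100, decay_epochs=100):
    # Same shape of rule as in the diff: keep the multiplier at 1.0 until
    # start_epoch, then decay it linearly towards 0 over decay_epochs.
    return 1.0 - max(0, epoch + 1 - start_epoch) / float(decay_epochs + 1)


# LambdaDecay scales the base learning rate by lambda_rule(epoch).
scheduler = paddle.optimizer.lr.LambdaDecay(learning_rate=0.0002,
                                            lr_lambda=lambda_rule)

model = paddle.nn.Linear(4, 4)
optimizer = paddle.optimizer.Adam(learning_rate=scheduler,
                                  parameters=model.parameters())

for epoch in range(5):
    # ... run the training steps for this epoch ...
    scheduler.step()  # advance to the next epoch's learning rate
```

Passing the scheduler object as `learning_rate` is how Paddle 2.x couples an optimizer to a schedule; calling `scheduler.step()` once per epoch matches the per-epoch `lambda_rule` used here.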