How can the learning rate be adjusted? The value set in the yml file never changes; it stays at 0.0005
Closed
Created by: gekie
If it is supposed to adjust dynamically: after training for a full day and night, the LR is still 0.0005, and I don't see any change in the logs.
Created by: dyning
Adam is an adaptive optimization algorithm: the learning rate is adjusted internally per parameter, and the value shown in the log is only the base learning rate. To switch to SGD with momentum, you can modify the code as follows: (1) Add an optimizer definition to https://github.com/PaddlePaddle/PaddleOCR/blob/develop/ppocr/optimizer.py:
def PiecewiseDecay(params):
    base_lr = params['base_lr']
    gamma = params['gamma']
    steps = params['steps']
    momentum_rate = params['momentum_rate']
    L2_decay_weight = params['L2_decay_weight']
    bd = steps
    # decay the base LR by a factor of gamma at each boundary step
    lr = [base_lr * (gamma ** i) for i in range(len(steps) + 1)]
    learning_rate = fluid.layers.piecewise_decay(boundaries=bd, values=lr)
    optimizer = fluid.optimizer.Momentum(
        learning_rate=learning_rate,
        momentum=momentum_rate,
        regularization=fluid.regularizer.L2Decay(L2_decay_weight))
    return optimizer
(2) Then update the optimizer settings in the yml file, changing

Optimizer:
  function: ppocr.optimizer,AdamDecay
  base_lr: 0.001
  beta1: 0.9
  beta2: 0.999

to

Optimizer:
  function: ppocr.optimizer,PiecewiseDecay
  base_lr: 0.001
  gamma: 0.1
  steps: [300000]
  momentum_rate: 0.9
  L2_decay_weight: 0.0004
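To see what schedule this config produces, here is a small framework-free sketch of piecewise decay (the helper name `piecewise_lr` is ours for illustration, not part of PaddleOCR or Paddle): the learning rate starts at `base_lr` and is multiplied by `gamma` each time the global step crosses one of the boundary values.

```python
def piecewise_lr(base_lr, gamma, boundaries, step):
    """Learning rate at a given global step under piecewise decay:
    multiplied by gamma once per boundary the step has crossed."""
    lr = base_lr
    for b in boundaries:
        if step >= b:
            lr *= gamma
    return lr

# With base_lr=0.001, gamma=0.1, steps=[300000] from the yml above:
# before step 300000 the LR is 0.001; afterwards it decays to ~0.0001.
print(piecewise_lr(0.001, 0.1, [300000], 0))
print(piecewise_lr(0.001, 0.1, [300000], 300000))
```

With a single boundary at 300000 this is a one-time 10x drop; adding more entries to `steps` (e.g. `[300000, 450000]`) gives further 10x drops at each boundary.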