How do I add a regularization loss to a model? After adding it, how do I retrieve that loss during training?
Created by: wenston2006
Can the regularization loss be added through the `regularization` parameter that some optimizers expose (e.g. `class paddle.fluid.optimizer.LarsMomentumOptimizer(learning_rate, momentum, lars_coeff=0.001, lars_weight_decay=0.0005, regularization=None, name=None)`)? In addition, how can I obtain the value of the regularization loss during training?
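For reference, below is a minimal sketch (assuming the `paddle.fluid` 1.x API) of the two routes the question touches on: passing a regularizer to the optimizer's `regularization` argument, versus building an explicit L2 penalty as a separate graph variable so its value can be fetched in `exe.run`. The explicit-penalty part is only one possible workaround, not necessarily the library's official mechanism, and names such as `l2_coeff` are illustrative.

```python
# Sketch only: explicit L2 penalty kept as a fetchable variable (fluid 1.x style).
import numpy as np
import paddle.fluid as fluid

x = fluid.layers.data(name='x', shape=[13], dtype='float32')
y = fluid.layers.data(name='y', shape=[1], dtype='float32')
pred = fluid.layers.fc(input=x, size=1)
base_loss = fluid.layers.reduce_mean(
    fluid.layers.square_error_cost(input=pred, label=y))

# Hand-built L2 penalty over all trainable parameters; because it is an
# ordinary Variable, it can be added to fetch_list and read out each step.
l2_coeff = 1e-4  # illustrative coefficient
reg_loss = fluid.layers.fill_constant(shape=[1], dtype='float32', value=0.0)
for p in fluid.default_main_program().global_block().all_parameters():
    reg_loss = reg_loss + l2_coeff * fluid.layers.reduce_sum(fluid.layers.square(p))
total_loss = base_loss + reg_loss

# The optimizer-level route from the question: setting `regularization` to e.g.
# fluid.regularizer.L2Decay(l2_coeff) folds the penalty into the parameter
# updates, so it does not appear as a separate fetchable loss variable.
# Left as None here to avoid regularizing twice on top of the explicit term.
optimizer = fluid.optimizer.LarsMomentumOptimizer(
    learning_rate=0.001, momentum=0.9, regularization=None)
optimizer.minimize(total_loss)

exe = fluid.Executor(fluid.CPUPlace())
exe.run(fluid.default_startup_program())

x_np = np.random.rand(8, 13).astype('float32')
y_np = np.random.rand(8, 1).astype('float32')
base_val, reg_val = exe.run(
    fluid.default_main_program(),
    feed={'x': x_np, 'y': y_np},
    fetch_list=[base_loss, reg_loss])
print('data loss:', base_val, 'regularization loss:', reg_val)
```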