Created by: zhouwei25
Current Problem
- In Paddle, regularization can currently be set both in fluid.ParamAttr and in the optimizer. The two settings can conflict, which easily misleads users. The current logic: the ParamAttr setting takes priority over the optimizer setting.
```python
import paddle.fluid as fluid

x = fluid.data(name='x', shape=[None, 13], dtype='float32')

# Set L1 regularization on the parameter
l1_regular = fluid.regularizer.L1Decay(0.01)
w_param = fluid.ParamAttr(regularizer=l1_regular)
loss = fluid.layers.fc(x, 1, param_attr=w_param)

# Set L2 regularization on the optimizer
l2_regular = fluid.regularizer.L2Decay(0.01)
sgd_optimizer = fluid.optimizer.SGD(0.01, regularization=l2_regular)
sgd_optimizer.minimize(loss)

# In this conflicting case, the L1 regularization takes effect,
# but the user gets no hint about it.
```
- Users are easily confused: when the two settings conflict, they cannot tell which regularization takes effect, and no hint of any kind is given.
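The priority rule described above can be sketched as a small helper. This is an illustration only, not Paddle's actual implementation; the function name `resolve_regularizer` is hypothetical:

```python
def resolve_regularizer(param_regularizer, optimizer_regularizer):
    """Return the regularizer that takes effect for one parameter.

    Sketch of the current rule: a regularizer set on the parameter
    (via ParamAttr) silently overrides the optimizer-level one.
    """
    if param_regularizer is not None:
        # ParamAttr wins; the optimizer-level setting is ignored.
        return param_regularizer
    return optimizer_regularizer

# Example: L1 set on the parameter, L2 on the optimizer -> L1 wins.
effective = resolve_regularizer("L1Decay(0.01)", "L2Decay(0.01)")
```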
Proposed Solution
Keep both ways of setting regularization and keep the current priority unchanged; add a logging.info hint, documentation, and example code.
- Add a logging.info message that points out which regularization takes effect
- Explain the behavior, with examples, in the API documentation of ParamAttr and Optimizer
```python
logging.info(
    "Regularization of [fc_0.w_0, fc_1.w_0] has been set by ParamAttr or "
    "WeightNormParamAttr already. So, the Regularization of Optimizer will "
    "not take effect for these parameters!")
```
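A sketch of how this hint could be emitted in the optimizer's regularization pass. All names here (`append_regularization`, the `(name, regularizer)` pair representation) are hypothetical and only illustrate the proposed behavior, not Paddle's real code:

```python
import logging

def append_regularization(params, optimizer_regularization):
    """Sketch: resolve the effective regularizer per parameter and emit
    one logging.info hint for parameters whose ParamAttr setting
    overrides the optimizer-level setting.

    params: list of (param_name, param_regularizer_or_None) pairs.
    """
    overridden = [name for name, reg in params
                  if reg is not None and optimizer_regularization is not None]
    if overridden:
        # Single hint naming every overridden parameter, as proposed above.
        logging.info(
            "Regularization of %s has been set by ParamAttr or "
            "WeightNormParamAttr already. So, the Regularization of "
            "Optimizer will not take effect for these parameters!",
            overridden)
    # Each parameter keeps its own regularizer if set, otherwise it
    # falls back to the optimizer-level one.
    return {name: (reg if reg is not None else optimizer_regularization)
            for name, reg in params}
```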