update optimizer (#26711)
* update doc
* update doc
* fix optimizer sample code
* add default value for adamw weight_decay
* fix adamw
* change LearningRateDecay to _LRScheduler
* fix adamw;notest
* fix load;notest
* remove file
* bug fix
* fix code style
* bug fix
* add ut
* adamw support weight_decay=0
* fix ut
* fix set_lr doc
* fix doc
* change parameters place
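A minimal sketch of the behavior this PR touches (assuming the Paddle 2.x dygraph API; this is illustrative only, not the PR's own test code): constructing `paddle.optimizer.AdamW` with the `weight_decay` default (with `weight_decay=0.0` also accepted), and adjusting the learning rate afterwards with `set_lr`.

```python
import paddle

# Assumed Paddle 2.x dygraph usage of the APIs named in this PR.
linear = paddle.nn.Linear(10, 10)
opt = paddle.optimizer.AdamW(
    learning_rate=0.01,
    parameters=linear.parameters(),
    weight_decay=0.01,   # default weight_decay; weight_decay=0.0 is also supported
)

x = paddle.rand([4, 10])
loss = paddle.mean(linear(x))
loss.backward()
opt.step()
opt.clear_grad()

# set_lr works when learning_rate was given as a float
# (not as an _LRScheduler instance).
opt.set_lr(0.001)
```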