    update optimizer (#26711) · 1f36d3cd
    Committed by MRXLT
    * update doc
    
    * update doc
    
    * fix optimizer sample code
    
    * add default value for adamw weight_decay (see the AdamW sketch after this list)
    
    * fix adamw
    
    * change LearningRateDecay to _LRScheduler (see the scheduler sketch after this list)
    
    * fix adamw; notest
    
    * fix load; notest
    
    * remove file
    
    * bug fix
    
    * fix code style
    
    * bug fix
    
    * add ut
    
    * adamw support weight_decay=0
    
    * fix ut
    
    * fix set_lr doc (see the set_lr sketch after this list)
    
    * fix doc
    
    * change the position of the parameters argument
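    
    Several of the items above concern the paddle.optimizer.AdamW API: weight_decay
    now has a default value and weight_decay=0 is accepted. A minimal dygraph sketch
    of the resulting usage, assuming current Paddle 2.x names (the exact API at this
    revision may differ):
    
        import paddle
    
        linear = paddle.nn.Linear(10, 10)
        loss = paddle.mean(linear(paddle.rand([10, 10])))
    
        # weight_decay has a default (0.01 in current releases); 0.0 is also accepted.
        opt = paddle.optimizer.AdamW(
            learning_rate=0.001,
            parameters=linear.parameters(),
            weight_decay=0.01,
        )
        loss.backward()
        opt.step()
        opt.clear_grad()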
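    
    The LearningRateDecay to _LRScheduler change means the optimizer now accepts the
    2.x-style scheduler objects as its learning rate. A sketch with current names
    (module paths at this revision may have differed):
    
        import paddle
    
        scheduler = paddle.optimizer.lr.StepDecay(learning_rate=0.1,
                                                  step_size=3, gamma=0.5)
        linear = paddle.nn.Linear(10, 10)
        opt = paddle.optimizer.AdamW(learning_rate=scheduler,
                                     parameters=linear.parameters())
    
        loss = paddle.mean(linear(paddle.rand([10, 10])))
        loss.backward()
        opt.step()
        opt.clear_grad()
        scheduler.step()  # advance the schedule, per step or per epoch as needed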
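    
    The set_lr fix concerns manually overriding the learning rate of an optimizer
    that was created with a float learning rate rather than a scheduler. A hedged
    sketch, again assuming current Paddle 2.x names:
    
        import paddle
    
        linear = paddle.nn.Linear(10, 10)
        opt = paddle.optimizer.Adam(learning_rate=0.01,
                                    parameters=linear.parameters())
    
        # set_lr applies only when the optimizer was built with a float learning
        # rate, not with a scheduler instance.
        opt.set_lr(0.001)
        print(opt.get_lr())  # 0.001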
Changed file: adam.py (10.5 KB)