1. 07 Jan 2021 (1 commit)
  2. 02 Nov 2020 (1 commit)
  3. 27 Sep 2020 (1 commit)
  4. 01 Sep 2020 (1 commit)
    • update optimizer (#26711) · 1f36d3cd
      Committed by MRXLT
      * update doc
      
      * update doc
      
      * fix optimizer sample code
      
      * add default value for adamw weight_decay
      
      * fix adamw
      
      * change LearningRateDecay to _LRScheduler
      
      * fix adamw;notest
      
      * fix load;notest
      
      * remove file
      
      * bug fix
      
      * fix code style
      
      * bug fix
      
      * add ut
      
      * adamw support weight_decay=0
      
      * fix ut
      
      * fix set_lr doc
      
      * fix doc
      
      * change the place of the parameters argument (see the usage sketch after this commit)
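      The bullets above describe API-level changes to paddle.optimizer.AdamW: a default for weight_decay, support for weight_decay=0, and a fixed set_lr docstring. As a rough illustration (my own sketch under the paddle 2.0 imperative API, not code from the commit):

      ```python
      import paddle

      linear = paddle.nn.Linear(10, 10)

      # weight_decay now has a default value (0.01 in released 2.0),
      # so it can be omitted entirely ...
      opt = paddle.optimizer.AdamW(learning_rate=0.001,
                                   parameters=linear.parameters())

      # ... and an explicit weight_decay=0 is also accepted
      # ("adamw support weight_decay=0").
      opt_zero = paddle.optimizer.AdamW(learning_rate=0.001,
                                        parameters=linear.parameters(),
                                        weight_decay=0.0)

      # One training step in dygraph mode.
      x = paddle.rand([4, 10])
      loss = paddle.mean(linear(x))
      loss.backward()
      opt.step()
      opt.clear_grad()

      # set_lr, whose docs this commit also fixes, resets the rate in place.
      opt.set_lr(0.005)
      ```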
  5. 28 Aug 2020 (1 commit)
  6. 23 Aug 2020 (1 commit)
    • [WIP] update optimizer for 2.0 (#26288) · eeda90d6
      Committed by MRXLT
      refine Optimizer/Adam/Adamax/RMSProp && add AdamW (usage sketched after this commit)
      
      * bug fix
      
      * update comment
      
      * unify arguments place; notest
      
      * fix ut, test=develop
      
      * bug fix
      
      * fix conflicts, test=develop
      
      * add examples code
      
      * bug fix
      
      * fix comments
      
      * fix sample code
      
      * add sample code for Optimizer
      
      * add adamax ut, test=develop
      
      * fix rmsprop ut, test=develop
      
      * add ut for optimizer.py and adamw.py
      
      * remove TestAdamOptimizerBetaVariable
      
      * update api && add ut
      
      * update doc && fix ut
      
      * add ut
      Co-authored-by: mapingshuo <mps2012@yeah.net>
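      This commit unifies the argument order across the refined 2.0 optimizer entry points ("unify arguments place"). A minimal sketch (an assumption about the resulting API, not code from the PR) of the four classes used side by side:

      ```python
      import paddle

      linear = paddle.nn.Linear(10, 10)
      params = linear.parameters()

      # All four refined optimizers share the same argument order:
      # learning_rate first, then parameters.
      adam    = paddle.optimizer.Adam(learning_rate=0.001, parameters=params)
      adamax  = paddle.optimizer.Adamax(learning_rate=0.001, parameters=params)
      rmsprop = paddle.optimizer.RMSProp(learning_rate=0.001, parameters=params)
      adamw   = paddle.optimizer.AdamW(learning_rate=0.001, parameters=params)

      x = paddle.rand([4, 10])
      for opt in (adam, adamax, rmsprop, adamw):
          loss = paddle.mean(linear(x))
          loss.backward()
          opt.step()        # per-step update, the 2.0 replacement for minimize()
          opt.clear_grad()
      ```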