- 14 Sep 2021, 1 commit

  Committed by zhaoyingli

  * add layerwise learning rate for adamw
  * fix format
  * add unittest
  * add NotImplementedError
  * add gpu unittest
  * update gpuplace
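The "layerwise learning rate" change above lets AdamW scale its base learning rate per parameter. Below is a minimal pure-Python sketch of one decoupled-weight-decay (AdamW) update with such a per-parameter ratio; the function and argument names are illustrative, not Paddle's internals.

```python
import math

def adamw_step(p, grad, state, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=0.01, lr_ratio=1.0):
    """One AdamW update for a scalar parameter p.

    lr_ratio sketches the layerwise-learning-rate idea: the effective
    step size is lr * lr_ratio for this parameter (illustrative only).
    """
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad * grad
    m_hat = state["m"] / (1 - beta1 ** state["t"])   # bias correction
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    step_lr = lr * lr_ratio
    # decoupled weight decay: shrink p directly, not through the gradient
    p = p - step_lr * weight_decay * p
    p = p - step_lr * m_hat / (math.sqrt(v_hat) + eps)
    return p

state = {"t": 0, "m": 0.0, "v": 0.0}
p = adamw_step(1.0, 0.5, state, lr_ratio=0.1)  # scaled-down step
```

With `lr_ratio=0.1` the parameter moves roughly a tenth as far per step as it would with the base learning rate, which is the intent of layerwise scaling.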
- 23 Aug 2021, 1 commit

  Committed by zhaoyingli

  * adamw support cuda
- 30 Jul 2021, 1 commit

  Committed by wangguanzhong

  * fix lr in param group
  * add unittest for adamw
- 31 May 2021, 1 commit

  Committed by wangguanzhong

  * support params groups, test=develop
  * simplify updating opt attr
  * update according to review
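The parameter-groups support above lets each group of parameters override optimizer-wide settings such as the learning rate or weight decay. A hedged sketch of the merging logic, with hypothetical field names (the real Paddle API may differ):

```python
def resolve_param_groups(defaults, param_groups):
    """Fill each parameter group with the optimizer-level defaults
    for any option it does not override (illustrative sketch)."""
    resolved = []
    for group in param_groups:
        merged = dict(defaults)   # start from optimizer-wide settings
        merged.update(group)      # per-group overrides win
        resolved.append(merged)
    return resolved

defaults = {"learning_rate": 1e-3, "weight_decay": 0.01}
groups = [
    {"params": ["conv.w"], "learning_rate": 1e-4},  # custom lr
    {"params": ["fc.w"]},                           # inherits defaults
]
resolved = resolve_param_groups(defaults, groups)
```

Each resolved group then carries a complete set of options, so the update loop never needs to fall back to the optimizer object mid-step.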
- 19 Jan 2021, 1 commit

  Committed by WangXi
- 07 Jan 2021, 1 commit

  Committed by WangXi
- 02 Nov 2020, 1 commit

  Committed by Guo Sheng

  * Fix lr setting of AdamW when lr is an instance of LRScheduler. test=develop
  * Fix static graph test mode in test_adamw_op.py. test=develop
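The fix above concerns an optimizer accepting either a plain float or an LRScheduler instance as its learning rate. A small illustrative sketch of that pattern; `StepDecay` and `current_lr` here are stand-ins, not Paddle's actual classes:

```python
class StepDecay:
    """Tiny stand-in for an LR scheduler: decays the base lr by
    `gamma` every `step_size` epochs (illustrative only)."""
    def __init__(self, base_lr, step_size, gamma=0.1):
        self.base_lr, self.step_size, self.gamma = base_lr, step_size, gamma
        self.epoch = 0

    def step(self):
        self.epoch += 1

    def get_lr(self):
        return self.base_lr * self.gamma ** (self.epoch // self.step_size)

def current_lr(lr):
    # accept either a plain float or a scheduler-like object
    return lr.get_lr() if hasattr(lr, "get_lr") else lr

sched = StepDecay(0.1, step_size=2)
lrs = []
for _ in range(4):
    lrs.append(current_lr(sched))  # query lr before each epoch
    sched.step()
```

The key point of the fix is that the optimizer must query the scheduler each step instead of caching the float it saw at construction time.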
- 27 Sep 2020, 1 commit

  Committed by Zhou Wei
- 01 Sep 2020, 1 commit

  Committed by MRXLT

  * update doc
  * fix optimizer sample code
  * add default value for adamw weight_decay
  * fix adamw
  * change LearningRateDecay to _LRScheduler
  * fix adamw; notest
  * fix load; notest
  * remove file
  * bug fix
  * fix code style
  * bug fix
  * add ut
  * adamw support weight_decay=0
  * fix ut
  * fix set_lr doc
  * fix doc
  * change parameters place
- 28 Aug 2020, 1 commit

  Committed by donproc
- 23 Aug 2020, 1 commit

  Committed by MRXLT

  refine Optimizer/Adam/Adamax/RMSProp && add AdamW

  * bug fix
  * update comment
  * unify arguments place; notest
  * fix ut, test=develop
  * bug fix
  * fix conflicts, test=develop
  * add examples code
  * bug fix
  * fix comments
  * fix sample code
  * add sample code for Optimizer
  * add adamax ut, test=develop
  * fix rmsprop ut, test=develop
  * add ut for optimizer.py and adamw.py
  * remove TestAdamOptimizerBetaVariable
  * update api && add ut
  * update doc && fix ut
  * add ut

  Co-authored-by: mapingshuo <mps2012@yeah.net>