    Adadelta Optimizer (#26590) · a1b99fae
    Committed by Jiawei Wang
    * add doc; notest
    
    * fix doc; notest
    
    * update doc; notest
    
    * refine optimizer && adam
    
    * refine optimizer; notest
    
    * add adam
    
    * fix doc
    
    * fix doc && add adamw; notest
    
    * add error message
    
    * bug fix
    
    * refine rmsprop && adamax
    
    * fix ci
    
    * bug fix
    
    * update comment
    
    * unify argument placement; notest
    
    * fix ut, test=develop
    
    * bug fix
    
    * fix conflicts, test=develop
    
    * add example code
    
    * bug fix
    
    * fix comments
    
    * fix sample code
    
    * add sample code for Optimizer
    
    * add adamax ut, test=develop
    
    * fix rmsprop ut, test=develop
    
    * add ut for optimizer.py and adamw.py
    
    * first commit of adadelta optimizer
    
    * fix learning rate
    
    * fix adadelta doc and add sgd momentum
    
    * remove unused fluid
    
    * fix codestyle
    
    * Update test_adam_op.py
    
    * Update test_adam_op.py
    
    * fix SGD in 2 unittests
    
    * fix SGD in 2 unittests
    
    * fix ci
    
    * fix ut
    Co-authored-by: MRXLT <xlt2024@gmail.com>
    Co-authored-by: mapingshuo <mps2012@yeah.net>
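    For context, the algorithm this commit adds is Adadelta (Zeiler, 2012). Below is a minimal NumPy sketch of one update step; the function and variable names are illustrative and not taken from Paddle's code, and Paddle's optimizer additionally exposes a learning_rate that scales the step.

    ```python
    # Minimal sketch of one Adadelta step (Zeiler, 2012); names are illustrative,
    # not Paddle's. Assumes param/grad and the two accumulators are NumPy arrays.
    import numpy as np

    def adadelta_step(param, grad, avg_sq_grad, avg_sq_update,
                      rho=0.95, epsilon=1e-6):
        """Apply one Adadelta update in place; return updated state."""
        # Decaying average of squared gradients.
        avg_sq_grad = rho * avg_sq_grad + (1.0 - rho) * grad ** 2
        # Step scaled by RMS(previous updates) / RMS(gradients).
        update = -np.sqrt(avg_sq_update + epsilon) / np.sqrt(avg_sq_grad + epsilon) * grad
        # Decaying average of squared updates.
        avg_sq_update = rho * avg_sq_update + (1.0 - rho) * update ** 2
        param += update
        return param, avg_sq_grad, avg_sq_update
    ```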
sgd.py 4.5 KB
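    A usage sketch follows, assuming the Paddle 2.x dygraph API that this PR series targets (construct optimizer, backward, step, clear_grad). Argument defaults here are assumptions; check the released docs for the exact signature.

    ```python
    # Usage sketch under the assumed Paddle 2.x dygraph API.
    import paddle

    linear = paddle.nn.Linear(10, 10)
    inp = paddle.rand([10, 10], dtype="float32")
    loss = paddle.mean(linear(inp))

    # Adadelta optimizer added by this commit; learning_rate value is arbitrary.
    opt = paddle.optimizer.Adadelta(learning_rate=0.001,
                                    parameters=linear.parameters())
    loss.backward()
    opt.step()        # apply the Adadelta update to linear's parameters
    opt.clear_grad()  # reset gradients before the next iteration
    ```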