1. 15 Mar 2023, 1 commit
  2. 10 Mar 2023, 1 commit
  3. 06 Mar 2023, 1 commit
  4. 24 Feb 2023, 1 commit
    • Revert grad scale optimization pr (#50839) · 8a503522
      Committed by Weilong Wu
      * Revert "fixoptminizer _set_auxiliary_var bug (#50335)"
      
      This reverts commit c44005f0.
      
      * Revert "refine optimizer create accumulators (#50188)"
      
      This reverts commit 244e7546.
      
      * Revert "fix found_inf bug for custom optimizer (#50158)"
      
      This reverts commit 64573f9f.
      
      * Revert "refine amp scaler found_inf (#49864)"
      
      This reverts commit 382e9a06.
      
      * fix code format
      
      * fix conflict
  5. 06 Feb 2023, 1 commit
  6. 30 Dec 2022, 1 commit
  7. 25 Dec 2022, 1 commit
  8. 29 Nov 2022, 1 commit
  9. 03 Nov 2022, 2 commits
  10. 23 Oct 2022, 1 commit
  11. 10 Oct 2022, 1 commit
  12. 14 Sep 2022, 1 commit
  13. 26 Aug 2022, 1 commit
  14. 22 Aug 2022, 1 commit
  15. 05 Jun 2022, 1 commit
    • [code format check upgrade] step2: yapf (#42944) · a072fca8
      Committed by Sing_chan
* use yapf to format all Python files

* exclude two unit test files from yapf because they rely on writing and reading files, and reformatting would break them

* disable diff_py_file because too many changed files caused the following command to fail
  16. 02 Aug 2021, 1 commit
  17. 31 May 2021, 1 commit
  18. 29 Apr 2021, 1 commit
  19. 27 Apr 2021, 1 commit
  20. 23 Apr 2021, 1 commit
  21. 01 Dec 2020, 1 commit
  22. 24 Nov 2020, 1 commit
    • Upgrade string literals to raw string (#28989) · 3815d7aa
      Committed by Leo Chen
      * upgrade comment string to raw string
      
      * fix string in
      
      * fix string with ' '
      
      * revert update on comments
      
      * upgrade only necessary
      
      * fix sample code checker
      
      * fix comments with '''
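For context (an illustrative note, not part of the original commit): prefixing a Python string literal with `r` disables backslash escape processing, which is why docstrings containing markup such as `\alpha` or `\n` are often upgraded to raw strings:

```python
# A normal literal interprets backslash escapes; a raw literal keeps them verbatim.
normal = "\alpha \n"   # "\a" becomes a bell character, "\n" a newline (7 chars)
raw = r"\alpha \n"     # backslashes preserved exactly as typed (9 chars)

assert normal != raw
assert raw == "\\alpha \\n"
```

This is why escape-heavy docstrings break sample-code checkers unless they are raw strings, as the bullets above describe.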
  23. 29 Aug 2020, 1 commit
    • Adadelta Optimizer (#26590) · a1b99fae
      Committed by Jiawei Wang
      * add doc; notest
      
      * fix doc; notest
      
      * update doc; notest
      
      * refine optimizer && adam
      
      * refine optimizer; notest
      
      * add adam
      
      * fix doc
      
      * fix doc && add adamw; notest
      
      * add error message
      
      * bug fix
      
      * refine rmsprop && adamax
      
      * fix ci
      
* bug fix
      
      * update comment
      
      * unify arguments place; notest
      
      * fix ut, test=develop
      
      * bug fix
      
      * fix conflicts, test=develop
      
      * add examples code
      
      * bug fix
      
      * fix comments
      
      * fix sample code
      
      * add sample code for Optimizer
      
      * add adamax ut, test=develop
      
      * fix rmsprop ut, test=develop
      
      * add ut for optimizer.py and adamw.py
      
      * first commit of adadelta optimizer
      
      * fix learning rate
      
      * fix adadelta doc and add sgd momentum
      
      * remove unused fluid
      
      * fix codestyle
      
      * Update test_adam_op.py
      
      * Update test_adam_op.py
      
      * fix SGD in 2 unittests
      
      * fix SGD in 2 unittests
      
      * fix ci
      
      * fix ut
Co-authored-by: MRXLT <xlt2024@gmail.com>
Co-authored-by: mapingshuo <mps2012@yeah.net>
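The Adadelta update rule this commit adds can be sketched roughly as follows. This is an independent scalar illustration of the standard algorithm, not the actual Paddle implementation; the `rho` and `epsilon` values are common defaults, not taken from the commit:

```python
def adadelta_step(param, grad, avg_sq_grad, avg_sq_update,
                  rho=0.95, epsilon=1e-6):
    """One Adadelta step on a scalar parameter.

    Returns the updated parameter and the two running accumulators.
    """
    # Decaying average of squared gradients.
    avg_sq_grad = rho * avg_sq_grad + (1.0 - rho) * grad * grad
    # Update scaled by the ratio of RMS(previous updates) to RMS(gradients),
    # so no explicit learning rate is required.
    update = -((avg_sq_update + epsilon) ** 0.5 /
               (avg_sq_grad + epsilon) ** 0.5) * grad
    # Decaying average of squared updates.
    avg_sq_update = rho * avg_sq_update + (1.0 - rho) * update * update
    return param + update, avg_sq_grad, avg_sq_update

# Minimize f(x) = x**2 starting from x = 1.0; the gradient is 2 * x.
x, eg, ex = 1.0, 0.0, 0.0
for _ in range(500):
    x, eg, ex = adadelta_step(x, 2.0 * x, eg, ex)
```

After a few hundred steps `x` drifts toward the minimum at 0, slowly at first because both accumulators start at zero.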