1. 22 July 2023: 2 commits
  2. 20 July 2023: 4 commits
  3. 19 July 2023: 4 commits
  4. 14 July 2023: 1 commit
    • [AutoTuner] Distribute best cfg (#54834) · 7f6d222f
      Committed by caozhou
      * distribute best cfg
      * adapt to multi-arg transmission
      * update metric extraction
      * fix bugs in pruning and log reading
      * fix time default value
      * remove time record
      * adjust the order of search dims
      * fix pruning bugs
      * fix bug when adding cfg
      * fix multi-node bug
      * reset status
      * remove alarm and set logdir
      * deepcopy ctx
      * change alarm
      * fix restart bug
      * add exit
      * no alarm needed for the best cfg
      * add warmup time
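      The bullets above describe distributing the configuration the AutoTuner judged best to every node before the final run. As a rough illustration only (the actual AutoTuner code path and config schema are not shown in this log, and the field names below are hypothetical), one way to broadcast such a config from rank 0 to all ranks with Paddle's collective API:

      import paddle.distributed as dist

      def distribute_best_cfg(best_cfg=None, src=0):
          """Broadcast the best config (a plain dict) from rank `src` to every rank."""
          holder = [best_cfg]
          dist.broadcast_object_list(holder, src=src)
          return holder[0]

      if __name__ == "__main__":
          dist.init_parallel_env()
          # Hypothetical tuning fields; the real AutoTuner cfg schema may differ.
          cfg = None
          if dist.get_rank() == 0:
              cfg = {"dp_degree": 2, "mp_degree": 2, "micro_batch_size": 4}
          cfg = distribute_best_cfg(cfg)
          print(f"rank {dist.get_rank()} will train with: {cfg}")

      Launched with "python -m paddle.distributed.launch --gpus 0,1 this_script.py", every rank ends up holding the same dict.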
  5. 13 July 2023: 5 commits
  6. 11 July 2023: 3 commits
    • support sharding parallel (#54634) · b7a05057
      Committed by pangengzheng
      * support sharding parallel
      * fix name
      * fix
      * update
      * test amp for sharding

      ---------
      Co-authored-by: pangengzheng <pangengzheng.baidu.com>
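      The commit above adds sharding as one of the parallel dimensions in the dygraph hybrid-parallel path. A minimal sketch of switching it on through fleet's DistributedStrategy, assuming a Paddle build whose hybrid_configs accepts a sharding_degree key (the degrees below are illustrative, not taken from the commit):

      import paddle
      from paddle.distributed import fleet

      strategy = fleet.DistributedStrategy()
      strategy.hybrid_configs = {
          "dp_degree": 1,
          "mp_degree": 2,
          "pp_degree": 1,
          "sharding_degree": 4,  # shard optimizer states/gradients across 4 ranks
      }
      fleet.init(is_collective=True, strategy=strategy)

      model = paddle.nn.Linear(1024, 1024)
      opt = paddle.optimizer.AdamW(learning_rate=1e-4, parameters=model.parameters())

      # fleet wraps both so updates follow the configured parallelism,
      # including the sharded optimizer state.
      model = fleet.distributed_model(model)
      opt = fleet.distributed_optimizer(opt)

      The "test amp for sharding" bullet suggests this is meant to compose with mixed precision (e.g. paddle.amp.auto_cast) as well.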
    • Pipeline pass base (#55174) · 5434560a
      Committed by Wennie396
      * format correction
      * variable name adjustments
      * variable name adjustments: name --> type, value --> sub_program
    • replace the AdagradOptimizer, AdamaxOptimizer, AdadeltaOptimizer, RMSPropOptimizer, LambOptimizer and Momentum (#54152) · 94365855
      Committed by LoneRanger
      * replace the AdadeltaOptimizer with Adadelta
      * replace the RMSPropOptimizer with RMSProp
      * replace the LambOptimizer with Lamb
      * replace the momentum in contrib/optimizer.py with Momentum in python/paddle/optimizer/momentum.py
      * fix bug
      * fix bug
      * fix bug
      * fix bug of Lamb
      * fix bug of Lamb
      * fix import bug
      * replace the AdamaxOptimizer with Adamax and change the optimizer base for AdagradOptimizer
      * fix bug
      * fix bug
      * Update optimizer.py
      * fix bug
      * fix bug
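      This commit swaps the legacy fluid-style optimizer classes (AdagradOptimizer, AdamaxOptimizer, AdadeltaOptimizer, RMSPropOptimizer, LambOptimizer, the contrib Momentum) for their paddle.optimizer counterparts. A small sketch of the replacement API, using Adadelta as the example; the hyperparameters are placeholders, not values from the commit:

      import paddle

      model = paddle.nn.Linear(10, 1)

      # New-style optimizers take `parameters` explicitly.
      opt = paddle.optimizer.Adadelta(learning_rate=0.001, parameters=model.parameters())
      # The other replacements follow the same pattern, e.g.
      #   paddle.optimizer.RMSProp(learning_rate=0.001, parameters=model.parameters())
      #   paddle.optimizer.Lamb(learning_rate=0.001, parameters=model.parameters())
      #   paddle.optimizer.Momentum(learning_rate=0.001, momentum=0.9, parameters=model.parameters())

      x = paddle.randn([4, 10])
      loss = model(x).mean()
      loss.backward()
      opt.step()
      opt.clear_grad()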
  7. 06 July 2023: 1 commit
  8. 03 July 2023: 1 commit
  9. 30 June 2023: 1 commit
  10. 29 June 2023: 3 commits
  11. 28 June 2023: 2 commits
  12. 27 June 2023: 2 commits
  13. 25 June 2023: 2 commits
  14. 20 June 2023: 2 commits
  15. 19 June 2023: 1 commit
  16. 16 June 2023: 3 commits
  17. 15 June 2023: 3 commits