1. 27 Sep 2022, 1 commit
  2. 20 Sep 2022, 1 commit
  3. 14 Sep 2022, 2 commits
  4. 29 Aug 2022, 1 commit
  5. 28 Jun 2022, 1 commit
  6. 27 Jun 2022, 1 commit
  7. 23 Jun 2022, 1 commit
  8. 05 Jun 2022, 1 commit
    • 【code format check upgrade】 step2: yapf (#42944) · a072fca8
      Committed by Sing_chan
      * use yapf to format all Python files
      * exclude two unittest files from yapf because they rely on writing and reading files, and formatting would break them
      * disable diff_py_file because too many diff files cause the following command to fail
      (a hedged usage sketch follows this entry)
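      A minimal sketch of the kind of check this step introduces: formatting a source string with yapf's Python API and reporting whether it would change. The style name "pep8" is an assumption, not necessarily the repository's actual configuration.

      from yapf.yapflib.yapf_api import FormatCode

      source = "def add(a,b):\n    return a+b\n"
      # FormatCode returns the formatted text plus a flag telling whether it differs.
      formatted, changed = FormatCode(source, style_config="pep8")
      if changed:
          print("file would be reformatted:\n" + formatted)
      else:
          print("file already conforms to the style")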
  9. 28 Apr 2022, 1 commit
  10. 25 Mar 2022, 2 commits
    • fix sync_bn error in fp16 amp-o2 (#40943) · 9ab3c76b
      Committed by zhangbo9674
      (a hedged sketch of the setup this fix targets follows this entry)
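      The fix itself is not described beyond the title; for context, a hedged sketch of the configuration it targets, combining SyncBatchNorm with AMP O2 in dygraph. The layers are illustrative, and SyncBatchNorm additionally needs an initialized distributed environment to actually synchronize statistics.

      import paddle

      model = paddle.nn.Sequential(paddle.nn.Conv2D(3, 8, 3),
                                   paddle.nn.BatchNorm2D(8))
      # Replace BatchNorm layers with SyncBatchNorm for multi-GPU training.
      model = paddle.nn.SyncBatchNorm.convert_sync_batchnorm(model)
      opt = paddle.optimizer.Momentum(parameters=model.parameters())
      # Level O2 casts the model to fp16 and keeps fp32 master weights.
      model, opt = paddle.amp.decorate(models=model, optimizers=opt, level='O2')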
    • Refactor Dygraph Flags (#40786) · 3085d5e4
      Committed by Jiabin Yang
      * refactor eager flags
      * fix flags error when we switch from eager to dygraph
      * fix CI problem
      * fix CI
      * fix CI
      * merge develop and fix code style
      * merge develop and fix code style
      * fix op test error
      * fix op test error
      * fix op test error
      * fix op test error
      * fix op test error
      * merge develop
  11. 16 Mar 2022, 1 commit
  12. 15 Mar 2022, 1 commit
  13. 07 Mar 2022, 1 commit
  14. 28 Feb 2022, 1 commit
  15. 27 Feb 2022, 1 commit
  16. 23 Feb 2022, 1 commit
  17. 22 Feb 2022, 1 commit
  18. 18 Feb 2022, 1 commit
    • [AMP] support GPU BF16 amp for dygraph (#39029) · 7d6d3848
      Committed by zhangbo9674
      * support dtype param for auto_cast
      * add amp_dtype for tracer
      * add unsupported bf16 list
      * support bf16 amp for O2
      * refine Python interface for bfloat16
      * refine code
      * refine code
      * refine unittest
      * refine code
      * refine code
      * add bf16 O1
      * refine code by comment
      * add gradient accumulator
      * add recompute
      (a hedged usage sketch follows this entry)
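      A minimal sketch of the dtype argument this change adds to auto_cast, assuming the current paddle.amp.auto_cast signature; the toy model and data are illustrative, and bf16 needs suitable GPU and toolkit support.

      import paddle

      model = paddle.nn.Linear(4, 4)
      x = paddle.rand([2, 4])
      # O1: ops on the white list run in bfloat16, the rest stay in fp32.
      with paddle.amp.auto_cast(dtype='bfloat16', level='O1'):
          out = model(x)
      out.mean().backward()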
  19. 11 Jan 2022, 1 commit
  20. 29 Dec 2021, 1 commit
  21. 28 Dec 2021, 1 commit
  22. 27 Dec 2021, 1 commit
  23. 15 Dec 2021, 1 commit
  24. 29 Nov 2021, 1 commit
  25. 24 Nov 2021, 1 commit
    • [Dy2stat] support pure fp16 for dy2stat (#36944) · 52edad6a
      Committed by 0x45f
      * run dy2stat pure fp16 in a Linear model
      * do not use self._pure_fp16_inputs
      * add test and fix Adam error in dy2stat pure fp16 training
      * use paddle.optimizer.Adam
      * run test on GPU
      * change test time for CI
      * enlarge atol for test_resnet_pure_fp16
      * refine code and enlarge atol
      * make custom_white_list and custom_black_list take effect for AMP and pure fp16
      * check tracer is not None
      * use default atol
      * change filter_size
      * change atol and add some NOTEs
      (a hedged usage sketch follows this entry)
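      A hedged sketch of the pattern this change enables: a to_static-converted model trained in pure fp16 (AMP O2). The model, shapes, and hyperparameters are illustrative, not taken from the PR's tests.

      import paddle

      model = paddle.jit.to_static(paddle.nn.Linear(16, 16))
      opt = paddle.optimizer.Adam(parameters=model.parameters())
      model, opt = paddle.amp.decorate(models=model, optimizers=opt, level='O2')
      scaler = paddle.amp.GradScaler(init_loss_scaling=1024)

      x = paddle.rand([4, 16])
      with paddle.amp.auto_cast(level='O2'):
          loss = model(x).mean()
      scaled = scaler.scale(loss)
      scaled.backward()
      scaler.minimize(opt, scaled)
      opt.clear_grad()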
  26. 09 Nov 2021, 1 commit
  27. 22 Oct 2021, 1 commit
    • [hapi] support dygraph amp O2 (#36441) · 08248db0
      Committed by Leo Chen
      * [hapi] support dygraph amp O2
      * fix problem of static pure fp16 in hapi
      * fix bug
      * fix format
      * fix ut
      * follow comments
      * update ut
      * update amp save/load
      * fix ut
      * refine code format
      (a hedged usage sketch follows this entry)
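      A hedged sketch of AMP O2 through the high-level hapi path (paddle.Model), which this change extends to dygraph. The network and the amp_configs value are illustrative; check the paddle.Model.prepare documentation for the exact accepted keys.

      import paddle

      net = paddle.nn.Linear(8, 2)
      model = paddle.Model(net, inputs=paddle.static.InputSpec([None, 8], 'float32'))
      opt = paddle.optimizer.Adam(parameters=model.parameters())
      # amp_configs selects the AMP level; 'O2' means pure fp16 with master weights.
      model.prepare(optimizer=opt,
                    loss=paddle.nn.CrossEntropyLoss(),
                    amp_configs='O2')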
  28. 13 Oct 2021, 2 commits
  29. 22 Sep 2021, 1 commit
  30. 17 Sep 2021, 1 commit
    • [AMP] Support pure fp16 training mode for dygraph (#35521) · adaeee4d
      Committed by zhangbo9674
      * add the major pure fp16 functionality in auto_cast & tracer
      * support master weight in dygraph for pure fp16
      * check mixed fp16 & fp32 dtypes for the check_finite_and_unscale op
      * change pure fp16 function name
      * fix some bugs in auto_cast
      * refine auto_cast interface logic
      * add param _casted_by_pure_fp16 for class Layer
      * support a state_dict hook to save the model in a user-specified dtype in pure_fp16_decorator
      * refine pure_fp16_decorator as a decorator
      * add unittest
      * add comment
      * add comment
      * support recompute
      * add comments for auto_cast and the decorator
      * support to_static_state_dict for paddle.jit.save
      * remove the limit on the number of models and optimizers
      * add lookup_table to the black list
      * fix momentum and layer state_dict
      * fix bug in layer state_dict
      * fix bug in layer state_dict_helper
      * refine unittest
      * refine test_momentum_op
      * refine interface and some code
      * refine amp_decorator interface
      * refine pure fp16 interface
      * refine master weight interface
      (a hedged training-step sketch follows this entry)
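      A minimal sketch of the pure fp16 (O2) dygraph training step these changes enable, written against the current paddle.amp API: decorate casts the model to fp16 and keeps fp32 master weights, auto_cast(level='O2') runs most ops in fp16, and GradScaler handles loss scaling. The model, data, and hyperparameters are illustrative.

      import paddle

      model = paddle.nn.Linear(32, 32)
      opt = paddle.optimizer.Momentum(parameters=model.parameters(),
                                      multi_precision=True)   # keep fp32 master weights
      model, opt = paddle.amp.decorate(models=model, optimizers=opt, level='O2')
      scaler = paddle.amp.GradScaler(init_loss_scaling=2.**15)

      for _ in range(3):
          x = paddle.rand([8, 32])
          with paddle.amp.auto_cast(level='O2'):
              loss = model(x).mean()
          scaled = scaler.scale(loss)
          scaled.backward()
          scaler.minimize(opt, scaled)   # unscales grads and skips the step on inf/nan
          opt.clear_grad()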
  31. 10 Sep 2021, 1 commit
  32. 05 Aug 2021, 1 commit
  33. 05 Jul 2021, 1 commit
  34. 29 Jun 2021, 1 commit
  35. 21 Jun 2021, 1 commit
  36. 18 Nov 2020, 1 commit
  37. 13 Aug 2020, 1 commit
    • Feature/Enable Auto-Mixed-Precision in dynamic graph (#24903) · 2d95280e
      Committed by Leo Chen
      * add auto_cast, test=develop
      * add loss scaler, test=develop
      * add comments, test=develop
      * refine code, test=develop
      * refine code, test=develop
      * do not set flags automatically, test=develop
      * fix custom op bug, test=develop
      * add more tests, test=develop
      * refine enable logic, test=develop
      * enable amp test with GPU, test=develop
      * add unittest
      * add test for found_inf
      * follow comments
      * follow comments
      * remove global variable, use singleton
      * add some notes
      * update comments
      * update comments
      * update comments
      * add use_dynamic_loss_scaling argument
      * refine found_inf
      * refine found_inf
      (a hedged usage sketch follows this entry)
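      A hedged sketch of the O1 auto-mixed-precision flow this feature introduces, written against the current paddle.amp names (the original PR predates them): auto_cast picks per-op dtypes and GradScaler is the dynamic loss scaler. Values such as init_loss_scaling are illustrative.

      import paddle

      model = paddle.nn.Linear(16, 16)
      opt = paddle.optimizer.SGD(learning_rate=0.01, parameters=model.parameters())
      scaler = paddle.amp.GradScaler(init_loss_scaling=1024,
                                     use_dynamic_loss_scaling=True)

      x = paddle.rand([4, 16])
      with paddle.amp.auto_cast():       # O1: white-list ops run in fp16
          loss = model(x).mean()
      scaled = scaler.scale(loss)        # scale the loss to avoid fp16 underflow
      scaled.backward()
      scaler.minimize(opt, scaled)       # unscale, check found_inf, then step
      opt.clear_grad()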