1. 01 Jun 2022, 3 commits
    • [AutoParallel & Science] Miscellaneous improvements (#43139) · f59bcb1c
      Committed by JZ-LIANG
      * adapt for 10 loss
      
      * partitioner support optimizer
    • add some comp op costs (#43114) · bd018360
      Committed by caozhou
    • [Auto Parallel] Add miscellaneous improvements (#43108) · 010aba33
      Committed by Yulong Ao
      * [Auto Parallel] Add the parallel tuner
      
      * [Auto Parallel] Improve the parallel tuner and fix some bugs
      
      * update cost model
      
      * update import Resharder by dist op
      
      * update cost model
      
      * fix comp cost bug
      
      * update cost model
      
      * [Auto Parallel] Amend the dist attr for #processes=1
      
      * update cost model and tuner
      
      * update cost model and tuner
      
      * update cost model and tuner
      
      * update cluster
      
      * update reshard
      
      * [Auto Parallel] Add the estimation from the cost model
      
      * [Auto Parallel] Reimplement the backup and restore functions
      
      * [Auto Parallel] Fix the bugs of the parallel tuner
      
      * [Auto Parallel] Update the engine api and dist context
      
      * [Auto Parallel] Work around the high order grad problem
      
      * [Auto Parallel] Add some miscellaneous improvements
      
      * [Auto Parallel] Add a unittest for DistributedContext
      Co-authored-by: caozhou <caozhou@radi.ac.cn>
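The parallel tuner commits above add an estimation step driven by a cost model: candidate distributed strategies are scored and the cheapest is kept. The following is a minimal, hypothetical sketch of that idea; the names (`Strategy`, `estimate_cost`, `tune`) and the additive cost model are illustrative assumptions, not Paddle's actual auto-parallel API.

```python
# Hypothetical sketch of cost-model-driven strategy tuning.
# All names and the additive cost model are illustrative, not Paddle's API.
from dataclasses import dataclass

@dataclass
class Strategy:
    """A candidate parallelization with its estimated per-step costs (ms)."""
    name: str
    compute_cost: float  # estimated compute time per step
    comm_cost: float     # estimated communication time per step

def estimate_cost(strategy: Strategy) -> float:
    """Total estimated step time; this toy model just sums the parts,
    while a real model would account for compute/communication overlap."""
    return strategy.compute_cost + strategy.comm_cost

def tune(candidates: list[Strategy]) -> Strategy:
    """Return the candidate with the lowest estimated cost."""
    return min(candidates, key=estimate_cost)

candidates = [
    Strategy("dp", compute_cost=10.0, comm_cost=4.0),    # data parallel
    Strategy("mp", compute_cost=6.0, comm_cost=9.0),     # model parallel
    Strategy("dp_mp", compute_cost=7.0, comm_cost=5.0),  # hybrid
]
best = tune(candidates)
print(best.name)  # dp_mp: lowest total estimated cost (12.0)
```

The backup/restore functions mentioned in the same commit fit this loop naturally: the tuner can snapshot the program before trying a candidate and restore it if the estimate is worse.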
  2. 30 May 2022, 1 commit
  3. 19 May 2022, 2 commits
  4. 18 May 2022, 1 commit
  5. 13 May 2022, 1 commit
  6. 12 May 2022, 1 commit
  7. 10 May 2022, 2 commits
  8. 07 May 2022, 1 commit
  9. 06 May 2022, 1 commit
    • [AutoParallel] adapt for 2d laplace (#41601) · c043a21b
      Committed by zhaoyingli
      * add default_ctx in backward.py
      
      * record grad_var_to_var with grad_times
      
      * fix backward
      
      * update annotation
      
      * add complete_high_order_grad in complete_forward
      
      * add dist slice op
      
      * update grad_var_to_var type
      
      * update partition_block init mapping before loss op
      
      * update compatible for 'XShape' & update 'allreduce_vars'
      
      * add dist reshape op when input dim equal to output dim
      
      * update 'set_grad_var_shape' with grad_var_to_var
      
      * fix dist slice
      
      * fix set_grad_var_shape
      
      * add dist pnorm op
      
      * fix dist pnorm dist_attr
      
      * fix engine startprogram & adapt highorder grad
      
      * fix set_grad_var_shape when mp
      
      * update unittest
      
      * update cmakelist
      
      * default strategy in engine: dp
      
      * bug fix
      
      * tiny fix
      
      * flatten outputs
      
      * fix default strategy
      
      * init default ctx
      
      * tiny fix
      
      * test=allcase
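Several bullets above revolve around recording `grad_var_to_var` together with `grad_times`: when autodiff runs more than once (higher-order gradients, as in a 2-D Laplace problem), each backward pass needs its own mapping from a gradient variable back to the variable it differentiates. A hedged sketch of that bookkeeping, with illustrative names and the `@GRAD` suffix convention assumed from Paddle's naming style:

```python
# Hypothetical sketch of "record grad_var_to_var with grad_times":
# one forward-to-grad mapping per differentiation pass. Illustrative only.

def record_grad_var_to_var(grad_var_to_var: dict, grad_times: int,
                           forward_var: str, grad_var: str) -> None:
    """Map a gradient variable back to the variable it differentiates,
    bucketed by how many times autodiff has run (grad_times)."""
    grad_var_to_var.setdefault(grad_times, {})[grad_var] = forward_var

mapping = {}
# First backward pass: u -> u@GRAD
record_grad_var_to_var(mapping, 1, "u", "u@GRAD")
# Second backward pass (higher-order grad): u@GRAD -> u@GRAD@GRAD
record_grad_var_to_var(mapping, 2, "u@GRAD", "u@GRAD@GRAD")

print(mapping[2]["u@GRAD@GRAD"])  # u@GRAD
```

With this per-pass bucketing, a step such as "update 'set_grad_var_shape' with grad_var_to_var" can look up the correct source variable for a gradient even when the same name pattern appears in multiple passes.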
  10. 19 Apr 2022, 2 commits
  11. 18 Apr 2022, 3 commits
  12. 15 Apr 2022, 1 commit
  13. 30 Mar 2022, 2 commits
  14. 28 Mar 2022, 2 commits
  15. 25 Mar 2022, 2 commits
  16. 24 Mar 2022, 1 commit
  17. 23 Mar 2022, 2 commits
  18. 16 Mar 2022, 1 commit
    • [Auto Parallel] Add the support for the auto completion of while_op (#39939) · ec6b8fbd
      Committed by Yulong Ao
      * [Auto Parallel] Support the auto completion of while_op
      
      * [Auto Parallel] Improve the completion algorithms
      
      * [Auto Parallel] Fix bugs for ernie inference
      
      * [Auto Parallel] Remove attrs which cannot be pickled
      
      * [Auto Parallel] make the dims_mappings of LodTensorArray vars empty
      
      * [Auto Parallel] Fix bugs for the ernie inference in the pipeline parallel
      
      * [Auto Parallel] Remove unnecessary comments
      
      * [Auto Parallel] Fix a bug of the CMakeLists
      
      * [Auto Parallel] Use the newest APIs to write the unit test
      
      * [Auto Parallel] Remove unnecessary statements
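One detail called out above is making the dims_mappings of LoDTensorArray variables empty during auto completion. The sketch below illustrates the general idea under stated assumptions: `init_dims_mapping` is a hypothetical helper, and the convention that `-1` means "replicated along this dimension" is assumed from common auto-parallel notation, not quoted from Paddle's implementation.

```python
# Hedged sketch: initial dims_mapping assignment during auto completion.
# Tensor-array variables (used by while_op) get an empty mapping, since
# per-dimension sharding is not meaningful for them; ordinary tensors get
# one entry per dimension, with -1 meaning "replicated". Illustrative only.

def init_dims_mapping(var_type: str, ndim: int) -> list[int]:
    """Return the initial dims_mapping for a variable during completion."""
    if var_type == "LOD_TENSOR_ARRAY":
        return []            # no per-dim sharding for tensor arrays
    return [-1] * ndim       # start fully replicated; completion refines it

print(init_dims_mapping("LOD_TENSOR", 2))        # [-1, -1]
print(init_dims_mapping("LOD_TENSOR_ARRAY", 2))  # []
```

Starting every ordinary tensor fully replicated lets the completion pass propagate sharding decisions from annotated operators without ever producing an inconsistent mapping for variables it cannot shard.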
  19. 15 Mar 2022, 2 commits
  20. 14 Mar 2022, 1 commit
  21. 10 Mar 2022, 1 commit
  22. 07 Mar 2022, 1 commit
  23. 02 Mar 2022, 1 commit
  24. 24 Feb 2022, 2 commits
  25. 22 Feb 2022, 2 commits
  26. 18 Feb 2022, 1 commit