1. 03 Dec 2020 (2 commits)
    • [Cherry-pick] Add pure fp16 training with master weights. (#29301) · d8ea8a06
      Zhen Wang committed
      * Add pure fp16 training with master weights. (#27712)
      
      * add the weight decay func for the momentum op
      
      * Add the multi_precision function in Momentum Optimizer.
      
      * Make sure that the initial value of the master weights is the same as the fp16 weights.
      
      * add static loss scaling.
      
      * add the rescale_grad function in the pure fp16 training.
      
      * use the original momentum updating method.
      
      * Polish some code, such as variable names.
      
      * add docstring for apis.
      
      * update the var creation details of _create_master_weight.
      
      * do not modify code for imperative momentum updating.
      
      * Fix the error of test_dist_sparse_tensor_load_momentum UT.
      
      * add unit test for multi precision fp16 training.
      
      * add more unit tests for CI.
      
      * Use lower threshold values for allclose comparisons in the test_multi_precision_fp16_train UT.
    • [Dy2stat] Fix PaddleGan Deoldify Model Dy2stat Problems (#29226) (#29281) · 32c139d3
      Huihuang Zheng committed
      Cherry-pick of PR #29226
  2. 02 Dec 2020 (2 commits)
  3. 01 Dec 2020 (4 commits)
  4. 30 Nov 2020 (11 commits)
  5. 28 Nov 2020 (4 commits)
  6. 27 Nov 2020 (12 commits)
  7. 26 Nov 2020 (5 commits)
    • add paddle.broadcast_to api, which is an alias of paddle.expand (#28706) · 449903de
      lilong12 committed
      * update, test=develop
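`paddle.broadcast_to` expands a tensor to a target shape by repeating size-1 dimensions. NumPy has a function of the same name with the same semantics, so a NumPy illustration (not Paddle itself, which may not be installed here):

```python
import numpy as np

# Same idea as paddle.broadcast_to / paddle.expand: dimensions of size 1
# are repeated (without copying per-element) to reach the target shape.
x = np.array([[1], [2], [3]])     # shape (3, 1)
y = np.broadcast_to(x, (3, 4))    # shape (3, 4); each row repeats its value
print(y)
```

In Paddle the call would be `paddle.broadcast_to(x, shape=[3, 4])`, with `paddle.expand` as the equivalent spelling this commit aliases.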
    • Split train_mode and has_grad for tracer (#29064) · 770395cb
      Leo Chen committed
      * split train_mode and has_grad
      
      * fix format
      
      * fix ci problems
      
      * fix sample code
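The commit title above separates two tracer flags that are conceptually independent: whether the program is in train mode (affecting ops like dropout) and whether gradients are recorded. A toy sketch of why the split helps (this is an invented illustration, not Paddle's `Tracer` class):

```python
from contextlib import contextmanager

class Tracer:
    """Toy tracer with the two independent flags from the commit title.

    Hypothetical sketch: train_mode controls training-only behaviors
    (e.g. dropout), while has_grad controls whether ops are recorded for
    autograd. Keeping them separate allows gradient-free forward passes
    while still in train mode, and vice versa.
    """
    def __init__(self):
        self.train_mode = True
        self.has_grad = True

    @contextmanager
    def no_grad(self):
        # Disable gradient recording without touching train_mode.
        prev = self.has_grad
        self.has_grad = False
        try:
            yield
        finally:
            self.has_grad = prev

tracer = Tracer()
with tracer.no_grad():
    inside = (tracer.train_mode, tracer.has_grad)  # train mode kept, grads off
after = (tracer.train_mode, tracer.has_grad)       # grad recording restored
```

With a single combined flag, `no_grad` would have to also flip the model out of train mode, which is the coupling this commit removes.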
    • disable ut test_static_save_load (#29119) · 27d04a3b
      YUNSHEN XIE committed
    • [sharding] doc, api, bug fixes (#28983) · 0dadacc4
      JZ-LIANG committed
      * add lars to fleet meta optimizer
      
      * add lamb to proto
      
      * add lamb to fleet meta optimizer
      
      * fixed syntax bug
      
      * fixed syntax bug
      
      * fixed syntax error in lamb, add config setter of lamb in distributed_strategy
      
      * trigger unit test to rerun
      
      * add new unit test func for lamb
      
      * revise unit tests for lars and lamb
      
      * revise dgc meta unit test
      
      * revise lars document in distribute_strategy
      
      * revise lars lamb document in distributed_strategy.py
      
      * revise lars lamb document in distributed_strategy.py
      
      * add weight decay exclude logic to lars
      
      * restore optimizer.py
      
      * restore optimizer.py as develop except lars
      
      * add epsilon and exclude fn to distributed_strategy
      
      * add lars epsilon
      
      * revise unit tests for fleet lars and lamb
      
      * revise lars lamb unit tests for CI coverage
      
      * revise lars argument api
      
      * revise lars argument api
      
      * revise lars argument api
      
      * revise api doc of lars
      
      * fix op role
      
      * add sharding save and add_sync_comm_for_test function
      
      * add comm_analyse to utils
      
      * revise sharding_utils
      
      * add sharding saving unittest
      
      * revise sharding utils for unittest
      
      * revise sharding en doc
      
      * update sharding utils api
      
      * add doc for sharding
      
      * fixed bug in sharding var size count
      
      * update varsize count in sharding
      
      * fix sharding num_nccl_comm
      
      * Revert "fix sharding num_nccl_comm"
      
      This reverts commit d51587c15e9323acf226ddd36154275f0d1daf76.
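Several of the messages above concern LARS (layer-wise adaptive rate scaling, You et al.): adding an `epsilon` argument and weight-decay exclude logic. A hedged NumPy sketch of the standard LARS local learning-rate rule with such an exclude hook (illustrative names, not Paddle's `lars_momentum` op):

```python
import numpy as np

def lars_update(w, grad, lr=0.1, weight_decay=1e-4, epsilon=1e-8,
                exclude_fn=None):
    """One LARS step; a sketch of the rule the commits reference.

    exclude_fn (hypothetical) decides whether this parameter skips weight
    decay -- the "weight decay exclude logic" mentioned above, typically
    used for biases and normalization parameters.
    """
    wd = 0.0 if (exclude_fn is not None and exclude_fn(w)) else weight_decay
    w_norm = np.linalg.norm(w)
    g_norm = np.linalg.norm(grad)
    # Layer-wise trust ratio; epsilon guards against division by zero.
    local_lr = lr * w_norm / (g_norm + wd * w_norm + epsilon)
    return w - local_lr * (grad + wd * w)

w = np.array([3.0, 4.0])     # ||w|| = 5
g = np.array([0.0, 1.0])     # ||g|| = 1
w_new = lars_update(w, g, lr=0.1, weight_decay=0.0)
# local_lr ~= 0.1 * 5 / 1 = 0.5, so only half the raw gradient is applied
```

Scaling the step by ||w|| / ||g|| per layer keeps updates proportional to the weights' own magnitude, which is what makes very large batch sizes trainable.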
    • fix crypto ut test error for windows ci (#29090) · dd417750
      Yanghello committed