- 22 Mar 2021 (14 commits)
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
Committed by sandyhouse
- 18 Mar 2021 (1 commit)
- 20 Feb 2021 (1 commit)
Committed by gongweibao
Fix reshape on GE graph
- 08 Feb 2021 (1 commit)
Committed by gongweibao
Destroy session first.
- 01 Feb 2021 (2 commits)
Committed by gongweibao
Add support for Paddle distributed training on Ascend
Committed by OleNet
Ascendrc: add converted ops [range/equal/range/uniform_random/expand/squeeze], fix cast op bug (#30797)
- 29 Jan 2021 (2 commits)
Committed by dingsiyu
Merge ascend_optimizer and ascend_parser.
Committed by gongweibao
code style
- 25 Jan 2021 (1 commit)
Committed by Void Main
[Feature] Build parser to support distributed training
- 21 Jan 2021 (3 commits)
Committed by gongweibao
Pass device_ids info from launch to trainer
Committed by Void Main
Build parser for Hcom* operators
Committed by gongweibao
Add distribution support
- 15 Jan 2021 (1 commit)
Committed by hutuxian
- 12 Jan 2021 (2 commits)
Committed by JZ-LIANG
Committed by Chengmo
* add save tensor support
Co-authored-by: seiriosPlus <tangwei12@baidu.com>
- 08 Jan 2021 (1 commit)
Committed by Chengmo
* add tensor table
- 05 Jan 2021 (1 commit)
Committed by WangXi
- 25 Dec 2020 (1 commit)
Committed by lilong12
* update, test=develop
- 24 Dec 2020 (1 commit)
Committed by tangwei12
* oneps (3/4)
Co-authored-by: MrChengmo <cmchengmo@163.com>
Co-authored-by: malin10 <malin10@baidu.com>
Co-authored-by: chengmo <chengmo@baidu.com>
- 17 Dec 2020 (1 commit)
Committed by WangXi
- 11 Dec 2020 (1 commit)
Committed by JZ-LIANG
* Sharding: add hybrid-dp feature
* update sharding in distributed_strategy
* update sharding unittest
* revise code format for sharding
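The sharding feature above is configured through fleet's DistributedStrategy. A minimal sketch of that configuration path, assuming the Paddle 2.x collective API; the sharding_configs keys are illustrative assumptions, not taken from this commit:

```python
# Hedged sketch of enabling sharding via fleet.DistributedStrategy.
# The config key below is an assumption for illustration; consult the
# DistributedStrategy docs of your Paddle release for exact names.
import paddle
import paddle.distributed.fleet as fleet

fleet.init(is_collective=True)

strategy = fleet.DistributedStrategy()
strategy.sharding = True
strategy.sharding_configs = {
    "fuse_broadcast_MB": 32,  # assumed knob: fuse small broadcasts up to 32 MB
}

model = paddle.nn.Linear(128, 10)
opt = paddle.optimizer.Momentum(
    learning_rate=0.01, parameters=model.parameters())
# Wrapping the optimizer lets fleet apply the sharding meta-optimizer.
opt = fleet.distributed_optimizer(opt, strategy=strategy)
```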
- 30 Nov 2020 (1 commit)
Committed by WangXi
- 26 Nov 2020 (2 commits)
Committed by JZ-LIANG
* add lars to fleet meta optimizer
* add lamb to proto
* add lamb to fleet meta optimizer
* fixed syntax bug
* fixed syntax bug
* fixed syntax error in lamb, add config setter of lamb in distributed_strategy
* trigger unittest to rerun
* add new unittest func for lamb
* revise unittest for lars and lamb
* revise dgc meta unittest
* revise lars document in distributed_strategy
* revise lars lamb document in distributed_strategy.py
* revise lars lamb document in distributed_strategy.py
* add weight decay exclude logic to lars
* restore optimizer.py
* restore optimizer.py as develop except lars
* add epsilon and exclude fn to distributed_strategy
* add lars epsilon
* revise unittest for fleet lars and lamb
* revise lars lamb unittest for CI coverage
* revise lars argument api
* revise lars argument api
* revise lars argument api
* revise api doc of lars
* fix op role
* add sharding save and add_sync_comm_for_test function
* add comm_analyse to utils
* revise sharding_utils
* add sharding saving unittest
* revise sharding utils for unittest
* revise sharding en doc
* update sharding utils api
* add doc for sharding
* fixed bug in sharding var size count
* update varsize count in sharding
* fix sharding num_nccl_comm
* Revert "fix sharding num_nccl_comm" (this reverts commit d51587c15e9323acf226ddd36154275f0d1daf76)
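The LARS options this commit wires into distributed_strategy are toggled the same way as sharding. A minimal sketch, assuming the Paddle 2.x key names (lars_coeff, lars_weight_decay, epsilon, exclude_from_weight_decay); treat the exact names and values as assumptions:

```python
# Hedged sketch of the LARS meta-optimizer switch described above.
# Key names and values are assumptions for illustration.
import paddle
import paddle.distributed.fleet as fleet

fleet.init(is_collective=True)

strategy = fleet.DistributedStrategy()
strategy.lars = True
strategy.lars_configs = {
    "lars_coeff": 0.001,
    "lars_weight_decay": 0.0005,
    "epsilon": 0.0,  # the "lars epsilon" item in the list above
    # the weight-decay exclude logic from the list above:
    "exclude_from_weight_decay": ["batch_norm", ".b_0"],
}

model = paddle.nn.Linear(128, 10)
opt = paddle.optimizer.Momentum(
    learning_rate=0.01, parameters=model.parameters())
opt = fleet.distributed_optimizer(opt, strategy=strategy)
```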
Committed by WangXi
- 24 Nov 2020 (1 commit)
Committed by Leo Chen
* upgrade comment string to raw string
* fix string in
* fix string with ' '
* revert update on comments
* upgrade only necessary
* fix sample code checker
* fix comments with '''
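For context on the raw-string upgrade: in a plain string literal, a backslash sequence such as \alpha is consumed as an escape, silently corrupting math in docstrings. A minimal illustration with an invented function:

```python
# In a plain docstring, "\alpha" is read as the bell escape "\a"
# followed by "lpha"; the r-prefix keeps every backslash literal.
# Function name and body are invented for illustration.
def scale(x, factor):
    r"""Scale ``x`` by ``factor``.

    Computes :math:`y = \alpha x`, where :math:`\alpha` is ``factor``.
    """
    return factor * x
```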
- 23 Nov 2020 (1 commit)
Committed by lilong12
* update, test=develop
- 18 Nov 2020 (1 commit)
Committed by JZ-LIANG
* add lars to fleet meta optimizer
* add lamb to proto
* add lamb to fleet meta optimizer
* fixed syntax bug
* fixed syntax bug
* fixed syntax error in lamb, add config setter of lamb in distributed_strategy
* trigger unittest to rerun
* add new unittest func for lamb
* revise unittest for lars and lamb
* revise dgc meta unittest
* revise lars document in distributed_strategy
* revise lars lamb document in distributed_strategy.py
* revise lars lamb document in distributed_strategy.py
* add weight decay exclude logic to lars
* restore optimizer.py
* restore optimizer.py as develop except lars
* add epsilon and exclude fn to distributed_strategy
* add lars epsilon
* revise unittest for fleet lars and lamb
* revise lars lamb unittest for CI coverage
* revise lars argument api
* revise lars argument api
* revise lars argument api
* revise api doc of lars
* fix op role
* add sharding save and add_sync_comm_for_test function
* add comm_analyse to utils
* revise sharding_utils
* add sharding saving unittest
* revise sharding utils for unittest