- 11 May 2021 (1 commit)
  Committed by ShenLiang
  Fix error log for reducer; fix doc; fix unit-test bug; fix spawn; fix coverage.
- 25 April 2021 (1 commit)
  Committed by lilong12
  Update.
- 20 April 2021 (1 commit)
  Committed by JZ-LIANG
  Sharding: update config doc; update pipeline config; update sharding doc.
- 17 April 2021 (1 commit)
  Committed by ShenLiang
  Add model parallel support in dygraph.
- 01 April 2021 (1 commit)
  Committed by ShenLiang
  Support control flow; support sync_parameters_buffers; fix a bug in sparse embedding.
- 24 February 2021 (1 commit)
  Committed by lilong12
  Update (test=develop).
- 01 February 2021 (1 commit)
  Committed by WangXi
- 12 January 2021 (1 commit)
  Committed by JZ-LIANG
- 09 December 2020 (1 commit)
  Committed by ShenLiang
  Add tensor_indices to AssignGroupBySize; add group rebuilding in the reducer.
- 01 December 2020 (2 commits)
- 26 November 2020 (1 commit)
  Committed by JZ-LIANG
  * add lars to fleet meta optimizer
  * add lamb to proto and to fleet meta optimizer
  * fix syntax bugs; fix a syntax error in lamb; add a config setter for lamb in distributed_strategy
  * trigger unit tests to rerun; add a new unit-test function for lamb
  * revise unit tests for lars and lamb; revise the dgc meta unit test
  * revise the lars document in distribute_strategy; revise the lars/lamb documents in distributed_strategy.py
  * add weight-decay exclusion logic to lars
  * restore optimizer.py as develop, except lars
  * add epsilon and an exclude function to distributed_strategy; add lars epsilon
  * revise unit tests for fleet lars and lamb; revise lars/lamb unit tests for CI coverage
  * revise the lars argument API and its API doc
  * fix op role
  * add a sharding save function and add_sync_comm_for_test; add comm_analyse to utils
  * revise sharding_utils and its API; add a sharding saving unittest; revise sharding utils for the unittest
  * revise the sharding English doc; add docs for sharding
  * fix a bug in the sharding variable size count; update the variable size count in sharding
  * fix sharding num_nccl_comm, then revert it ("Revert 'fix sharding num_nccl_comm'", reverting commit d51587c15e9323acf226ddd36154275f0d1daf76)
  (A hedged configuration sketch for the lars/lamb strategy options follows below.)
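
A hedged sketch of how the lars/lamb switches from the entry above are typically set on fleet's DistributedStrategy. The config-key names (lars_coeff, epsilon, exclude_from_weight_decay, lamb_weight_decay, ...) are assumptions drawn from this changelog; verify them against the DistributedStrategy documentation of the Paddle release in use.

```python
# Hedged sketch: enabling LARS or LAMB through fleet's DistributedStrategy.
# Config-key names are assumptions; check your Paddle version's docs.
from paddle.distributed import fleet

strategy = fleet.DistributedStrategy()

# LARS with an epsilon term and weight-decay exclusions.
strategy.lars = True
strategy.lars_configs = {
    "lars_coeff": 0.001,
    "lars_weight_decay": 0.0005,
    "epsilon": 0.0,
    "exclude_from_weight_decay": ["batch_norm", ".b_0"],
}

# Alternatively, LAMB (enable one of the two, not both):
# strategy.lamb = True
# strategy.lamb_configs = {
#     "lamb_weight_decay": 0.01,
#     "exclude_from_weight_decay": ["layer_norm"],
# }
```

The strategy object is then passed to fleet.distributed_optimizer(...) together with a base optimizer in a job started via paddle.distributed.launch.
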
- 24 November 2020 (1 commit)
  Committed by Leo Chen
  Upgrade comment strings to raw strings; fix string in; fix strings with ' '; revert the update on comments; upgrade only where necessary; fix the sample-code checker; fix comments with '''.
- 26 October 2020 (1 commit)
  Committed by mapingshuo
  Add sharding.
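
The sharding meta optimizer added here is driven by the same strategy object. A hedged sketch follows; the config key shown is an assumption (its name has changed across Paddle releases), so treat it as illustrative only.

```python
# Hedged sketch: turning on the sharding meta optimizer.
# The config key is an assumption; the exact name differs across releases.
from paddle.distributed import fleet

strategy = fleet.DistributedStrategy()
strategy.sharding = True
strategy.sharding_configs = {
    "fuse_broadcast_MB": 32,  # assumed: size of fused broadcast segments, in MB
}
```
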
- 22 October 2020 (1 commit)
  Committed by WangXi
- 12 October 2020 (1 commit)
  Committed by WangXi
- 28 September 2020 (1 commit)
  Committed by Dong Daxiang
  Add a way for users to get and print the final strategy.
- 25 September 2020 (1 commit)
  Committed by WangXi
- 16 September 2020 (1 commit)
  Committed by ShenLiang
  Add adaptivelsgd; TODO: fix the code to avoid the conflict.
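
Adaptive local SGD ("adaptivelsgd") is exposed as another DistributedStrategy switch. A hedged sketch; the key names init_k_steps and begin_step are assumptions taken from typical DistributedStrategy examples.

```python
# Hedged sketch: adaptive local SGD via DistributedStrategy.
# Key names are assumptions; confirm them for your Paddle version.
from paddle.distributed import fleet

strategy = fleet.DistributedStrategy()
strategy.adaptive_localsgd = True
strategy.adaptive_localsgd_configs = {
    "init_k_steps": 1,  # initial number of local steps between synchronizations
    "begin_step": 30,   # global step at which the adaptive schedule starts
}
```
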
- 14 September 2020 (1 commit)
  Committed by ShenLiang
  Remove auto from localsgd.
- 09 September 2020 (1 commit)
  Committed by Dong Daxiang
  Refine launch and the distributed repr string for printing.
- 07 September 2020 (1 commit)
  Committed by Dong Daxiang
  Add the auto parallel L1 implementation (test=develop).
- 04 September 2020 (1 commit)
  Committed by mapingshuo
  Fix docs; update the localsgd doc; fix the fleet dgc/amp doc; fix async configs (test=develop). Co-authored-by: liuyi05 <gavin1332@gmail.com>, WangXi <wangxi16@baidu.com>, seiriosPlus <tangwei12@baidu.com>.
- 29 August 2020 (1 commit)
  Committed by Dong Daxiang
  Fix the API documentation.
- 27 August 2020 (1 commit)
  Committed by Yi Liu
  Modify the timeout value on Windows and Mac (#26690); add the Local SGD algorithm from the referenced paper (test=develop).
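
Local SGD lets each worker run k local steps between parameter synchronizations instead of all-reducing every step. A hedged configuration sketch, assuming the k_steps/begin_step keys commonly shown for localsgd_configs:

```python
# Hedged sketch: local SGD via DistributedStrategy.
# k_steps / begin_step are assumed key names.
from paddle.distributed import fleet

strategy = fleet.DistributedStrategy()
strategy.localsgd = True
strategy.localsgd_configs = {
    "k_steps": 4,      # synchronize parameters every 4 local steps
    "begin_step": 30,  # warm up with synchronous steps before switching over
}
```
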
- 26 August 2020 (1 commit)
  Committed by JZ-LIANG
- 25 August 2020 (1 commit)
  Committed by Dong Daxiang
  Add cuDNN-related strategies to DistributedStrategy.
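
The cuDNN-related strategies mentioned above are plain attributes on DistributedStrategy. A hedged sketch; the attribute names and the workspace unit are assumptions based on this entry.

```python
# Hedged sketch: cuDNN-related knobs on DistributedStrategy.
# Attribute names are assumptions based on this changelog entry.
from paddle.distributed import fleet

strategy = fleet.DistributedStrategy()
strategy.cudnn_exhaustive_search = True             # exhaustively search conv algorithms
strategy.conv_workspace_size_limit = 4000           # cuDNN conv workspace limit (assumed MB)
strategy.cudnn_batchnorm_spatial_persistent = True  # use the persistent batch-norm kernel
```
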
- 24 August 2020 (1 commit)
  Committed by WangXi
- 21 August 2020 (1 commit)
  Committed by Dong Daxiang
  Add documentation for DistributedStrategy.
- 18 August 2020 (1 commit)
  Committed by mapingshuo
  Add features to the fleet 2.0 role_maker and distribute_strategy (test=develop).
- 13 August 2020 (1 commit)
  Committed by Dong Daxiang
  Move paddle.fleet to paddle.distributed.fleet.
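
After this move the public import path changes from paddle.fleet to paddle.distributed.fleet. A minimal sketch of the relocated package; collective mode and a paddle.distributed.launch launcher are assumed for illustration.

```python
# Minimal sketch of the relocated package (new import path).
# Before this commit: import paddle.fleet
# After this commit:  import paddle.distributed.fleet
from paddle.distributed import fleet

# Assumes the process was started by `python -m paddle.distributed.launch ...`
# so that the required environment variables are present.
fleet.init(is_collective=True)
strategy = fleet.DistributedStrategy()
```
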
- 12 August 2020 (1 commit)
  Committed by JZ-LIANG
  Add lamb to the fleet meta optimizer.
- 10 August 2020 (1 commit)
  Committed by tangwei12
  Add paddle.fleet.AsyncOptimizer. Co-authored-by: dongdaxiang <dongdaxiang@baidu.com>.
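
The asynchronous (parameter-server) optimizer referenced here is selected through the a_sync switch on DistributedStrategy rather than being instantiated directly. A hedged sketch; the a_sync_configs key is an assumption.

```python
# Hedged sketch: asynchronous parameter-server training via a_sync.
from paddle.distributed import fleet

strategy = fleet.DistributedStrategy()
strategy.a_sync = True  # asynchronous updates between workers and servers
# A positive k_steps is assumed to switch to GEO-style updates, where each
# worker pushes its delta every k local steps:
# strategy.a_sync_configs = {"k_steps": 100}
```
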
- 05 August 2020 (1 commit)
  Committed by WangXi
  Add dgc to the fleet meta optimizer; remove dgc from optimizer all.
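
Deep gradient compression (dgc) becomes another meta-optimizer switch on DistributedStrategy. A hedged sketch, assuming the rampup_begin_step config key:

```python
# Hedged sketch: deep gradient compression (DGC) via DistributedStrategy.
# rampup_begin_step is an assumed key name.
from paddle.distributed import fleet

strategy = fleet.DistributedStrategy()
strategy.dgc = True
strategy.dgc_configs = {
    "rampup_begin_step": 1252,  # start sparsifying gradients after this step
}
```
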
- 03 August 2020 (1 commit)
  Committed by Dong Daxiang
  Split the meta optimizer files; add graph execution, update two properties in DistributedStrategy, and add unit tests for these features.
- 30 July 2020 (1 commit)
  Committed by mapingshuo
  Add the gradient merge optimizer to the meta optimizers (test=develop).
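
Gradient merge accumulates gradients over k mini-batches and applies a single update, which emulates a larger batch size. A hedged sketch with assumed config keys:

```python
# Hedged sketch: gradient merge (gradient accumulation) via DistributedStrategy.
# k_steps / avg are assumed key names.
from paddle.distributed import fleet

strategy = fleet.DistributedStrategy()
strategy.gradient_merge = True
strategy.gradient_merge_configs = {
    "k_steps": 4,  # accumulate gradients over 4 mini-batches
    "avg": True,   # average rather than sum the accumulated gradients
}
```
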
- 29 July 2020 (1 commit)
  Committed by Dong Daxiang
  Refine the strategy compiler and meta optimizers; rename async to a_sync.
- 28 July 2020 (1 commit)
  Committed by Dong Daxiang
  Add more settings for the distributed strategy. Basically, DistributedStrategy bundles several groups of configuration:
  * BuildStrategy: the same as paddle.fluid.BuildStrategy, but with the distributed arguments moved out of BuildStrategy
  * ExecutionStrategy: the same as paddle.fluid.ExecutionStrategy
  * collective communication configs: nccl_comm_num, hierarchical allreduce, and so on
  * distributed algorithms: async_update (mainly used in PS), lars, lamb, and so on
  (A combined configuration sketch follows below.)
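
Taken together, the entry above describes DistributedStrategy as a single object that bundles build/execution options, communication knobs, and algorithm switches. A hedged end-to-end sketch of that layout; attribute names are assumptions based on this changelog, and the job is assumed to be launched with paddle.distributed.launch.

```python
# Hedged sketch: the four groups of DistributedStrategy settings described
# above, combined in one place. Attribute and key names are assumptions.
import paddle
from paddle.distributed import fleet

strategy = fleet.DistributedStrategy()

# 1) Build options, mirroring paddle.fluid.BuildStrategy with the
#    distributed arguments moved out.
build = paddle.static.BuildStrategy()
build.enable_sequential_execution = True
strategy.build_strategy = build

# 2) Collective-communication configs.
strategy.nccl_comm_num = 2

# 3) Distributed algorithms (typically enable one family per job).
strategy.lars = True          # layer-wise adaptive rate scaling
# strategy.a_sync = True      # async updates, mainly for parameter-server jobs

# 4) Wrap the user optimizer with the strategy when the job starts.
# fleet.init(is_collective=True)
# optimizer = paddle.optimizer.SGD(learning_rate=0.01)
# optimizer = fleet.distributed_optimizer(optimizer, strategy=strategy)
```
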
- 20 July 2020 (1 commit)
  Committed by Dong Daxiang
  Refactor the fleet API under paddle.fleet; update DistributedStrategy.
- 08 July 2020 (1 commit)
  Committed by Dong Daxiang
  test=develop