- 05 Aug 2020, 1 commit
  Committed by WangXi: Add dgc to fleet meta optimizer, rm dgc from optimizer all
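DGC (Deep Gradient Compression) cuts communication cost by sending only the largest-magnitude gradient entries each step and accumulating the remainder locally as a residual for later rounds. A minimal plain-Python sketch of the top-k sparsification idea (illustrative only; `topk_sparsify` is a hypothetical helper, and the real meta optimizer also applies momentum correction, which is omitted here):

```python
import heapq

def topk_sparsify(grad, ratio=0.25):
    """Keep the largest-magnitude `ratio` fraction of gradient entries
    for communication; zero the rest and return them as a local
    residual to be added back into future gradients."""
    k = max(1, int(len(grad) * ratio))
    # indices of the k entries with the largest absolute value
    idx = heapq.nlargest(k, range(len(grad)), key=lambda i: abs(grad[i]))
    sparse = [0.0] * len(grad)
    residual = list(grad)
    for i in idx:
        sparse[i] = grad[i]   # these entries are sent over the network
        residual[i] = 0.0     # sent entries leave the local residual
    return sparse, residual

sent, kept = topk_sparsify([0.1, -0.9, 0.05, 0.7], ratio=0.5)
```

With `ratio=0.5`, only the two largest-magnitude entries are communicated; the small ones stay in `kept` and would be merged into the next step's gradient.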
 
- 03 Aug 2020, 1 commit
  Committed by Dong Daxiang:
  * split meta optimizer files
  * add graph execution in execution, update two properties in DistributedStrategy, unit tests for these features
 
- 30 Jul 2020, 1 commit
  Committed by mapingshuo: add gradient Merge optimizer to meta, test=develop
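Gradient Merge accumulates gradients over k micro-batches and applies a single averaged update, simulating a larger effective batch size without extra memory for activations. A hypothetical plain-Python sketch of the idea (the real meta optimizer rewrites the training program rather than wrapping it in a class like this):

```python
class GradientMerge:
    """Accumulate micro-batch gradients and release one averaged
    gradient every k_steps; return None on the in-between steps,
    where the optimizer update is skipped."""
    def __init__(self, k_steps):
        self.k_steps = k_steps
        self.acc = None
        self.step = 0

    def push(self, grad):
        if self.acc is None:
            self.acc = [0.0] * len(grad)
        self.acc = [a + g for a, g in zip(self.acc, grad)]
        self.step += 1
        if self.step % self.k_steps == 0:
            merged = [a / self.k_steps for a in self.acc]  # averaged update
            self.acc = [0.0] * len(grad)                   # reset accumulator
            return merged
        return None  # keep accumulating; no optimizer step this round

gm = GradientMerge(k_steps=2)
first = gm.push([1.0, 2.0])   # accumulating, returns None
second = gm.push([3.0, 4.0])  # k-th step, returns the averaged gradient
```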
 
- 29 Jul 2020, 1 commit
  Committed by Dong Daxiang: refine strategy compiler and meta optimizers, make async as a_sync
 
- 28 Jul 2020, 1 commit
  Committed by Dong Daxiang: add more settings for distributed strategy
  Basically, DistributedStrategy has several parts of configurations:
  - BuildStrategy: the same as paddle.fluid.BuildStrategy, but the distributed arguments are moved out of BuildStrategy
  - ExecutionStrategy: the same as paddle.fluid.ExecutionStrategy
  - collective communication configs: nccl_comm_num, hierarchical allreduce and so on
  - distributed algorithms: async_update (mainly used in PS), lars, lamb and so on
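The four configuration groups listed in that commit can be pictured as a single strategy object with nested sub-configs and flat switches. This is an illustrative plain-Python model, not the real DistributedStrategy API; every field name below is an assumption except the ones the commit message itself names (nccl_comm_num, hierarchical allreduce, async_update, lars, lamb):

```python
from dataclasses import dataclass, field

@dataclass
class BuildStrategyCfg:
    # stands in for paddle.fluid.BuildStrategy (distributed args moved out)
    fuse_all_reduce_ops: bool = False

@dataclass
class ExecutionStrategyCfg:
    # stands in for paddle.fluid.ExecutionStrategy
    num_threads: int = 1

@dataclass
class DistributedStrategySketch:
    """Hypothetical model of the four configuration groups."""
    build_strategy: BuildStrategyCfg = field(default_factory=BuildStrategyCfg)
    execution_strategy: ExecutionStrategyCfg = field(default_factory=ExecutionStrategyCfg)
    # collective communication configs
    nccl_comm_num: int = 1
    hierarchical_allreduce: bool = False
    # distributed algorithm switches
    async_update: bool = False  # mainly used in parameter-server mode
    lars: bool = False
    lamb: bool = False

strategy = DistributedStrategySketch(nccl_comm_num=2, lars=True)
```

Grouping the knobs this way keeps framework-level build/execution options separate from communication tuning and from the choice of distributed algorithm, which is the split the commit describes.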
 
- 20 Jul 2020, 1 commit
  Committed by Dong Daxiang: refactor fleet api under paddle.fleet, update DistributedStrategy
 
- 08 Jul 2020, 1 commit
  Committed by Dong Daxiang: test=develop
 
- 06 Jul 2020, 1 commit
  Committed by Dong Daxiang: add paddle.fleet.DistributedStrategy for 2.0
 