- 16 Sep 2020, 1 commit
  Committed by ShenLiang
  * add adaptivelsgd
  * TODO: fix the code to avoid the conflict.
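The adaptive Local SGD switch referenced above is exposed through DistributedStrategy in released Paddle versions. A minimal sketch, assuming the public 2.x API; the config keys init_k_steps and begin_step come from the released docs and may differ from the snapshot at this commit:

```python
import paddle.distributed.fleet as fleet

strategy = fleet.DistributedStrategy()
# Workers train locally and synchronize every k steps; in adaptive
# mode the interval k is adjusted automatically during training.
strategy.adaptive_localsgd = True
strategy.adaptive_localsgd_configs = {
    "init_k_steps": 1,  # initial local steps between synchronizations
    "begin_step": 30,   # global step at which local SGD takes over
}
```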
- 14 Sep 2020, 1 commit
  Committed by ShenLiang
  * rm auto from localsgd
- 09 Sep 2020, 1 commit
  Committed by Dong Daxiang
  * refine launch and distributed repr string for print
- 25 Aug 2020, 1 commit
  Committed by Dong Daxiang
  * add cudnn related strategies to DistributedStrategy
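In the released 2.x API these cuDNN knobs appear as plain attributes on DistributedStrategy. A sketch, assuming the documented attribute names (conv_workspace_size_limit is in MB per the public docs):

```python
import paddle.distributed.fleet as fleet

strategy = fleet.DistributedStrategy()
# cuDNN autotuning: benchmark conv algorithms and cache the fastest.
strategy.cudnn_exhaustive_search = True
# Upper bound on the cuDNN convolution workspace memory, in MB.
strategy.conv_workspace_size_limit = 1024
# Faster spatial-persistent batch norm kernels (NCHW layouts).
strategy.cudnn_batchnorm_spatial_persistent = True
```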
- 18 Aug 2020, 1 commit
  Committed by mapingshuo
  * add feature to fleet2.0 role_maker, distribute_strategy, test=develop
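A role maker tells fleet which role each process plays (worker or server) and how processes find one another. A minimal sketch using the names that shipped in the 2.x API, where PaddleCloudRoleMaker is the stock implementation:

```python
import paddle.distributed.fleet as fleet

# PaddleCloudRoleMaker reads its topology (trainer id, endpoints,
# worker/server role) from environment variables set by the launcher.
role = fleet.PaddleCloudRoleMaker(is_collective=True)
fleet.init(role)

strategy = fleet.DistributedStrategy()
```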
- 13 Aug 2020, 1 commit
  Committed by Dong Daxiang
  * move paddle.fleet to paddle.distributed.fleet
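The import path change introduced by this commit is the one that shipped in Paddle 2.0:

```python
# before this commit
# import paddle.fleet as fleet

# after this commit (the path that shipped in Paddle 2.0)
import paddle.distributed.fleet as fleet
```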
- 10 Aug 2020, 1 commit
  Committed by tangwei12
  * add paddle.fleet.AsyncOptimizer
  Co-authored-by: dongdaxiang <dongdaxiang@baidu.com>
- 03 Aug 2020, 1 commit
  Committed by Dong Daxiang
  * split meta optimizer files
  * add graph execution in execution, update two properties in DistributedStrategy, unit tests for these features
- 29 Jul 2020, 1 commit
  Committed by Dong Daxiang
  * refine strategy compiler and meta optimizers; rename async to a_sync
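The a_sync spelling survives in the released API (async is a reserved word in Python 3). A sketch of how asynchronous parameter-server training is toggled, assuming the documented 2.x attributes:

```python
import paddle.distributed.fleet as fleet

strategy = fleet.DistributedStrategy()
# Asynchronous parameter-server training ("async" is a Python
# keyword, hence the a_sync spelling).
strategy.a_sync = True
# k_steps > 0 switches to GEO-style training: trainers run k local
# steps before communicating with the parameter servers.
strategy.a_sync_configs = {"k_steps": 100}
```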
- 28 Jul 2020, 1 commit
  Committed by Dong Daxiang
  * add more settings for distributed strategy. Basically, DistributedStrategy has several parts of configuration:
    - BuildStrategy: the same as paddle.fluid.BuildStrategy, but with the distributed arguments moved out of BuildStrategy
    - ExecutionStrategy: the same as paddle.fluid.ExecutionStrategy
    - collective communication configs: nccl_comm_num, hierarchical allreduce, and so on
    - distributed algorithms: async update (mainly used in parameter-server training), lars, lamb, and so on
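A sketch of those four configuration groups as they look in the released 2.x API; the attribute names are taken from the public docs and may differ slightly from this commit's snapshot:

```python
import paddle
import paddle.distributed.fleet as fleet

strategy = fleet.DistributedStrategy()

# 1) Graph-building options, mirroring paddle.fluid.BuildStrategy.
build = paddle.static.BuildStrategy()
build.fuse_elewise_add_act_ops = True
strategy.build_strategy = build

# 2) Execution options, mirroring paddle.fluid.ExecutionStrategy.
execution = paddle.static.ExecutionStrategy()
execution.num_threads = 4
strategy.execution_strategy = execution

# 3) Collective communication configs.
strategy.nccl_comm_num = 2

# 4) Distributed algorithm toggles.
strategy.lars = True
```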
- 20 Jul 2020, 1 commit
  Committed by Dong Daxiang
  * refactor fleet api under paddle.fleet; update DistributedStrategy
- 08 Jul 2020, 1 commit
  Committed by Dong Daxiang
  * test=develop
- 06 Jul 2020, 1 commit
  Committed by Dong Daxiang
  * add paddle.fleet.DistributedStrategy for 2.0
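End to end, the class introduced here configures fleet's distributed_optimizer. A minimal dygraph sketch against the released 2.x API, assuming processes are started with python -m paddle.distributed.launch; names may differ from this first snapshot:

```python
import paddle
import paddle.distributed.fleet as fleet

# Initialize the collective (multi-GPU) environment.
fleet.init(is_collective=True)

strategy = fleet.DistributedStrategy()

model = paddle.nn.Linear(10, 1)
opt = paddle.optimizer.SGD(learning_rate=0.01,
                           parameters=model.parameters())
# The strategy controls which meta optimizers wrap the base one.
opt = fleet.distributed_optimizer(opt, strategy=strategy)
model = fleet.distributed_model(model)
```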