- 10 Sep 2020, 1 commit

By 123malin
* parameter_server_optimizer support auto_strategy
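A minimal sketch of what "auto strategy" support looks like from the user side, assuming the `auto` switch on `DistributedStrategy` (the exact behavior of the parameter-server meta optimizer is internal to fleet):

```python
import paddle.distributed.fleet as fleet

strategy = fleet.DistributedStrategy()
# With auto enabled, meta optimizers such as the parameter-server
# optimizer are expected to fill in suitable settings themselves.
strategy.auto = True
```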

- 07 Sep 2020, 1 commit

By Dong Daxiang
* add auto parallel L1 implementation test=develop

- 21 Aug 2020, 1 commit

By Dong Daxiang
* consider the combination of different strategies to work together

- 18 Aug 2020, 1 commit

By mapingshuo
* add feature to fleet2.0 role_maker, distribute_strategy, test=develop
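A hedged sketch of the fleet 2.0 pattern this commit extends: a role maker describes the process's role in the job, and a `DistributedStrategy` carries the training configuration (`PaddleCloudRoleMaker` reads its role from environment variables set by the launcher):

```python
import paddle.distributed.fleet as fleet

# The role maker decides whether this process is a worker or a server.
role = fleet.PaddleCloudRoleMaker()
fleet.init(role)

strategy = fleet.DistributedStrategy()
```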

- 13 Aug 2020, 1 commit

By Dong Daxiang
* move paddle.fleet to paddle.distributed.fleet
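After this move the import path changes accordingly (a two-line sketch; `paddle.fleet` no longer exists as a top-level package):

```python
# old: import paddle.fleet as fleet
import paddle.distributed.fleet as fleet
```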

- 11 Aug 2020, 1 commit

By pangyoki
* Directory migration, test=develop
* Change imperative from paddle init to paddle framework, test=develop
* Fixed jit bug, test=develop
* default static mode, test=develop
* fixed format and create parameter belongs to framework, test=develop
* Fixed import package, test=develop
* fix __init__ format, test=develop
* fixed alias problem
* fixed paddle.enable_imperative problems, test=develop
* Add unittest
* delete install_check comment
* Fixed unittest timeout
* fixed unittest error
* move Program default_xx_program to static package
* optimize unittest method
* fixed framework __init__ format
* fixed jit path
* delete alias
* move jit to paddle
* Fixed unittest format
* fixed paddle.default_main_program
* Fixed save load API in paddle __init__.py
* fixed ci paddle.imperative.to_variable

- 03 Aug 2020, 1 commit

By Dong Daxiang
* split meta optimizer files
* add graph execution; update two properties in DistributedStrategy; unit tests for these features

- 30 Jul 2020, 1 commit

By tangwei12
* Integrated Trainer of Parameter Server (API adds `fluid.contrib.layers.sparse_embedding` only) (#22957)
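A minimal sketch of the newly exposed layer in the fluid static-graph style of that era; the vocabulary size and embedding dimension below are illustrative placeholders:

```python
import paddle
import paddle.fluid as fluid

paddle.enable_static()  # fluid layers are static-graph APIs

ids = fluid.data(name="ids", shape=[None, 1], dtype="int64")
# Backed by a distributed sparse table on the parameter server,
# rather than a dense embedding parameter.
emb = fluid.contrib.layers.sparse_embedding(
    input=ids,
    size=[1000000, 64])  # [vocab_size, emb_dim]
```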

- 29 Jul 2020, 1 commit

By Dong Daxiang
* refine strategy compiler and meta optimizers; make `async` into `a_sync`
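The likely motivation for the rename is that `async` is a reserved keyword in Python 3, so the option is exposed as `a_sync` (a short sketch against the fleet API):

```python
import paddle.distributed.fleet as fleet

strategy = fleet.DistributedStrategy()
strategy.a_sync = True  # asynchronous parameter-server training mode
```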

- 28 Jul 2020, 1 commit

By Dong Daxiang
* add more settings for distributed strategy. Basically, DistributedStrategy has several parts of configuration (see the sketch after this list):
  - BuildStrategy: the same as paddle.fluid.BuildStrategy, but the distributed arguments are moved out of BuildStrategy
  - ExecutionStrategy: the same as paddle.fluid.ExecutionStrategy
  - collective communication configs: nccl_comm_num, hierarchical allreduce and so on
  - distributed algorithms: async_update (mainly used in PS), lars, lamb and so on
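A hedged sketch touching each of the four groups; the property names follow the public fleet `DistributedStrategy` API, and the values are illustrative, not recommendations:

```python
import paddle
import paddle.distributed.fleet as fleet

strategy = fleet.DistributedStrategy()

# 1. BuildStrategy, mirroring its fluid/static counterpart
build = paddle.static.BuildStrategy()
build.fuse_elewise_add_act_ops = True
strategy.build_strategy = build

# 2. ExecutionStrategy
exe = paddle.static.ExecutionStrategy()
exe.num_threads = 4
strategy.execution_strategy = exe

# 3. collective communication configs
strategy.nccl_comm_num = 2
strategy.use_hierarchical_allreduce = True

# 4. distributed algorithms
strategy.a_sync = True  # async update, mainly used in parameter server
strategy.lamb = True
```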

- 23 Jul 2020, 1 commit

By Dong Daxiang
* fix gen nccl id bug

- 20 Jul 2020, 1 commit

By Dong Daxiang
* refactor fleet api under paddle.fleet; update DistributedStrategy
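A hedged sketch of the refactored usage pattern the later commits build on: initialize fleet, create a `DistributedStrategy`, and wrap a regular optimizer with `distributed_optimizer` (shown with the import path the package ended up at after the 13 Aug move):

```python
import paddle
import paddle.distributed.fleet as fleet

fleet.init(is_collective=True)

strategy = fleet.DistributedStrategy()
opt = paddle.optimizer.SGD(learning_rate=0.01)
opt = fleet.distributed_optimizer(opt, strategy=strategy)
```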