diff --git a/develop/doc/_sources/design/refactor/distributed_architecture.md.txt b/develop/doc/_sources/design/refactor/distributed_architecture.md.txt
index 2b4f921ae93c3b443ed62a28b1fa9fbda14f73ab..d9fe7d6bbb0eeb73fcdca3ee749a4f10bcdda682 100644
--- a/develop/doc/_sources/design/refactor/distributed_architecture.md.txt
+++ b/develop/doc/_sources/design/refactor/distributed_architecture.md.txt
@@ -53,7 +53,7 @@ The IR for PaddlePaddle after refactoring is called a `Block`, it specifies the
 
 The user can not directly specify the parameter update rule for the parameter server in the Python module, since the parameter server does not use the same computation definition as the trainer. Instead, the update rule is baked inside the parameter server. The user can not specify the update rule explicitly.
 
 This could be fixed by making the parameter server run the same computation definition as the trainer (the user's Python module). For a detailed explanation, refer to this document -
-[Design Doc: Operation Graph Based Parameter Server](./dist_train.md)
+[Design Doc: Operation Graph Based Parameter Server](./parameter_server.md)
 
 ## Distributed Training Architecture
diff --git a/develop/doc/design/refactor/distributed_architecture.html b/develop/doc/design/refactor/distributed_architecture.html
index 6533e3ca23b80764455576d4e7f08ae6ff0b2ec2..2522d87b98892b96ef2949da22ad7c5cae105da9 100644
--- a/develop/doc/design/refactor/distributed_architecture.html
+++ b/develop/doc/design/refactor/distributed_architecture.html
@@ -246,7 +246,7 @@ computation is only specified in Python code which sits outside of PaddlePaddle,
 The user can not directly specify the parameter update rule for the parameter server in the Python module, since the parameter server does not use the same computation definition as the trainer. Instead, the update rule is baked inside the parameter server. The user can not specify the update rule explicitly.
 This could be fixed by making the parameter server run the same computation definition as the trainer (the user’s Python module). For a detailed explanation, refer to this document -
-Design Doc: Operation Graph Based Parameter Server
+Design Doc: Operation Graph Based Parameter Server
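
The patched paragraph argues that the update rule should not be baked into the parameter server, but instead expressed as the same kind of computation the trainer runs. A minimal Python sketch of that idea follows; all names here (`sgd_update`, `ParameterServer`, `push_gradients`) are hypothetical illustrations, not PaddlePaddle's actual API:

```python
# Hypothetical sketch: the update rule is ordinary user-supplied
# computation, and the parameter server merely executes it, rather
# than hard-coding SGD (or any other rule) inside the server.

def sgd_update(params, grads, lr):
    """Plain SGD expressed as user code: p <- p - lr * g."""
    return [p - lr * g for p, g in zip(params, grads)]

class ParameterServer:
    """Runs whatever update program the user's Python module supplies."""
    def __init__(self, params, update_rule):
        self.params = params
        self.update_rule = update_rule  # user-specified, not baked in

    def push_gradients(self, grads, lr):
        # Apply the user's rule; swapping in momentum or Adam would
        # require no change to the server itself.
        self.params = self.update_rule(self.params, grads, lr)
        return self.params

ps = ParameterServer([1.0, 2.0], sgd_update)
print(ps.push_gradients([1.0, 1.0], lr=0.25))  # [0.75, 1.75]
```

The point of the design: because the server executes a computation definition instead of a fixed built-in rule, the user's Python module can specify the update rule explicitly, which is exactly what the linked parameter-server design doc elaborates.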