# Design Doc: Remote Parameter Updater for Cluster Training
For an overview of distributed training, please refer to the [distributed training design doc](README.md). In this design doc, we discuss the parameter updater, which uses the parameter server client library ([The Client Library of Parameter Server Design Doc](pserver_client.md)) to manage and update parameters.
## Parameter Updater
The parameter updater is used by the trainer to manage and update parameters. There are mainly two kinds of parameter updaters: local and remote. Since this design is for cluster training, we only discuss the remote parameter updater here.
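To make the distinction concrete, the sketch below shows one plausible shape for the interface both kinds of updaters would share. The method names (`init`, `update`, `start_batch`, `finish_batch`) are assumptions for illustration, not the final API.

```python
# A minimal sketch of the parameter updater interface shared by the
# local and remote variants. Method names are illustrative assumptions.
from abc import ABC, abstractmethod


class ParameterUpdater(ABC):
    @abstractmethod
    def init(self, parameters):
        """Prepare the updater for the given model parameters."""

    @abstractmethod
    def update(self, parameters):
        """Apply the gradients computed in the last backward pass."""

    def start_batch(self, batch_size):
        """Called by the trainer before each mini-batch."""

    def finish_batch(self, cost):
        """Called by the trainer after each mini-batch."""
```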
### Remote Parameter Updater
The remote parameter updater manages parameters on remote parameter servers through the client library that communicates with the pservers ([The Client Library of Parameter Server Design Doc](pserver_client.md)).
In the PaddlePaddle Python V2 API, the trainer is implemented in Python; it holds an instance of a parameter updater and calls its methods directly. In this design, we will also expose the API of `RemoteParameterUpdater` to Python through SWIG.
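For example, the Python trainer's inner loop might drive the updater as sketched below; `gradient_machine`, `reader`, and the exact method names are assumptions standing in for the SWIG-exposed core, not the real bindings.

```python
# Sketch of the trainer-side loop (assumed names, for illustration only).
# `updater` is the SWIG-exposed RemoteParameterUpdater from this design;
# `gradient_machine` stands in for the C++ core that computes gradients.

def train(gradient_machine, updater, reader, num_passes):
    # Connect to the pservers and initialize parameters once, up front.
    updater.init(gradient_machine.getParameters())
    for pass_id in range(num_passes):
        for batch in reader():
            updater.start_batch(len(batch))
            cost = gradient_machine.forward_backward(batch)
            # For the remote updater, this pushes gradients to the
            # pservers and pulls the updated parameters back.
            updater.update(gradient_machine.getParameters())
            updater.finish_batch(cost)
```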
#### Sparse Remote Parameter Updater
Since we will only implement dense parameter management for now, the mechanism for sparse parameters will be discussed in the next stage.
<liclass="toctree-l2"><aclass="reference internal"href="../../getstarted/build_and_install/index_en.html">Install and Build</a><ul>
<liclass="toctree-l3"><aclass="reference internal"href="../../getstarted/build_and_install/docker_install_en.html">PaddlePaddle in Docker Containers</a></li>
<liclass="toctree-l2"><aclass="reference internal"href="../../howto/usage/k8s/k8s_en.html">Paddle On Kubernetes</a></li>
<liclass="toctree-l2"><aclass="reference internal"href="../../howto/usage/k8s/k8s_aws_en.html">Distributed PaddlePaddle Training on AWS with Kubernetes</a></li>
<liclass="toctree-l2"><aclass="reference internal"href="../../howto/dev/new_layer_en.html">Write New Layers</a></li>
<spanid="design-doc-remote-parameter-updater-for-cluster-train"></span><h1>Design Doc: Remote Parameter Updater for Cluster Train<aclass="headerlink"href="#design-doc-remote-parameter-updater-for-cluster-train"title="Permalink to this headline">¶</a></h1>
<p>For an overview of distribute training, please refer to <aclass="reference internal"href="README.html"><spanclass="doc">distributed training design doc</span></a>. In this design doc, we will discuss the parameter updater that will use parameter server cclient <aclass="reference internal"href="pserver_client.html"><spanclass="doc">The Client Library of Parameter Server Design Doc</span></a> to manage and update parameters.</p>
<divclass="section"id="parameter-updater">
<spanid="parameter-updater"></span><h2>Parameter Updater<aclass="headerlink"href="#parameter-updater"title="Permalink to this headline">¶</a></h2>
<p>Parameter Updater is used by trainer to manage and update parameter, there are mainly two kind of parameter updater: local and remote, since this design is for cluster train, we will only discuss remote parameter updater here.</p>
<divclass="section"id="remote-parameter-updater">
<spanid="remote-parameter-updater"></span><h3>Remote Parameter Updater<aclass="headerlink"href="#remote-parameter-updater"title="Permalink to this headline">¶</a></h3>
<p>Remote Parameter Updater manage parameters through remote parameter server with the client that communicate with pserver(<aclass="reference internal"href="pserver_client.html"><spanclass="doc">The Client Library of Parameter Server Design Doc</span></a>)</p>
<p>In PaddlePaddle Python V2 API, trainer is implemented in python, and the trainer will hold a instance of parameter updater and call it’s functions directly. In this design, we will also expose the api of RemoteParameterUpdater to python with swig.</p>
<spanid="sparse-remote-parameter-updater"></span><h4>Sparse Remote Parameter Updater<aclass="headerlink"href="#sparse-remote-parameter-updater"title="Permalink to this headline">¶</a></h4>
<p>Since we will only implement dense parameter management new, the mechanism for sparse parameter will be discussed in next stage.</p>
</div>
</div>
<divclass="section"id="interface-design">
<spanid="interface-design"></span><h3>Interface Design<aclass="headerlink"href="#interface-design"title="Permalink to this headline">¶</a></h3>