Commit 0064fcb0
Authored May 10, 2017 by Helin Wang
Parent: 66a3cfe3

update C API

Showing 1 changed file with 34 additions and 15 deletions:

doc/design/cluster_train/pserver_client.md (+34, -15)
```diff
@@ -12,6 +12,13 @@ For an overview of trainer's role, please refer to [distributed training design
 #define PADDLE_ELEMENT_TYPE_FLOAT32 4
 #define PADDLE_ELEMENT_TYPE_FLOAT64 5
 
+typedef struct {
+  char* name;
+  int element_type;
+  void* content;
+  int content_len;
+} paddle_parameter, paddle_gradient;
+
 typedef struct paddle_pserver_client paddle_pserver_client;
 
 /**
```
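For illustration, here is how a trainer-side buffer might be wrapped in the newly added struct. This is only a sketch: the assumption that `content` points at the raw value buffer and that `content_len` is measured in bytes is mine, not something the diff states, and the `wrap_float32` helper is hypothetical.

```c
/* Types and constant copied from the definition added above. */
#define PADDLE_ELEMENT_TYPE_FLOAT32 4

typedef struct {
  char* name;
  int element_type;
  void* content;
  int content_len;
} paddle_parameter, paddle_gradient;

/* Sketch: view an in-memory float buffer as a paddle_parameter.
 * Assumes `content` points at the raw value buffer and `content_len`
 * is its size in bytes; the diff does not spell out buffer ownership. */
static paddle_parameter wrap_float32(char* name, float* data, int num_elems) {
  paddle_parameter p;
  p.name = name;
  p.element_type = PADDLE_ELEMENT_TYPE_FLOAT32;
  p.content = data;
  p.content_len = (int)(num_elems * sizeof(float));
  return p;
}
```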
```diff
@@ -27,33 +34,36 @@ paddle_pserver_client* paddle_new_pserver_client();
 void paddle_pserver_client_release(paddle_pserver_client* client);
 
 /**
- * @brief paddle_begin_init_param begins to initialize parameters
+ * @brief paddle_begin_init_params begins to initialize parameters
  * on parameter servers.
  *
- * paddle_begin_init_param will be called from multiple trainers, only
+ * paddle_begin_init_params will be called from multiple trainers, only
  * one trainer will be selected to initialize the parameters on
  * parameter servers. Other trainers will be blocked until the
  * initialization is done, and they need to get the initialized
- * parameters from parameter servers using @paddle_get_param.
+ * parameters from parameter servers using @paddle_get_params.
  *
- * @return 1 if trainer is selected to initialize parameter
- * servers, otherwise 0.
+ * @param config_proto serialized parameter server configuration
+ * protobuffer.
+ * @return 1 if trainer is selected to initialize parameter servers,
+ * otherwise 0.
  */
-int paddle_begin_init_param(paddle_pserver_client* client);
+int paddle_begin_init_params(paddle_pserver_client* client, const char* config_proto);
 
 /**
  * @brief paddle_init_param initializes the parameter on parameter
  * servers.
  *
  * @param param the parameter to initialize.
  * @return 0 if successful, otherwise -1. On failure the trainer need
  * to restart the entire initialization process starting from
  * paddle_begin_init_param. Or simply exit the program and wait for
  * cluster management system to restart trainer.
  */
-int paddle_init_param(paddle_pserver_client* client, const char* name, int element_type, const void* content);
+int paddle_init_param(paddle_pserver_client* client, paddle_parameter params);
 
 /**
- * @brief paddle_finish_init_param tells parameter servers client has
+ * @brief paddle_finish_init_params tells parameter servers client has
  * sent all parameters to parameter servers as initialization.
  *
  * @return 0 if successful, otherwise -1. On failure the trainer need
```
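The comment block above describes an election-style initialization: exactly one trainer wins `paddle_begin_init_params`, pushes initial values with `paddle_init_param`, and closes with `paddle_finish_init_params`, while the other trainers block and later fetch the result with `paddle_get_params`. Below is a minimal trainer-side sketch of that flow; the `pserver_client.h` header name and the `random_init` helper are hypothetical, and error handling simply exits so the cluster management system can restart the trainer, as the comments suggest.

```c
#include <stdio.h>
#include <stdlib.h>

#include "pserver_client.h"  /* hypothetical header containing the declarations above */

/* Hypothetical helper that fills a parameter with its initial values. */
void random_init(paddle_parameter* param);

/* Sketch of the trainer-side initialization handshake described above. */
void init_parameters(paddle_pserver_client* client,
                     const char* config_proto,  /* serialized pserver configuration */
                     paddle_parameter* params, const char** names, int n) {
  if (paddle_begin_init_params(client, config_proto)) {
    /* This trainer was selected: push initial values for every parameter. */
    for (int i = 0; i < n; i++) {
      random_init(&params[i]);
      if (paddle_init_param(client, params[i]) != 0) {
        /* Per the comments: restart initialization, or simply exit and let
         * the cluster management system restart the trainer. */
        fprintf(stderr, "paddle_init_param failed\n");
        exit(1);
      }
    }
    if (paddle_finish_init_params(client) != 0) exit(1);
  }
  /* Selected or not, every trainer ends up fetching the initialized
   * parameters from the parameter servers. */
  if (paddle_get_params(client, names, params, n) != 0) exit(1);
}
```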
```diff
@@ -61,34 +71,43 @@ int paddle_init_param(paddle_pserver_client* client, const char* name, int eleme
  * paddle_begin_init_param. Or simply exit the program and wait for
  * cluster management system to restart trainer.
  */
-int paddle_finish_init_param(paddle_pserver_client* client);
+int paddle_finish_init_params(paddle_pserver_client* client);
 
 /**
- * @brief paddle_send_grad sends gradients to parameter servers for
+ * @brief paddle_send_grads sends gradients to parameter servers for
  * updating parameters.
  *
+ * @param grads the array of gradients to send.
+ * @param total the total number of gradient inside the gradient array.
+ * @param learning_rate the learning rate for the gradients.
  * @return 0 if successful, otherwise -1.
  */
-int paddle_send_grad(paddle_pserver_client* client, const char* name, int element_type, const void* content);
+int paddle_send_grads(paddle_pserver_client* client, const paddle_gradient* grads, int total, double learning_rate);
 
 /**
- * @brief paddle_set_param sets a parameter on parameter servers.
+ * @brief paddle_set_params sets parameters to parameter servers.
  *
+ * @param params the array of parameters to set to parameter servers.
+ * @param total number of parameters inside the parameter array.
  * @return 0 if successful, otherwise -1.
  */
-int paddle_set_param(paddle_pserver_client* client, const char* name, int element_type, const void* content);
+int paddle_set_params(paddle_pserver_client* client, const paddle_parameter* params, int total);
 
 /**
- * @brief paddle_get_param gets the parameter from parameter servers.
+ * @brief paddle_get_params gets parameters from parameter servers.
  *
+ * @param names the array of names of the parameters to get.
+ * @param dst the destination array of parameters to save to.
+ * @param total the total number of parameters to get.
  * @return 0 if successful, otherwise -1.
  */
-int paddle_get_param(paddle_pserver_client* client, const char* name, void** dst, int* dstLen);
+int paddle_get_params(paddle_pserver_client* client, const char** names, paddle_parameter* dst, int total);
 
 /**
  * @brief paddle_save_model indicates parameters to save the parameter
  * to the given path
  *
  * @param path the path to save parameters.
  * @return 0 if successful, otherwise -1.
  */
 int paddle_save_model(paddle_pserver_client* client, const char* path);
```
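Taken together, the batched calls suggest a training loop of the following shape: pull the current parameters with `paddle_get_params`, compute gradients locally, push them back with `paddle_send_grads` and a learning rate, and finally ask the servers to persist the model with `paddle_save_model` (`paddle_set_params` is the direct-write counterpart and is not needed in this loop). This is a sketch under my own assumptions; the `pserver_client.h` header name, the `compute_gradients` helper, the learning-rate value, and the save path are placeholders rather than part of the design.

```c
#include <stdlib.h>

#include "pserver_client.h"  /* hypothetical header containing the declarations above */

/* Placeholder: run forward/backward on a mini-batch and fill `grads`. */
void compute_gradients(const paddle_parameter* params, int n, paddle_gradient* grads);

/* Sketch of a trainer's main loop against the updated C API. */
void train(paddle_pserver_client* client,
           const char** names, paddle_parameter* params,
           paddle_gradient* grads, int n, int num_steps) {
  for (int step = 0; step < num_steps; step++) {
    /* Pull the freshest parameter values from the parameter servers. */
    if (paddle_get_params(client, names, params, n) != 0) exit(1);

    /* Local computation (placeholder). */
    compute_gradients(params, n, grads);

    /* Push gradients; the servers apply them with the given learning rate. */
    if (paddle_send_grads(client, grads, n, 0.01) != 0) exit(1);
  }

  /* Ask the parameter servers to persist the model (illustrative path). */
  if (paddle_save_model(client, "/tmp/paddle_model") != 0) exit(1);
}
```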