From e18f7de93524303e41888cc099020831f74a4092 Mon Sep 17 00:00:00 2001
From: qiaolongfei
Date: Fri, 6 Jul 2018 13:39:30 +0800
Subject: [PATCH] typo

---
 doc/fluid/design/dist_train/distributed_lookup_table_design.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/doc/fluid/design/dist_train/distributed_lookup_table_design.md b/doc/fluid/design/dist_train/distributed_lookup_table_design.md
index 20ed6f31d..3d2e9ef19 100644
--- a/doc/fluid/design/dist_train/distributed_lookup_table_design.md
+++ b/doc/fluid/design/dist_train/distributed_lookup_table_design.md
@@ -67,7 +67,7 @@ operator:
 ![lookup table training](./src/lookup_table_training.png)
 
 At the beginning of training, paddle only malloc the memory for the lookup table at parameter server side, the id and it's value will not be initialized. During training, when a parameter server received an Id, if it is already in the lookup table, it will return the existing parameter, if the id does not exist, paddle will add it into the lookup table and initialize the value for it.
 
-### Problem3: parameter load and save
+### Problem 3: parameter load and save
 
 For common parameters, paddle use trainer to save and load them. But for distribute lookup table, trainer can not do this because it's large size.
--
GitLab
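
The parameter-server behavior quoted in the hunk's context (allocate the table up front, then create and initialize a row only the first time an id is looked up) can be sketched roughly as below. This is an illustrative sketch only; the class and parameter names are hypothetical and do not correspond to PaddlePaddle's actual API:

```python
import random


class SparseLookupTable:
    """Hypothetical sketch of a parameter-server-side lookup table:
    rows are created and initialized only when an id is first seen."""

    def __init__(self, dim, init_scale=0.01):
        self.dim = dim
        self.init_scale = init_scale
        self.rows = {}  # id -> parameter vector; empty at the start of training

    def lookup(self, id):
        # Existing id: return the stored parameter.
        # Unseen id: add a row and initialize its value, then return it.
        if id not in self.rows:
            self.rows[id] = [
                random.uniform(-self.init_scale, self.init_scale)
                for _ in range(self.dim)
            ]
        return self.rows[id]


table = SparseLookupTable(dim=4)
v1 = table.lookup(42)  # first access: row is created and initialized
v2 = table.lookup(42)  # second access: the same row is returned
assert v1 is v2
```

Because rows exist only for ids actually seen during training, the table can grow far larger than a dense parameter, which is why (as the last context line notes) the trainer cannot simply save and load it the way it does common parameters.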