diff --git a/doc/design/distributed_lookup_table_design.md b/doc/design/distributed_lookup_table_design.md
index 92f2e8f848aff8cface6a8dd8a1ae938ece72efa..d33502759350bbbc133f2a68f55f8817cd2316b7 100644
--- a/doc/design/distributed_lookup_table_design.md
+++ b/doc/design/distributed_lookup_table_design.md
@@ -59,7 +59,7 @@
 memcached, as the storage service, and we run the optimization algorithm on
 parameter servers of PaddlePaddle. The following figure illustrates the
 training process.
 
-
+![lookup table training](./src/lookup_table_training.png)
 
 Each trainer runs the forward and backward passes using their local data:
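
For context on the training flow the figure depicts, and which the text after this hunk introduces, below is a minimal Python sketch of one trainer's step against the storage service. It is an illustration only, not code from the PR or the design doc: `MockStorage`, its `pull`/`push` methods, and `train_step` are hypothetical stand-ins for the memcached-like storage service and the PaddlePaddle parameter servers that the doc names, and the forward/backward math is a toy model.

```python
# A minimal sketch of one trainer's loop. `MockStorage` stands in for the
# distributed storage service (e.g. memcached); in the real design the
# optimizer runs on PaddlePaddle parameter servers, not in the trainer.
import numpy as np

EMB_DIM = 8  # illustrative embedding width

class MockStorage:
    """Hypothetical key-value client for the embedding table."""
    def __init__(self):
        self.table = {}  # row id -> embedding vector

    def pull(self, ids):
        # Fetch only the rows this batch needs; rows are initialized
        # lazily because the full table is too large to pre-allocate.
        return {i: self.table.setdefault(i, np.zeros(EMB_DIM)) for i in ids}

    def push(self, grads):
        # Stand-in for the optimization step on the parameter servers;
        # plain SGD is applied here purely for illustration.
        for i, g in grads.items():
            self.table[i] -= 0.01 * g

def train_step(storage, batch_ids, labels):
    rows = storage.pull(set(batch_ids))          # sparse pull of needed rows
    z = np.stack([rows[i] for i in batch_ids])   # the lookup-table step
    y = z.sum(axis=1)                            # toy forward pass
    dy = 2.0 * (y - labels) / len(batch_ids)     # gradient of a toy MSE loss
    grads = {}
    for dyi, i in zip(dy, batch_ids):            # backward: sparse gradients,
        grads[i] = grads.get(i, 0) + dyi * np.ones(EMB_DIM)  # summed per row id
    storage.push(grads)                          # sparse push

storage = MockStorage()
train_step(storage, batch_ids=[3, 7, 3], labels=np.array([1.0, 0.0, 1.0]))
```

The sparse pull/push is the point of the sketch: only the rows touched by the current batch move over the network, which is what lets a table far larger than any single machine's memory be trained at all.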