diff --git a/doc/design/distributed_lookup_table_design.md b/doc/design/distributed_lookup_table_design.md
index d33502759350bbbc133f2a68f55f8817cd2316b7..a09f2818c888397b07fc7d09ecd20056f4176982 100644
--- a/doc/design/distributed_lookup_table_design.md
+++ b/doc/design/distributed_lookup_table_design.md
@@ -59,7 +59,9 @@
 memcached, as the storage service, and we run the optimization
 algorithm on parameter servers of PaddlePaddle. The following figure
 illustrates the training process.
-![Alt text](https://g.gravizo.com/svg?
+
+
+
 
 Each trainer runs the forward and backward passes using their local
 data: