diff --git a/doc/fluid/design/dist_train/prefetch_parameter.md b/doc/fluid/design/dist_train/prefetch_parameter.md
index 952d2bada9238b0668225893d661c4856464b35a..b3e2210fdff26b06cb4fc5e6f6ffdef3ab8b20f8 100644
--- a/doc/fluid/design/dist_train/prefetch_parameter.md
+++ b/doc/fluid/design/dist_train/prefetch_parameter.md
@@ -16,13 +16,13 @@ Prior to reading this design, it would be useful for the reader to make themselv
 
 The execution of `lookup local table` is as follows:
 
-
+
 
 In some cases, the parameter (`weight`) may be too large to fit in one trainer's memory, for
 example when it holds 10 billion features. We therefore need to partition this parameter and
 pre-fetch it at the beginning of each mini-batch; we call this `lookup remote table`:
 
-
+
 
 The processing flow of `lookup remote table` is as follows:
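
The partition-and-prefetch idea above can be sketched as follows. This is an illustrative sketch only, not the actual Fluid API: the names `split_ids`, `prefetch`, and the `fetch_rows` RPC stand-in are assumptions, and the sharding rule (row id mod number of servers) is one simple choice for distributing the table.

```python
def split_ids(ids, num_servers):
    """Group lookup ids by the server that owns each row (mod sharding)."""
    buckets = [[] for _ in range(num_servers)]
    for i in ids:
        buckets[i % num_servers].append(i)
    return buckets


def prefetch(ids, num_servers, fetch_rows):
    """Prefetch the embedding rows needed by one mini-batch.

    `fetch_rows(server, ids)` is a hypothetical stand-in for the RPC
    that asks one parameter server for the rows it owns.
    """
    rows = {}
    for server, owned in enumerate(split_ids(ids, num_servers)):
        if owned:
            rows.update(fetch_rows(server, owned))
    # Reassemble the rows in the order the mini-batch requested them.
    return [rows[i] for i in ids]


# Toy setup: two "servers", server s owning every id with id % 2 == s.
table = {i: [float(i)] * 3 for i in range(10)}
fake_rpc = lambda server, ids: {i: table[i] for i in ids}
print(prefetch([3, 0, 7], 2, fake_rpc))
# -> [[3.0, 3.0, 3.0], [0.0, 0.0, 0.0], [7.0, 7.0, 7.0]]
```

Each trainer only transfers the rows its mini-batch actually touches, which is what makes a 10-billion-feature table usable without storing it locally.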