Commit e75fd183 authored by helinwang, committed by GitHub

Merge pull request #390 from helinwang/doc

Update link in mnist-client/README.md
@@ -16,7 +16,7 @@ this
 ### Start the Inference Server
 The inference server can be used to inference any model trained by
-PaddlePaddle. Please see [here](TODO) for more details.
+PaddlePaddle. Please see [here](../serve/README.md) for more details.
 1. Download the MNIST inference model topology and parameters to the
    current working directory.
......
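
The download step shown in the diff is just a fetch of the model topology and parameter files into the current working directory. A minimal sketch in Python, assuming a hypothetical download URL and hypothetical file names (`inference_topology.pkl`, `param.tar`); the actual locations are listed in mnist-client/README.md:

```python
# Sketch of the "download the MNIST inference model topology and parameters"
# step. BASE_URL and the file names are assumptions for illustration only;
# the real URLs are given in mnist-client/README.md.
import os
import urllib.request

BASE_URL = "https://example.com/paddle/mnist"    # hypothetical host
FILES = ["inference_topology.pkl", "param.tar"]  # hypothetical file names

for name in FILES:
    dest = os.path.join(os.getcwd(), name)       # save to the current working directory
    print("downloading %s -> %s" % (name, dest))
    urllib.request.urlretrieve("%s/%s" % (BASE_URL, name), dest)
```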