Commit 8746e2f6 authored by Helin Wang

update link

Parent d598fe25
...
@@ -16,7 +16,7 @@ this
 ### Start the Inference Server
 The inference server can be used to inference any model trained by
-PaddlePaddle. Please see [here](TODO) for more details.
+PaddlePaddle. Please see [here](../serve/README.md) for more details.
 1. Download the MNIST inference model topology and parameters to the
    current working directory.
...
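The hunk above only updates a placeholder link, but the surrounding README describes a concrete workflow: download the MNIST inference model topology and parameters into the current working directory, then query the running inference server. Below is a minimal Python sketch of that workflow; the download URLs, server port, endpoint path, and JSON payload keys are illustrative assumptions, not taken from this commit -- consult ../serve/README.md for the actual endpoint and request format.

```python
# Minimal sketch of the README workflow. All URLs, the port (8000) and the
# {"img": [...]} payload shape are placeholders, not documented in this diff.
import json
import urllib.request

# Step 1 (from the README): fetch the MNIST inference model topology and
# parameters into the current working directory. URLs are hypothetical.
MODEL_FILES = {
    "inference_topology.pkl": "https://example.org/mnist/inference_topology.pkl",
    "param.tar": "https://example.org/mnist/param.tar",
}
for filename, url in MODEL_FILES.items():
    urllib.request.urlretrieve(url, filename)

# Step 2: send a flattened 28x28 grayscale image to the running inference
# server and print the returned prediction. Endpoint and payload keys are
# assumptions for illustration only.
image = [0.0] * (28 * 28)  # dummy all-black digit image
request = urllib.request.Request(
    "http://localhost:8000/",
    data=json.dumps({"img": image}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    print(json.loads(response.read().decode("utf-8")))
```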