Commit 2ac3093f authored by Dong Daxiang, committed by GitHub

Update README.md

Parent: ad92ed5b
@@ -11,7 +11,7 @@ An easy-to-use Machine Learning Model Inference Service Deployment Tool
 Paddle Serving helps deep learning developers deploy an online inference service without much effort. Currently, Paddle Serving supports deep learning models trained with [Paddle](https://github.com/PaddlePaddle/Paddle), although it is easy to integrate other deep learning frameworks' model inference engines.
 ## Key Features
-- Integrate with the Paddle training pipeline seamlessly; most Paddle models can be deployed with **one line command**.
+- Integrate with the Paddle training pipeline seamlessly; most Paddle models can be deployed with <font color="#0000dd">one line command</font><br />.
 - **Industrial serving features** supported, such as model management, online loading, online A/B testing, etc.
 - **Distributed Key-Value indexing** supported, which is especially useful for large-scale sparse features as model inputs.
 - **Highly concurrent and efficient communication** between clients and servers.
...
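For context on the "one line command" deployment claim touched by this diff, the sketch below shows what such an invocation typically looks like in released Paddle Serving versions; the module path, model directory name, and flags are assumptions drawn from later Paddle Serving documentation and are not part of this commit.

```bash
# Hypothetical one-line deployment of an exported Paddle model as an RPC service.
# Module name, model directory, and flags are assumed from later Paddle Serving docs.
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9292
```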