# Paddle Serving
An easy-to-use Machine Learning Model Inference Service Deployment Tool
[![Release](https://img.shields.io/badge/Release-0.0.3-yellowgreen)](Release)
[![Issues](https://img.shields.io/github/issues/PaddlePaddle/Serving)](Issues)
[![License](https://img.shields.io/github/license/PaddlePaddle/Serving)](LICENSE)
[中文](./doc/README_CN.md)
## Motivation
Paddle Serving helps deep learning developers deploy an online inference service without much effort. Currently, Paddle Serving supports deep learning models trained with [Paddle](https://github.com/PaddlePaddle/Paddle), although it is straightforward to integrate another deep learning framework's model inference engine.
## Key Features
- **Seamless integration** with the Paddle training pipeline; most Paddle models can be deployed with a single command.
- **Industrial serving features** supported, such as model management, online loading, and online A/B testing.
- **Distributed Key-Value indexing** supported, which is especially useful for large-scale sparse features as model inputs.
- **Highly concurrent and efficient communication** between clients and servers.
- **Multiple programming languages** supported on the client side, such as Golang, C++, and Python.
## Installation
```shell
pip install paddle-serving-client
pip install paddle-serving-server
```
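After installation, a quick sanity check is to confirm the client package is importable (an optional step, not part of the official install instructions):

```python
# Optional sanity check: confirm the client package can be found by Python.
import importlib.util

spec = importlib.util.find_spec("paddle_serving_client")
print("paddle_serving_client importable:", spec is not None)
```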
## Quick Start Example
```shell
wget --no-check-certificate https://paddle-serving.bj.bcebos.com/uci_housing.tar.gz
tar -xzf uci_housing.tar.gz
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9292
```
### Python Client Request
```python
from paddle_serving_client import Client
client = Client()
client.load_client_config("uci_housing_client")
client.connect(["127.0.0.1:9292"])
data = [0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0727,
        -0.1583, -0.0584, 0.6283, 0.4919, 0.1856, 0.0795, -0.0332]
fetch_map = client.predict(feed={"x": data}, fetch=["price"])
print(fetch_map)
```
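For reference, several input rows can be grouped into a feed batch by building one feed dict per sample; this sketch only assembles the batch structure locally (the second sample and the list-of-dicts batching pattern are illustrative assumptions, not from the official example):

```python
# Sketch (assumption): batching several 13-dimensional input rows into a
# list of feed dicts, each mapping the feed variable name "x" to one sample.
samples = [
    [0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0727,
     -0.1583, -0.0584, 0.6283, 0.4919, 0.1856, 0.0795, -0.0332],
    [0.0100, -0.1000, 0.2000, -0.0500, 0.0500, -0.0700,
     -0.1500, -0.0500, 0.6000, 0.4500, 0.1800, 0.0700, -0.0300],
]
feed_batch = [{"x": row} for row in samples]
print(len(feed_batch), len(feed_batch[0]["x"]))  # → 2 13
```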
### Documentation
- [Design Doc (Chinese)](doc/DESIGN.md)
- [How to configure Serving native operators on the server side](doc/SERVER_OP.md)
- [How to develop a new Serving operator](doc/OPERATOR.md)
- [Client API for other programming languages](doc/CLIENT_API.md)
- [FAQ (Chinese)](doc/FAQ.md)
### Advanced features and development
- [Compile from source code (Chinese)](doc/COMPILE.md)
## Contribution
If you want to contribute code to Paddle Serving, please refer to the [Contribution Guidelines](doc/CONTRIBUTE.md).