# Paddle Serving
An easy-to-use Machine Learning Model Inference Service Deployment Tool

[![Release](https://img.shields.io/badge/Release-0.0.3-yellowgreen)](Release)
[![Issues](https://img.shields.io/github/issues/PaddlePaddle/Serving)](Issues)
[![License](https://img.shields.io/github/license/PaddlePaddle/Serving)](LICENSE)

[中文](./doc/README_CN.md)

Paddle Serving helps deep learning developers deploy an online inference service without much effort. Currently, Paddle Serving supports deep learning models produced by [Paddle](https://github.com/PaddlePaddle/Paddle), although it is easy to extend it to serve models from other deep learning frameworks' inference engines.

## Key Features
- Integrates seamlessly with the Paddle training pipeline; most Paddle models can be deployed with a single command.
- Supports industrial serving features such as multi-model management, online model loading, and online A/B testing.
- Supports distributed key-value indexing, which is especially useful for large-scale sparse features as model inputs.
- Provides highly concurrent and efficient communication built on [Baidu-rpc](https://github.com/apache/incubator-brpc).
- Supports multiple programming languages on the client side, such as Golang, C++, and Python.

## Quick Start

Paddle Serving provides a lightweight Python API for model inference and can be integrated seamlessly with the training process. Here is a Boston housing price prediction example to get you started quickly.

### Installation

```shell
pip install paddle-serving-client
pip install paddle-serving-server
```

### Save Model with Paddle
```python
# Training script for the UCI housing price regression task.
# After training, save the model with the Serving client I/O API.
# (Integrating this API with the Paddle API is work in progress.)
import paddle_serving_client.io as serving_io

# `x` and `y_predict` are the input and output variables of the
# training program defined earlier in the script.
serving_io.save_model(
    "uci_housing_model", "uci_housing_client",
    {"x": x}, {"price": y_predict}, fluid.default_main_program())
```

### Server Side Launch Command

```shell
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9292
```

### Client Side Scripts

```python
from paddle_serving_client import Client
import paddle
import sys

client = Client()
client.load_client_config(sys.argv[1])
client.connect(["127.0.0.1:9292"])

test_reader = paddle.batch(paddle.reader.shuffle(
    paddle.dataset.uci_housing.test(), buf_size=500), batch_size=1)

for data in test_reader():
    fetch_map = client.predict(feed={"x": data[0][0]}, fetch=["y"])
    # print the predicted price alongside the ground-truth label
    print("{} {}".format(fetch_map["y"][0], data[0][1][0]))
```
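For reference, the per-sample predictions and labels gathered in the loop above can be aggregated into a summary metric. Below is a minimal, hypothetical sketch (the `mean_abs_error` helper is not part of the Serving API) that computes a mean absolute error from collected `(prediction, label)` pairs; it is pure Python and does not require a running server:

```python
# Hypothetical helper: aggregate (prediction, label) pairs, e.g. collected
# from the client loop above, into a mean absolute error.
def mean_abs_error(pairs):
    if not pairs:
        raise ValueError("no prediction/label pairs given")
    return sum(abs(pred - label) for pred, label in pairs) / len(pairs)

# Example with dummy values standing in for real predictions:
print(mean_abs_error([(22.0, 24.0), (15.5, 15.0), (30.0, 27.5)]))
```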



### Documents

[Design Doc (Chinese)](doc/DESIGN.md)

[How to configure Serving native operators on the server side?](doc/SERVER_OP.md)

[How to develop a new Serving operator?](doc/OPERATOR.md)

[Client API for other programming languages](doc/CLIENT_API.md)

[FAQ (Chinese)](doc/FAQ.md)

### Advanced Features and Development

[Compile from source code (Chinese)](doc/COMPILE.md)

## Contribution

If you want to contribute code to Paddle Serving, please refer to the [Contribution Guidelines](doc/CONTRIBUTE.md).