([简体中文](./README_CN.md)|English)

<p align="center">
    <br>
<img src='doc/serving_logo.png' width = "600" height = "130">
    <br>
<p>


<p align="center">
    <br>
    <a href="https://travis-ci.com/PaddlePaddle/Serving">
        <img alt="Build Status" src="https://img.shields.io/travis/com/PaddlePaddle/Serving/develop">
    </a>
    <img alt="Release" src="https://img.shields.io/badge/Release-0.0.3-yellowgreen">
    <img alt="Issues" src="https://img.shields.io/github/issues/PaddlePaddle/Serving">
    <img alt="License" src="https://img.shields.io/github/license/PaddlePaddle/Serving">
    <img alt="Slack" src="https://img.shields.io/badge/Join-Slack-green">
    <br>
<p>

<h2 align="center">Motivation</h2>

We consider deploying deep learning inference service online to be a user-facing application in the future. **The goal of this project**: When you have trained a deep neural net with [Paddle](https://github.com/PaddlePaddle/Paddle), you are also capable to deploy the model online easily. A demo of Paddle Serving is as follows:
<p align="center">
    <img src="doc/demo.gif" width="700">
</p>


<h2 align="center">Installation</h2>

We **highly recommend** running **Paddle Serving in Docker**. Please see [Run in Docker](https://github.com/PaddlePaddle/Serving/blob/develop/doc/RUN_IN_DOCKER.md).
```
# Run CPU Docker
docker pull hub.baidubce.com/paddlepaddle/serving:latest
docker run -p 9292:9292 --name test -dit hub.baidubce.com/paddlepaddle/serving:latest
docker exec -it test bash
```
```
# Run GPU Docker
nvidia-docker pull hub.baidubce.com/paddlepaddle/serving:latest-gpu
nvidia-docker run -p 9292:9292 --name test -dit hub.baidubce.com/paddlepaddle/serving:latest-gpu
nvidia-docker exec -it test bash
```

```shell
pip install paddle-serving-client 
pip install paddle-serving-server # CPU
pip install paddle-serving-server-gpu # GPU
```

You may need to use a domestic mirror source to speed up the download (in China, you can use the Tsinghua mirror by adding `-i https://pypi.tuna.tsinghua.edu.cn/simple` to the pip command).

If you need to install packages compiled from the develop branch, please download them from the [latest packages list](./doc/LATEST_PACKAGES.md) and install them with the `pip install` command.

The client package supports CentOS 7 and Ubuntu 18; alternatively, you can use the HTTP service without installing the client.
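As a quick sanity check (a minimal sketch, not part of the official tooling), you can verify from Python that a pip-installed wheel such as `paddle-serving-client` is importable:

```python
# Minimal sketch: check whether a pip-installed package (e.g. the
# paddle-serving-client wheel) can be located in the current environment.
import importlib.util


def check_install(module_name="paddle_serving_client"):
    # find_spec returns None when the module cannot be located
    return importlib.util.find_spec(module_name) is not None


if __name__ == "__main__":
    print("paddle_serving_client installed:", check_install())
```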


<h2 align="center"> Pre-built services with Paddle Serving</h2>

<h3 align="center">Chinese Word Segmentation</h3>

- **Description**: 
``` shell
A Chinese word segmentation HTTP service that can be deployed with a single command.
```

- **Demo**: 
``` shell
> python -m paddle_serving_app.package -get lac
> tar -xzf lac.tar.gz
> python lac_web_service.py 9292 &
> curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"words": "我爱北京天安门"}], "fetch":["word_seg"]}' http://127.0.0.1:9292/lac/prediction
{"result":[{"word_seg":"我|爱|北京|天安门"}]}
```
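The same request can be issued from Python with the standard library. This is a hedged sketch: it assumes the word-segmentation service above is running locally on port 9292, and the payload shape simply mirrors the curl example.

```python
# Minimal sketch: call the word-segmentation HTTP service with urllib
# (assumes lac_web_service.py from above is listening on port 9292).
import json
import urllib.request


def build_payload(sentence):
    # Same JSON body as the curl example above.
    return {"feed": [{"words": sentence}], "fetch": ["word_seg"]}


def segment(sentence, url="http://127.0.0.1:9292/lac/prediction"):
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(sentence)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


# With the service running:
#   print(segment("我爱北京天安门"))
```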

<h3 align="center">Image Classification</h3>

- **Description**: 
``` shell
An image classification model trained on the ImageNet dataset. A label and its corresponding probability will be returned.
Note: This demo requires paddle-serving-server-gpu.
```

- **Download Servable Package**: 
``` shell
wget --no-check-certificate https://paddle-serving.bj.bcebos.com/imagenet-example/imagenet_demo.tar.gz
```
- **Host web service**: 
``` shell
tar -xzf imagenet_demo.tar.gz
python image_classification_service_demo.py resnet50_serving_model
```
- **Request sample**: 

<p align="center">
    <br>
<img src='https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg' width = "200" height = "200">
    <br>
<p>

``` shell
curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg"}], "fetch": ["score"]}' http://127.0.0.1:9292/image/prediction
```
- **Request result**: 
``` shell
{"label":"daisy","prob":0.9341403245925903}
```



<h2 align="center">Some Key Features</h2>

- Integrates with the Paddle training pipeline seamlessly; most Paddle models can be deployed **with one command**.
- **Industrial serving features** supported, such as model management, online loading, and online A/B testing.
- **Distributed key-value indexing** supported, which is especially useful for large-scale sparse features as model inputs.
- **Highly concurrent and efficient communication** between clients and servers.
- **Multiple programming languages** supported on the client side, such as Golang, C++, and Python.
- **Extensible framework design** which can support model serving beyond Paddle.


<h2 align="center">Quick Start Example</h2>

### Boston House Price Prediction model
``` shell
wget --no-check-certificate https://paddle-serving.bj.bcebos.com/uci_housing.tar.gz
tar -xzf uci_housing.tar.gz
```

Paddle Serving provides HTTP and RPC based services for users to access.

### HTTP service

Paddle Serving provides a built-in Python module called `paddle_serving_server.serve` that can start an RPC service or an HTTP service with a one-line command. If we specify the argument `--name uci`, we get an HTTP service whose URL is `$IP:$PORT/uci/prediction`.
``` shell
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9292 --name uci
```
<center>

| Argument | Type | Default | Description |
|--------------|------|-----------|--------------------------------|
| `thread` | int | `4` | Concurrency of current service |
| `port` | int | `9292` | Exposed port of current service to users|
| `name` | str | `""` | Service name, can be used to generate HTTP request url |
| `model` | str | `""` | Path of paddle model directory to be served |
| `mem_optim` | bool | `False` | Enable memory / graphic memory optimization |
| `ir_optim` | bool | `False` | Enable analysis and optimization of calculation graph |
| `use_mkl` (Only for cpu version) | bool | `False` | Run inference with MKL |

Here, we use `curl` to send an HTTP POST request to the service we just started. Users can use any Python library to send an HTTP POST as well, e.g., [requests](https://requests.readthedocs.io/en/master/).
</center>

``` shell
curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"x": [0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0727, -0.1583, -0.0584, 0.6283, 0.4919, 0.1856, 0.0795, -0.0332]}], "fetch":["price"]}' http://127.0.0.1:9292/uci/prediction
```
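As noted above, the same POST can be sent from Python. The sketch below uses `requests` (a third-party library, installed separately) and assumes the uci service started above is listening on localhost:9292; the payload mirrors the curl body.

```python
# Minimal sketch: send the prediction request with the `requests`
# library instead of curl (assumes the uci service above is running
# on localhost:9292).
X = [0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0727,
     -0.1583, -0.0584, 0.6283, 0.4919, 0.1856, 0.0795, -0.0332]


def build_payload(x):
    # Same JSON body as the curl example above.
    return {"feed": [{"x": x}], "fetch": ["price"]}


def predict_price(x, url="http://127.0.0.1:9292/uci/prediction"):
    import requests  # third-party: pip install requests
    resp = requests.post(url, json=build_payload(x))
    return resp.json()


# With the service running:
#   print(predict_price(X))
```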

### RPC service

A user can also start an RPC service with `paddle_serving_server.serve`. The RPC service is usually faster than the HTTP service, although a user needs to do some coding based on Paddle Serving's Python client API. Note that we do not specify `--name` here.
``` shell
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9292
```

``` python
# A user can access the RPC service through the paddle_serving_client API
from paddle_serving_client import Client

client = Client()
client.load_client_config("uci_housing_client/serving_client_conf.prototxt")
client.connect(["127.0.0.1:9292"])
data = [0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0727,
        -0.1583, -0.0584, 0.6283, 0.4919, 0.1856, 0.0795, -0.0332]
fetch_map = client.predict(feed={"x": data}, fetch=["price"])
print(fetch_map)

```
Here, the `client.predict` function takes two arguments. `feed` is a Python dict mapping model input variable alias names to values. `fetch` specifies the prediction variables to be returned from the server. In the example, the names `"x"` and `"price"` were assigned when the servable model was saved during training.
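To score several inputs, one straightforward pattern is to call `client.predict` once per row. The helper below is a hedged sketch: it only reshapes data and assumes the same connected client and the `"x"`/`"price"` names from the example above.

```python
# Minimal sketch: score multiple feature rows one by one through the
# RPC client (assumes the uci service and the "x"/"price" names above).
def to_feeds(rows):
    # Wrap each raw feature row in the feed-dict format predict expects.
    return [{"x": row} for row in rows]


def predict_rows(client, rows, fetch=("price",)):
    # client is a connected paddle_serving_client.Client, as above.
    return [client.predict(feed=f, fetch=list(fetch)) for f in to_feeds(rows)]


# With a connected client:
#   fetch_maps = predict_rows(client, [data])
```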

<h3 align="center">More Demos</h3>

| Key                | Value                                                        |
| :----------------- | :----------------------------------------------------------- |
| Model Name         | Bert-Base-Baike                                              |
| URL                | [https://paddle-serving.bj.bcebos.com/bert_example/bert_seq128.tar.gz](https://paddle-serving.bj.bcebos.com/bert_example%2Fbert_seq128.tar.gz) |
| Client/Server Code | https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/bert |
| Description        | Get semantic representation from a Chinese Sentence          |



| Key                | Value                                                        |
| :----------------- | :----------------------------------------------------------- |
| Model Name         | Resnet50-Imagenet                                            |
| URL                | [https://paddle-serving.bj.bcebos.com/imagenet-example/ResNet50_vd.tar.gz](https://paddle-serving.bj.bcebos.com/imagenet-example%2FResNet50_vd.tar.gz) |
| Client/Server Code | https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/imagenet |
| Description        | Get image semantic representation from an image              |



| Key                | Value                                                        |
| :----------------- | :----------------------------------------------------------- |
| Model Name         | Resnet101-Imagenet                                           |
| URL                | https://paddle-serving.bj.bcebos.com/imagenet-example/ResNet101_vd.tar.gz |
| Client/Server Code | https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/imagenet |
| Description        | Get image semantic representation from an image              |



| Key                | Value                                                        |
| :----------------- | :----------------------------------------------------------- |
| Model Name         | CNN-IMDB                                                     |
| URL                | https://paddle-serving.bj.bcebos.com/imdb-demo/imdb_model.tar.gz |
| Client/Server Code | https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/imdb |
| Description        | Get category probability from an English Sentence            |



| Key                | Value                                                        |
| :----------------- | :----------------------------------------------------------- |
| Model Name         | LSTM-IMDB                                                    |
| URL                | https://paddle-serving.bj.bcebos.com/imdb-demo/imdb_model.tar.gz |
| Client/Server Code | https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/imdb |
| Description        | Get category probability from an English Sentence            |



| Key                | Value                                                        |
| :----------------- | :----------------------------------------------------------- |
| Model Name         | BOW-IMDB                                                     |
| URL                | https://paddle-serving.bj.bcebos.com/imdb-demo/imdb_model.tar.gz |
| Client/Server Code | https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/imdb |
| Description        | Get category probability from an English Sentence            |



| Key                | Value                                                        |
| :----------------- | :----------------------------------------------------------- |
| Model Name         | Jieba-LAC                                                    |
| URL                | https://paddle-serving.bj.bcebos.com/lac/lac_model.tar.gz    |
| Client/Server Code | https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/lac |
| Description        | Get word segmentation from a Chinese Sentence                |



| Key                | Value                                                        |
| :----------------- | :----------------------------------------------------------- |
| Model Name         | DNN-CTR                                                      |
| URL                | https://paddle-serving.bj.bcebos.com/criteo_ctr_example/criteo_ctr_demo_model.tar.gz                            |
| Client/Server Code | https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/criteo_ctr |
| Description        | Get click probability from a feature vector of item          |


<h2 align="center">Document</h2>

### New to Paddle Serving
- [How to save a servable model?](doc/SAVE.md)
- [An End-to-end tutorial from training to inference service deployment](doc/TRAIN_TO_SERVICE.md)
- [Write Bert-as-Service in 10 minutes](doc/BERT_10_MINS.md)

### Developers
- [How to configure Serving native operators on the server side?](doc/SERVER_DAG.md)
- [How to develop a new Serving operator?](doc/NEW_OPERATOR.md)
- [How to develop a new Web Service?](doc/NEW_WEB_SERVICE.md)
- [Golang client](doc/IMDB_GO_CLIENT.md)
- [Compile from source code](doc/COMPILE.md)
- [Deploy Web Service with uWSGI](doc/UWSGI_DEPLOY.md)
- [Hot loading for model file](doc/HOT_LOADING_IN_SERVING.md)

### About Efficiency
- [How to profile Paddle Serving latency?](python/examples/util)
- [How to optimize performance?(Chinese)](doc/PERFORMANCE_OPTIM_CN.md)
- [Deploy multi-services on one GPU(Chinese)](doc/MULTI_SERVICE_ON_ONE_GPU_CN.md)
- [CPU Benchmarks(Chinese)](doc/BENCHMARKING.md)
- [GPU Benchmarks(Chinese)](doc/GPU_BENCHMARKING.md)

### FAQ
- [FAQ(Chinese)](doc/deprecated/FAQ.md)


### Design
- [Design Doc](doc/DESIGN_DOC.md)

<h2 align="center">Community</h2>

### Slack

To connect with other users and contributors, you are welcome to join our [Slack channel](https://paddleserving.slack.com/archives/CUBPKHKMJ).

### Contribution

If you want to contribute code to Paddle Serving, please refer to the [Contribution Guidelines](doc/CONTRIBUTE.md).

### Feedback

For any feedback or to report a bug, please open a [GitHub Issue](https://github.com/PaddlePaddle/Serving/issues).

### License

[Apache 2.0 License](https://github.com/PaddlePaddle/Serving/blob/develop/LICENSE)