<img src='https://paddle-serving.bj.bcebos.com/imdb-demo%2FLogoMakr-3Bd2NM-300dpi.png' width = "550" height = "170">

An easy-to-use Machine Learning Model Inference Service Deployment Tool

[![Release](https://img.shields.io/badge/Release-0.0.3-yellowgreen)](Release)
[![Issues](https://img.shields.io/github/issues/PaddlePaddle/Serving)](Issues)
[![License](https://img.shields.io/github/license/PaddlePaddle/Serving)](LICENSE)

[中文](./doc/README_CN.md)

## Motivation
Paddle Serving helps deep learning developers deploy an online inference service with minimal effort. Currently, Paddle Serving supports deep learning models trained with [Paddle](https://github.com/PaddlePaddle/Paddle), although inference engines from other deep learning frameworks can be integrated easily.

## Key Features
- Integrates with the Paddle training pipeline seamlessly; most Paddle models can be deployed **with a single command**.
- **Industrial serving features** supported, such as model management, online model loading, and online A/B testing.
- **Distributed key-value indexing** supported, which is especially useful for large-scale sparse features as model inputs.
- **Highly concurrent and efficient communication** between clients and servers.
- **Multiple programming languages** supported on the client side, such as Golang, C++ and Python.

## Installation

```shell
pip install paddle-serving-client
pip install paddle-serving-server
```
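
The packages above install the CPU client and server. For GPU inference, a separate GPU build of the server package is also published for some releases; the package name below is an assumption in this sketch, so check the release notes for availability before relying on it.

```shell
# GPU build of the serving server (availability depends on the release)
pip install paddle-serving-server-gpu
```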

## Quick Start Example

``` shell
wget --no-check-certificate https://paddle-serving.bj.bcebos.com/uci_housing.tar.gz
tar -xzf uci_housing.tar.gz
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9292
```
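
The last command starts a serving process from the downloaded model. Below is a restatement of the same command with its flags annotated; the flag descriptions reflect common usage and can be double-checked with `python -m paddle_serving_server.serve --help`.

``` shell
# --model : directory containing the exported server-side model (downloaded above)
# --thread: number of concurrent serving threads
# --port  : port the service listens on
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9292
```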

### Python Client Request

``` python
from paddle_serving_client import Client

client = Client()
client.load_client_config("uci_housing_client")
client.connect(["127.0.0.1:9292"])
# A single input sample with the 13 features of the UCI housing dataset.
data = [0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0727,
        -0.1583, -0.0584, 0.6283, 0.4919, 0.1856, 0.0795, -0.0332]
fetch_map = client.predict(feed={"x": data}, fetch=["price"])
print(fetch_map)

```
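
`fetch_map` is a dict keyed by the fetch variable names, so the prediction can be read out directly. Below is a minimal sketch of scoring several samples by reusing the client created above; the feed name `x` and fetch name `price` come from the quick start, and the second sample is made up purely for illustration.

``` python
# Reuse the connected client and `data` from the quick start above.
samples = [data, [v * 0.5 for v in data]]  # second sample is hypothetical
for sample in samples:
    result = client.predict(feed={"x": sample}, fetch=["price"])
    print("predicted price:", result["price"])
```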



### Documentation

[Design Doc (Chinese)](doc/DESIGN.md)

[How to configure Serving native operators on the server side?](doc/SERVER_OP.md)

[How to develop a new Serving operator](doc/OPERATOR.md)

[Client API for other programming languages](doc/CLIENT_API.md)

[FAQ (Chinese)](doc/FAQ.md)

### Advanced Features and Development

[Compile from source code (Chinese)](doc/COMPILE.md)

## Contribution

If you want to contribute code to Paddle Serving, please refer to the [Contribution Guidelines](doc/CONTRIBUTE.md).