# OCR Pipeline WebService 

(English|[简体中文](./README_CN.md))

This document uses OCR as an example to show how to use Pipeline WebService to start a service that chains multiple models in series.

## Get Model
```
python -m paddle_serving_app.package --get_model ocr_rec
tar -xzvf ocr_rec.tar.gz
python -m paddle_serving_app.package --get_model ocr_det
tar -xzvf ocr_det.tar.gz
```

## Get Dataset (Optional)
```
wget --no-check-certificate https://paddle-serving.bj.bcebos.com/ocr/test_imgs.tar
tar xf test_imgs.tar
```

## Start Service
```
python web_service.py &>log.txt &
```
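The pipeline web service reads its settings from a configuration file in the example directory. The sketch below shows the general shape such a pipeline configuration might take, with a detection op feeding a recognition op; the port number, field names, and model paths here are assumptions for illustration, so refer to the actual config file shipped with the example:

```yaml
# Hypothetical sketch of a pipeline config; all field names and values
# below are assumptions, not the example's actual configuration.
http_port: 9999        # HTTP port the web service listens on
worker_num: 1          # number of worker processes

op:
    det:                 # text detection stage, runs first
        concurrency: 1
        local_service_conf:
            model_config: ocr_det_model
    rec:                 # text recognition stage, consumes det's output
        concurrency: 1
        local_service_conf:
            model_config: ocr_rec_model
```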

## Test
```
python pipeline_http_client.py
```
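Under the hood, an HTTP pipeline client posts a base64-encoded image to the service and receives the recognized text back. A minimal sketch of building such a request payload is shown below; the `key`/`value` field names and the endpoint URL are assumptions based on the usual pipeline HTTP convention, so check `pipeline_http_client.py` for the exact format:

```python
import base64
import json


def build_request(image_bytes):
    """Pack raw image bytes into a JSON-serializable request body.

    The key/value field names are assumptions for illustration.
    """
    b64 = base64.b64encode(image_bytes).decode("utf-8")
    return {"key": ["image"], "value": [b64]}


if __name__ == "__main__":
    # In the real client, image_bytes would be read from a test image file.
    payload = build_request(b"\x89PNG...fake image bytes...")
    # Hypothetical endpoint; the actual port comes from the service config:
    # requests.post("http://127.0.0.1:9999/ocr/prediction",
    #               data=json.dumps(payload))
    print(list(payload.keys()))  # prints ['key', 'value']
```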



<!--
## More (PipelineServing)

You can choose one of the following versions to start the service.

### Remote Service Version
```
python -m paddle_serving_server_gpu.serve --model ocr_det_model --port 12000 --gpu_id 0 &> det.log &
python -m paddle_serving_server_gpu.serve --model ocr_rec_model --port 12001 --gpu_id 0 &> rec.log &
python remote_service_pipeline_server.py &>pipeline.log &
```

### Local Service Version
```
python local_service_pipeline_server.py &>pipeline.log &
```

### Hybrid Service Version
```
python -m paddle_serving_server_gpu.serve --model ocr_rec_model --port 12001 --gpu_id 0 &> rec.log &
python hybrid_service_pipeline_server.py &>pipeline.log &
```

## Client Prediction

### RPC
```
python pipeline_rpc_client.py
```

### HTTP
```
python pipeline_http_client.py
```
-->