Commit 74c11bc8 authored by HexToString

fix doc style for ocr

Parent 088dc39b
@@ -119,6 +119,7 @@ you can execute `make install` to put targets under directory `./output`, you ne
### Compile C++ Server under the condition of WITH_OPENCV=ON
First of all, the OpenCV library should be installed; if not, please refer to the `Compile and install opencv` section later in this article.
In the compile command, add the `-DOPENCV_DIR=${OPENCV_DIR}` and `-DWITH_OPENCV=ON` options, for example:
``` shell
OPENCV_DIR=your_opencv_dir # `your_opencv_dir` is the installation path of the OpenCV library.
```
......
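For reference, here is a minimal sketch of where those two options fit into a server build command. The build directory name, `-DSERVER=ON`, and the omission of the usual Python-related `-D` flags are assumptions based on a typical Serving C++ Server build, not part of this commit:

``` shell
# Hypothetical build sketch: add the OpenCV options to the usual server cmake invocation.
OPENCV_DIR=your_opencv_dir          # path to your local OpenCV installation
mkdir -p server-build-cpu && cd server-build-cpu
cmake -DSERVER=ON \
      -DWITH_OPENCV=ON \
      -DOPENCV_DIR=${OPENCV_DIR} \
      ..                            # add your usual Python-related -D flags here
make -j10
```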
@@ -118,6 +118,7 @@ make -j10
### Compile C++ Server with the WITH_OPENCV option enabled
When compiling the Serving C++ Server with the WITH_OPENCV option enabled, the OpenCV library must be installed; if it is not, refer to the instructions later in this document to compile and install OpenCV.
In the compile command, add the `-DOPENCV_DIR=${OPENCV_DIR}` and `-DWITH_OPENCV=ON` options, for example:
``` shell
OPENCV_DIR=your_opencv_dir # `your_opencv_dir` is the installation path of the OpenCV library.
```
......
@@ -19,6 +19,7 @@ tar xf test_imgs.tar
### Start Service
Select a startup mode according to your CPU/GPU device.
Pass the folder paths of multiple models after the `--model` parameter to start a prediction service that chains multiple models together.
```
#for cpu user
@@ -28,8 +29,10 @@ python -m paddle_serving_server_gpu.serve --model ocr_det_model ocr_rec_model --
```
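For reference, a sketch of what the full start-up commands look like for this example; the port number is a placeholder, and the exact name of the GPU-selection flag may differ between Serving versions:

```
#for cpu user
python -m paddle_serving_server.serve --model ocr_det_model ocr_rec_model --port 9293
#for gpu user (add the GPU-selection flag of your Serving version, e.g. --gpu_ids 0)
python -m paddle_serving_server_gpu.serve --model ocr_det_model ocr_rec_model --port 9293
```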
### Client Prediction
The pre-processing and post-processing are done in the C++ Server part; only the image's Base64 encoded string is passed into the C++ Server,
so the `feed_var` field in the first model's config file `ocr_det_client/serving_client_conf.prototxt` should be changed.
For this case, `feed_type` should be 3 (which means the data type is string) and `shape` should be 1.
By passing in multiple client config folder paths, the client can be started for multi-model prediction.
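As an illustration, after the change the `feed_var` section of `ocr_det_client/serving_client_conf.prototxt` would look roughly like the sketch below; the variable name and alias are placeholders, so keep whatever names your exported config already contains:

```
feed_var {
  name: "image"        # placeholder: keep the name from your exported config
  alias_name: "image"  # placeholder alias
  is_lod_tensor: false
  feed_type: 3         # 3 = string, i.e. the Base64-encoded image bytes
  shape: 1
}
```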
......
@@ -19,6 +19,7 @@ tar xf test_imgs.tar
### Start the service
Select a startup mode according to your CPU/GPU device.
Specify the folder paths of multiple models after `--model` to start a prediction service that chains multiple models together.
```
#for cpu user
@@ -29,8 +30,11 @@ python -m paddle_serving_server_gpu.serve --model ocr_det_model ocr_rec_model --
### Start the client
Since pre- and post-processing are done in the C++ Server part and only the image's Base64 encoded string is passed into the C++ Server, the client config of the first model needs to be modified,
that is, the `feed_var` field in `ocr_det_client/serving_client_conf.prototxt`.
For this example, `feed_type` should be changed to 3 (the data type is string) and `shape` to 1.
Start the client for prediction by appending the client config folder paths of the multiple models to the client launch command.
```
python ocr_c_client_bytes.py ocr_det_client ocr_rec_client
```
......