diff --git a/deploy/lite/readme.md b/deploy/lite/readme.md
index 219cc83fe4487400e886e47e46cb30275ba72c14..bb52a02ea497a0611ea8210b5d5b4a9f9d429828 100644
--- a/deploy/lite/readme.md
+++ b/deploy/lite/readme.md
@@ -15,10 +15,9 @@ Paddle Lite is PaddlePaddle's lightweight inference engine, providing efficient
 The cross-compilation environment is used to build Paddle Lite and the C++ demo of PaddleOCR.
 Multiple development environments are supported; please refer to the corresponding documentation for each environment's build process.
-1. [Docker](https://paddle-lite.readthedocs.io/zh/latest/user_guides/source_compile.html#docker)
-2. [Linux](https://paddle-lite.readthedocs.io/zh/latest/user_guides/source_compile.html#android)
-3. [MAC OS](https://paddle-lite.readthedocs.io/zh/latest/user_guides/source_compile.html#id13)
-4. [Windows](https://paddle-lite.readthedocs.io/zh/latest/demo_guides/x86.html#id4)
+1. [Docker](https://paddle-lite.readthedocs.io/zh/latest/source_compile/compile_env.html#docker)
+2. [Linux](https://paddle-lite.readthedocs.io/zh/latest/source_compile/compile_env.html#linux)
+3. [MAC OS](https://paddle-lite.readthedocs.io/zh/latest/source_compile/compile_env.html#mac-os)
 
 ### 1.2 Prepare the prediction library
 
@@ -28,15 +27,15 @@ Paddle Lite is PaddlePaddle's lightweight inference engine, providing efficient
   |-|-|
   |Android|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/Android/inference_lite_lib.android.armv7.gcc.c++_static.with_extra.CV_ON.tar.gz) / [arm8](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/Android/inference_lite_lib.android.armv8.gcc.c++_static.with_extra.CV_ON.tar.gz)|
   |IOS|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/iOS/inference_lite_lib.ios.armv7.with_extra.CV_ON.tar.gz) / [arm8](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/iOS/inference_lite_lib.ios64.armv8.with_extra.CV_ON.tar.gz)|
-  |x86(Linux)|[prediction library](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/X86/Linux/inference_lite_lib.x86.linux.tar.gz)|
-  Note: If you download the prediction library from the Paddle-Lite [official documentation](https://paddle-lite.readthedocs.io/zh/latest/user_guides/release_lib.html#android-toolchain-gcc),
-  be sure to choose the download link with `with_extra=ON,with_cv=ON`.
+  Note: 1. If you download the prediction library from the Paddle-Lite [official documentation](https://paddle-lite.readthedocs.io/zh/latest/user_guides/release_lib.html#android-toolchain-gcc),
+  be sure to choose the download link with `with_extra=ON,with_cv=ON`. 2. To deploy a quantized model on the device side, it is recommended to build the prediction library from the Paddle-Lite develop branch.
-- 2. Build Paddle-Lite from source to get the prediction library. Paddle-Lite is compiled as follows:
+- 2. [Recommended] Build Paddle-Lite from source to get the prediction library. Paddle-Lite is compiled as follows:
 ```
 git clone https://github.com/PaddlePaddle/Paddle-Lite.git
 cd Paddle-Lite
+# Be sure to use the develop branch to build the prediction library
 git checkout develop
 ./lite/tools/build_android.sh --arch=armv8 --with_cv=ON --with_extra=ON
 ```
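
After `build_android.sh` finishes, it helps to confirm the prediction library was actually produced before moving on. A minimal sketch, assuming Paddle-Lite's usual output layout for an armv8 gcc build; the directory name varies with `--arch` and toolchain, so treat the path as an assumption:

```
# Quick sanity check of the build output (path assumed from Paddle-Lite's
# typical layout for --arch=armv8 with the gcc toolchain; adjust to yours).
ls build.lite.android.armv8.gcc/inference_lite_lib.android.armv8/cxx/lib
# This directory should contain libpaddle_light_api_shared.so, which the
# OCR demo links against.
```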
@@ -82,9 +81,12 @@ Paddle-Lite provides a variety of strategies to automatically optimize the original model, including
 The following table provides optimized ultra-lightweight Chinese models:
 
-|Introduction|Detection model|Recognition model|Paddle-Lite branch|
-|-|-|-|-|
-|Ultra-lightweight Chinese OCR opt-optimized model|[Download](https://paddleocr.bj.bcebos.com/deploy/lite/ch_det_mv3_db_opt.nb)|[Download](https://paddleocr.bj.bcebos.com/deploy/lite/ch_rec_mv3_crnn_opt.nb)|develop|
+|Version|Introduction|Model size|Detection model|Text direction classifier|Recognition model|Paddle-Lite branch|
+|-|-|-|-|-|-|-|
+|V1.1|Ultra-lightweight Chinese OCR mobile model|3.0M|[Download](https://paddleocr.bj.bcebos.com/20-09-22/mobile-slim/det/ch_ppocr_mobile_v1.1_det_prune_opt.nb)|[Download](https://paddleocr.bj.bcebos.com/20-09-22/cls/ch_ppocr_mobile_cls_quant_opt.nb)|[Download](https://paddleocr.bj.bcebos.com/20-09-22/mobile-slim/rec/ch_ppocr_mobile_v1.1_rec_quant_opt.nb)|develop|
+|V1.0|Lightweight Chinese OCR mobile model|8.6M|[Download](https://paddleocr.bj.bcebos.com/deploy/lite/ch_det_mv3_db_opt.nb)|---|[Download](https://paddleocr.bj.bcebos.com/deploy/lite/ch_rec_mv3_crnn_opt.nb)|develop|
+
+Note: The V1.1 3.0M lightweight model is optimized with PaddleSlim and must be used together with the latest Paddle-Lite prediction library.
 
 If you deploy directly with the models in the table above, you can skip the steps below and go straight to [Section 2.2](#2.2与手机联调).
 
@@ -182,7 +184,7 @@ wget https://paddleocr.bj.bcebos.com/ch_models/ch_rec_mv3_crnn_infer.tar && tar
 ```
 Prepare a test image, e.g. `PaddleOCR/doc/imgs/11.jpg`, and copy it to the `demo/cxx/ocr/debug/` folder.
-Prepare the model files optimized by the lite opt tool, `ch_det_mv3_db_opt.nb,ch_rec_mv3_crnn_opt.nb`, and place them in the `demo/cxx/ocr/debug/` folder.
+Prepare the model files optimized by the lite opt tool, e.g. `ch_det_mv3_db_opt.nb,ch_rec_mv3_crnn_opt.nb`, and place the model files in the `demo/cxx/ocr/debug/` folder.
 
 After this is done, the ocr folder will contain the following files:
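
The hunk above ends at the step that stages the models and test image in `demo/cxx/ocr/debug/`; the readme's next step is running the demo on a phone. A rough sketch of that flow follows. The `ocr_db_crnn` binary name, the argument order, and the `/data/local/tmp` device paths are assumptions based on this demo's layout, not confirmed by the patch:

```
# Push the staged debug folder to an Android device and run the OCR demo.
# Binary name, argument order, and device paths are assumptions; check the
# demo's own usage message before relying on them.
adb push demo/cxx/ocr/debug /data/local/tmp/debug
adb shell 'cd /data/local/tmp/debug && export LD_LIBRARY_PATH=. && \
  ./ocr_db_crnn ch_det_mv3_db_opt.nb ch_rec_mv3_crnn_opt.nb ./11.jpg'
```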
diff --git a/deploy/lite/readme_en.md b/deploy/lite/readme_en.md
index bf2f4b2212bf4d362fe2127da66127ab05766668..ec7d1422eb82793c3e2958a9b22c5780fb936657 100644
--- a/deploy/lite/readme_en.md
+++ b/deploy/lite/readme_en.md
@@ -14,10 +14,9 @@ deployment solutions for end-side deployment issues.
 - Mobile phone (arm7 or arm8)
 
 ## 2. Build PaddleLite library
-[build for Docker](https://paddle-lite.readthedocs.io/zh/latest/user_guides/source_compile.html#docker)
-[build for Linux](https://paddle-lite.readthedocs.io/zh/latest/user_guides/source_compile.html#android)
-[build for MAC OS](https://paddle-lite.readthedocs.io/zh/latest/user_guides/source_compile.html#id13)
-[build for windows](https://paddle-lite.readthedocs.io/zh/latest/demo_guides/x86.html#id4)
+1. [Docker](https://paddle-lite.readthedocs.io/zh/latest/source_compile/compile_env.html#docker)
+2. [Linux](https://paddle-lite.readthedocs.io/zh/latest/source_compile/compile_env.html#linux)
+3. [MAC OS](https://paddle-lite.readthedocs.io/zh/latest/source_compile/compile_env.html#mac-os)
 
 ## 3. Download prebuild library for android and ios
 
@@ -25,7 +24,8 @@ deployment solutions for end-side deployment issues.
 |-|-|
 |Android|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/Android/inference_lite_lib.android.armv7.gcc.c++_static.with_extra.CV_ON.tar.gz) / [arm8](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/Android/inference_lite_lib.android.armv8.gcc.c++_static.with_extra.CV_ON.tar.gz)|
 |IOS|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/iOS/inference_lite_lib.ios.armv7.with_extra.CV_ON.tar.gz) / [arm8](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/iOS/inference_lite_lib.ios64.armv8.with_extra.CV_ON.tar.gz)|
-|x86(Linux)|[prediction library](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/X86/Linux/inference_lite_lib.x86.linux.tar.gz)|
+
+Note: It is recommended to build the prebuilt library from the [Paddle-Lite](https://github.com/PaddlePaddle/Paddle-Lite) develop branch if you want to deploy a [quantized](https://github.com/PaddlePaddle/PaddleOCR/blob/develop/deploy/slim/quantization/README_en.md) model to a mobile phone.
 
 The structure of the prediction library is as follows:
 
@@ -64,9 +64,10 @@ Paddle Lite provides a variety of strategies to automatically optimize the origi
 If you use PaddleOCR 8.6M OCR model to deploy, you can directly download the optimized model.
 
-|Introduction|Detection model|Recognition model|Paddle Lite branch|
-|-|-|-|-|
-|lightweight Chinese OCR optimized model|[Download](https://paddleocr.bj.bcebos.com/deploy/lite/ch_det_mv3_db_opt.nb)|[Download](https://paddleocr.bj.bcebos.com/deploy/lite/ch_rec_mv3_crnn_opt.nb)|develop|
+|Version|Introduction|Model size|Detection model|Text direction classifier|Recognition model|Paddle Lite branch|
+|-|-|-|-|-|-|-|
+|V1.1|Extra-lightweight Chinese OCR optimized model|3.0M|[Download](https://paddleocr.bj.bcebos.com/20-09-22/mobile-slim/det/ch_ppocr_mobile_v1.1_det_prune_opt.nb)|[Download](https://paddleocr.bj.bcebos.com/20-09-22/cls/ch_ppocr_mobile_cls_quant_opt.nb)|[Download](https://paddleocr.bj.bcebos.com/20-09-22/mobile-slim/rec/ch_ppocr_mobile_v1.1_rec_quant_opt.nb)|develop|
+|V1.0|Lightweight Chinese OCR optimized model|8.6M|[Download](https://paddleocr.bj.bcebos.com/deploy/lite/ch_det_mv3_db_opt.nb)|---|[Download](https://paddleocr.bj.bcebos.com/deploy/lite/ch_rec_mv3_crnn_opt.nb)|develop|
 
 If the model to be deployed is not in the above table, you need to follow the steps below to obtain the optimized model.
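
For the "steps below" that the closing line refers to, the conversion runs through Paddle-Lite's `opt` tool. A minimal sketch, assuming an inference model directory named `ch_det_mv3_db` (a placeholder for whichever model you downloaded); verify the flags against the Paddle-Lite version you built:

```
# Convert a Paddle inference model to the .nb format used in the tables above.
# --model_dir is a placeholder for your inference model directory;
# naive_buffer and arm match the mobile deployment targets in this readme.
./opt --model_dir=./ch_det_mv3_db \
      --optimize_out_type=naive_buffer \
      --optimize_out=./ch_det_mv3_db_opt \
      --valid_targets=arm
```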