Unverified commit 11b27b41, authored by topduke, committed by GitHub

Merge branch 'dygraph' into dygraph

...@@ -76,7 +76,7 @@ LK-PAN (Large Kernel PAN) is a lightweight [PAN](https://arxiv.org/pdf/1803.0153
**(2) DML: Deep Mutual Learning Strategy for Teacher Model**
[DML](https://arxiv.org/abs/1706.00384) (Deep Mutual Learning), as shown in the figure below, can effectively improve the accuracy of the text detection model by having two models with the same structure learn from each other. The DML strategy is adopted in teacher model training, raising the hmean from 85% to 86%. By updating the CML teacher model in PP-OCRv2 to this higher-precision one, the hmean of the student model improves further from 83.2% to 84.3%.
<div align="center">
......
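A minimal sketch of the mutual-learning term is given below. It assumes two peer models producing soft predictions; the function name, tensor shapes, and the paddle-based loss are illustrative assumptions, not PaddleOCR's actual distillation code.

```python
# Minimal sketch of the DML objective for two peer models with the same
# architecture; names and shapes are illustrative only.
import paddle
import paddle.nn.functional as F

def dml_kl_loss(logits_a, logits_b):
    """Symmetric KL divergence between the two peers' soft predictions."""
    log_p_a = F.log_softmax(logits_a, axis=-1)
    log_p_b = F.log_softmax(logits_b, axis=-1)
    p_a = F.softmax(logits_a, axis=-1)
    p_b = F.softmax(logits_b, axis=-1)
    kl_ab = (p_a * (log_p_a - log_p_b)).sum(axis=-1).mean()
    kl_ba = (p_b * (log_p_b - log_p_a)).sum(axis=-1).mean()
    return 0.5 * (kl_ab + kl_ba)

# Each peer's total loss adds this term to its own supervised loss,
# e.g. loss_a = supervised_loss_a + dml_kl_loss(logits_a, logits_b).
logits_a = paddle.rand([8, 100, 37])  # hypothetical batch of predictions
logits_b = paddle.rand([8, 100, 37])
print(float(dml_kl_loss(logits_a, logits_b)))
```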
===========================cpp_infer_params===========================
model_name:ch_PP-OCRv2
use_opencv:True
infer_model:./inference/ch_PP-OCRv2_det_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--rec_model_dir:./inference/ch_PP-OCRv2_rec_infer/
--benchmark:True
--det:True
--rec:True
--cls:False
--use_angle_cls:False
\ No newline at end of file
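For reference, a `cpp_infer_params` file like the one above is consumed by the TIPC test scripts, which splice each `--flag:value` pair onto the `ppocr` C++ binary's command line. The sketch below only illustrates that mapping; the helper function and the filled-in model directories are assumptions, not the actual `test_tipc` parsing code.

```python
# Hypothetical illustration of how a cpp_infer_params file maps onto a
# ./deploy/cpp_infer/build/ppocr invocation; not the real test_tipc logic.
def build_cpp_infer_cmd(params: dict) -> str:
    exe = params["inference"]  # base command, may already carry fixed flags
    flags = []
    for key, value in params.items():
        if key.startswith("--"):
            # "A|B" fields enumerate test settings; take the first one here.
            flags.append(f"{key}={value.split('|')[0]}")
    return " ".join([exe] + flags)

params = {
    "inference": "./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt",
    "--use_gpu": "True|False",
    "--det_model_dir": "./inference/ch_PP-OCRv2_det_infer/",  # assumed value
    "--rec_model_dir": "./inference/ch_PP-OCRv2_rec_infer/",
    "--image_dir": "./inference/ch_det_data_50/all-sum-510/",
    "--det": "True",
    "--rec": "True",
}
print(build_cpp_infer_cmd(params))
```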
===========================paddle2onnx_params===========================
model_name:ch_PP-OCRv2
python:python3.7
2onnx: paddle2onnx
--det_model_dir:./inference/ch_PP-OCRv2_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:./inference/det_v2_onnx/model.onnx
--rec_model_dir:./inference/ch_PP-OCRv2_rec_infer/
--rec_save_file:./inference/rec_v2_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_system.py --rec_image_shape="3,32,320"
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
\ No newline at end of file
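The `paddle2onnx_params` block above drives a conversion step that conceptually runs the `paddle2onnx` CLI once for the detection model and once for the recognition model. The sketch below shows the kind of command these fields translate into, using standard paddle2onnx options; treat it as an assumption about the mapping rather than the exact invocation produced by the test scripts.

```python
# Hypothetical sketch: build and run the paddle2onnx command for the
# detection model from the fields of a paddle2onnx_params file.
import subprocess

cmd = [
    "paddle2onnx",
    "--model_dir", "./inference/ch_PP-OCRv2_det_infer/",
    "--model_filename", "inference.pdmodel",
    "--params_filename", "inference.pdiparams",
    "--save_file", "./inference/det_v2_onnx/model.onnx",
    "--opset_version", "10",
    "--enable_onnx_checker", "True",
]
subprocess.run(cmd, check=True)  # repeat with the rec_* fields for the recognizer
```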
===========================cpp_infer_params===========================
model_name:ch_PP-OCRv2_det
use_opencv:True
infer_model:./inference/ch_PP-OCRv2_det_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
--benchmark:True
--det:True
--rec:False
--cls:False
--use_angle_cls:False
\ No newline at end of file
===========================paddle2onnx_params===========================
model_name:ch_PP-OCRv2_det
python:python3.7
2onnx: paddle2onnx
--det_model_dir:./inference/ch_PP-OCRv2_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:./inference/det_v2_onnx/model.onnx
--rec_model_dir:
--rec_save_file:
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_det.py
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
\ No newline at end of file
===========================train_params===========================
model_name:ch_PP-OCRv2_det
python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:fp32
Global.epoch_num:lite_train_lite_infer=1|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=2|whole_train_whole_infer=4
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
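In these `train_params` files, several values carry per-mode settings separated by `|`; for example, `Global.epoch_num:lite_train_lite_infer=1|whole_train_whole_infer=50` means 1 epoch for the quick smoke test and 50 for the full run. A small illustrative parser for such a line (the real test scripts do this with shell string handling, so this is only a sketch):

```python
# Illustrative parser for a TIPC "key:modeA=x|modeB=y" config line.
def parse_mode_values(line: str) -> dict:
    _, _, raw = line.partition(":")
    values = {}
    for chunk in raw.split("|"):
        mode, _, val = chunk.partition("=")
        values[mode] = val
    return values

epochs = parse_mode_values(
    "Global.epoch_num:lite_train_lite_infer=1|whole_train_whole_infer=50"
)
print(epochs)  # {'lite_train_lite_infer': '1', 'whole_train_whole_infer': '50'}
```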
===========================train_params===========================
model_name:ch_PP-OCRv2_det_PACT
python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:fp32
Global.epoch_num:lite_train_lite_infer=1|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=1|whole_train_whole_infer=4
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
===========================cpp_infer_params===========================
model_name:ch_PP-OCRv2_rec
use_opencv:True
infer_model:./inference/ch_PP-OCRv2_rec_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference/
null:null
--benchmark:True
--det:False
--rec:True
--cls:False
--use_angle_cls:False
\ No newline at end of file
===========================paddle2onnx_params===========================
model_name:ch_PP-OCRv2_rec
python:python3.7
2onnx: paddle2onnx
--det_model_dir:
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:
--rec_model_dir:./inference/ch_PP-OCRv2_rec_infer/
--rec_save_file:./inference/rec_v2_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_rec.py --rec_image_shape="3,32,320"
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/rec_inference/
\ No newline at end of file
===========================train_params===========================
model_name:ch_PP-OCRv2_rec
python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:fp32
Global.epoch_num:lite_train_lite_infer=3|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=16|whole_train_whole_infer=128
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_rec.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
null:null
......
===========================train_params===========================
model_name:ch_PP-OCRv2_rec_PACT
python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:fp32
Global.epoch_num:lite_train_lite_infer=6|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=16|whole_train_whole_infer=128
Global.pretrained_model:pretrain_models/ch_PP-OCRv2_rec_train/best_accuracy
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:True
inference:tools/infer/predict_rec.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
null:null
......
===========================cpp_infer_params===========================
model_name:ch_PP-OCRv3
use_opencv:True
infer_model:./inference/ch_PP-OCRv3_det_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr --rec_img_h=48 --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--rec_model_dir:./inference/ch_PP-OCRv3_rec_infer/
......
===========================paddle2onnx_params===========================
model_name:ch_PP-OCRv3
python:python3.7
2onnx: paddle2onnx
--det_model_dir:./inference/ch_PP-OCRv3_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:./inference/det_v3_onnx/model.onnx
--rec_model_dir:./inference/ch_PP-OCRv3_rec_infer/
--rec_save_file:./inference/rec_v3_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_system.py --rec_image_shape="3,48,320"
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
\ No newline at end of file
===========================cpp_infer_params===========================
model_name:ch_PP-OCRv3_det
use_opencv:True
infer_model:./inference/ch_PP-OCRv3_det_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
===========================paddle2onnx_params===========================
model_name:ch_PP-OCRv3_det
python:python3.7
2onnx: paddle2onnx
--det_model_dir:./inference/ch_PP-OCRv3_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:./inference/det_v3_onnx/model.onnx
--rec_model_dir:
--rec_save_file:
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_det.py
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
\ No newline at end of file
===========================train_params===========================
model_name:ch_PP-OCRv3_det
python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:fp32
Global.epoch_num:lite_train_lite_infer=1|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=2|whole_train_whole_infer=4
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
===========================train_params===========================
model_name:ch_PP-OCRv3_det_PACT
python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:fp32
Global.epoch_num:lite_train_lite_infer=1|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=1|whole_train_whole_infer=4
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
===========================cpp_infer_params===========================
model_name:ch_PP-OCRv3_rec
use_opencv:True
infer_model:./inference/ch_PP-OCRv3_rec_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr --rec_img_h=48 --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference/
null:null
......
===========================paddle2onnx_params===========================
model_name:ch_PP-OCRv3_rec
python:python3.7
2onnx: paddle2onnx
--det_model_dir:
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:
--rec_model_dir:./inference/ch_PP-OCRv3_rec_infer/
--rec_save_file:./inference/rec_v3_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_rec.py --rec_image_shape="3,48,320"
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/rec_inference/
\ No newline at end of file
===========================train_params===========================
model_name:ch_PP-OCRv3_rec
python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:fp32
Global.epoch_num:lite_train_lite_infer=3|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=16|whole_train_whole_infer=128
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_rec.py --rec_image_shape="3,48,320"
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
null:null
......
===========================train_params===========================
model_name:ch_PP-OCRv3_rec_PACT
python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:fp32
Global.epoch_num:lite_train_lite_infer=6|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=16|whole_train_whole_infer=128
Global.pretrained_model:pretrain_models/ch_PP-OCRv3_rec_train/best_accuracy
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:True
inference:tools/infer/predict_rec.py --rec_image_shape="3,48,320"
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
null:null
......
===========================cpp_infer_params===========================
model_name:ch_ppocr_mobile_v2.0
use_opencv:True
infer_model:./inference/ch_ppocr_mobile_v2.0_det_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--rec_model_dir:./inference/ch_ppocr_mobile_v2.0_rec_infer/
--benchmark:True
--det:True
--rec:True
--cls:False
--use_angle_cls:False
\ No newline at end of file
===========================paddle2onnx_params===========================
model_name:ch_ppocr_mobile_v2.0
python:python3.7
2onnx: paddle2onnx
--det_model_dir:./inference/ch_ppocr_mobile_v2.0_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:./inference/det_mobile_onnx/model.onnx
--rec_model_dir:./inference/ch_ppocr_mobile_v2.0_rec_infer/
--rec_save_file:./inference/rec_mobile_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_system.py --rec_image_shape="3,32,320"
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
\ No newline at end of file
===========================cpp_infer_params===========================
model_name:ch_ppocr_mobile_v2.0_det
use_opencv:True
infer_model:./inference/ch_ppocr_mobile_v2.0_det_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
===========================paddle2onnx_params===========================
model_name:ch_ppocr_mobile_v2.0_det
python:python3.7
2onnx: paddle2onnx
--det_model_dir:./inference/ch_ppocr_mobile_v2.0_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:./inference/det_mobile_onnx/model.onnx
--rec_model_dir:
--rec_save_file:
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_det.py
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
\ No newline at end of file
...@@ -4,7 +4,7 @@ python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:null
Global.epoch_num:lite_train_lite_infer=100|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=2|whole_train_whole_infer=4
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c configs/det/ch_ppocr_v2.0/ch_det_mv3_db_v2
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
...@@ -4,7 +4,7 @@ python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:null
Global.epoch_num:lite_train_lite_infer=5|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=2|whole_train_whole_infer=4
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
...@@ -4,7 +4,7 @@ python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:null
Global.epoch_num:lite_train_lite_infer=20|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=2|whole_train_whole_infer=4
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
===========================cpp_infer_params===========================
model_name:ch_ppocr_mobile_v2.0_rec
use_opencv:True
infer_model:./inference/ch_ppocr_mobile_v2.0_rec_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference/
null:null
--benchmark:True
--det:False
--rec:True
--cls:False
--use_angle_cls:False
\ No newline at end of file
===========================paddle2onnx_params===========================
model_name:ch_ppocr_mobile_v2.0_rec
python:python3.7
2onnx: paddle2onnx
--det_model_dir:
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:
--rec_model_dir:./inference/ch_ppocr_mobile_v2.0_rec_infer/
--rec_save_file:./inference/rec_mobile_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_rec.py --rec_image_shape="3,32,320"
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/rec_inference/
\ No newline at end of file
...@@ -4,7 +4,7 @@ python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:null
Global.epoch_num:lite_train_lite_infer=2|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=128|whole_train_whole_infer=128
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c configs/rec/rec_icdar15_train.yml -o
infer_quant:False
inference:tools/infer/predict_rec.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
...@@ -4,7 +4,7 @@ python:python3.7
gpu_list:0
Global.use_gpu:True|True
Global.auto_cast:null
Global.epoch_num:lite_train_lite_infer=1|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=128|whole_train_whole_infer=128
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_rec.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
null:null
......
...@@ -4,7 +4,7 @@ python:python3.7
gpu_list:0
Global.use_gpu:True|True
Global.auto_cast:null
Global.epoch_num:lite_train_lite_infer=1|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=128|whole_train_whole_infer=128
Global.checkpoints:null
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt --rec_image_shape="3,32,100"
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
===========================cpp_infer_params===========================
model_name:ch_ppocr_server_v2.0
use_opencv:True
infer_model:./inference/ch_ppocr_server_v2.0_det_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--rec_model_dir:./inference/ch_ppocr_server_v2.0_rec_infer/
--benchmark:True
--det:True
--rec:True
--cls:False
--use_angle_cls:False
\ No newline at end of file
===========================paddle2onnx_params===========================
model_name:ch_ppocr_server_v2.0
python:python3.7
2onnx: paddle2onnx
--det_model_dir:./inference/ch_ppocr_server_v2.0_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:./inference/det_server_onnx/model.onnx
--rec_model_dir:./inference/ch_ppocr_server_v2.0_rec_infer/
--rec_save_file:./inference/rec_server_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_system.py --rec_image_shape="3,32,320"
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
\ No newline at end of file
===========================cpp_infer_params===========================
model_name:ch_ppocr_server_v2.0_det
use_opencv:True
infer_model:./inference/ch_ppocr_server_v2.0_det_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
--benchmark:True
--det:True
--rec:False
--cls:False
--use_angle_cls:False
\ No newline at end of file
===========================paddle2onnx_params===========================
model_name:ch_ppocr_server_v2.0_det
python:python3.7
2onnx: paddle2onnx
--det_model_dir:./inference/ch_ppocr_server_v2.0_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:./inference/det_server_onnx/model.onnx
--rec_model_dir:
--rec_save_file:
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_det.py
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
\ No newline at end of file
...@@ -4,7 +4,7 @@ python:python3.7
gpu_list:0|0,1
Global.use_gpu:True|True
Global.auto_cast:null
Global.epoch_num:lite_train_lite_infer=2|whole_train_whole_infer=50
Global.save_model_dir:./output/
Train.loader.batch_size_per_card:lite_train_lite_infer=2|whole_train_lite_infer=4
Global.pretrained_model:null
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c configs/det/ch_ppocr_v2.0/ch_det_res18_db_
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--save_log_path:null
......
===========================cpp_infer_params===========================
model_name:ch_ppocr_server_v2.0_rec
use_opencv:True
infer_model:./inference/ch_ppocr_server_v2.0_rec_infer/
infer_quant:False
inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference/
null:null
--benchmark:True
--det:False
--rec:True
--cls:False
--use_angle_cls:False
\ No newline at end of file
===========================paddle2onnx_params===========================
model_name:ch_ppocr_server_v2.0_rec
python:python3.7
2onnx: paddle2onnx
--det_model_dir:
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--det_save_file:
--rec_model_dir:./inference/ch_ppocr_server_v2.0_rec_infer/
--rec_save_file:./inference/rec_server_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_rec.py --rec_image_shape="3,32,320"
--use_gpu:True|False
--det_model_dir:
--rec_model_dir:
--image_dir:./inference/rec_inference/
\ No newline at end of file
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/ch_ppocr_server_v2.0_rec
infer_quant:False
inference:tools/infer/predict_rec.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c configs/det/det_mv3_db.yml -o
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/det_mv3_east_v2.0/det_mv
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--save_log_path:null
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/det_mv3_pse_v2.0/det_mv3
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--save_log_path:null
......
...@@ -39,11 +39,11 @@ infer_export:null
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--save_log_path:null
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c configs/det/det_r50_vd_db.yml -o
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--save_log_path:null
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/det_r50_vd_east_v2_0/det
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--save_log_path:null
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/det_r50_vd_pse_v2_0/det_
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
--save_log_path:null
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/det_r50_vd_sast_icdar15_
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/det_r50_vd_sast_totaltex
infer_quant:False
inference:tools/infer/predict_det.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c configs/e2e/e2e_r50_vd_pg.yml -o
infer_quant:False
inference:tools/infer/predict_e2e.py
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1
--use_tensorrt:False
--precision:fp32
--e2e_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
null:null
......
...@@ -39,10 +39,10 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_mtb_nrtr/rec_mtb_nrt
infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/EN_symbol_dict.txt --rec_image_shape="1,32,100" --rec_algorithm="NRTR"
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_mv3_none_bilstm_ctc_
infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="3,32,100"
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_mv3_none_none_ctc_v2
infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="3,32,100"
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_mv3_tps_bilstm_att_v
infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="3,32,100" --rec_algorithm="RARE" --min_subgraph_size=5
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_mv3_tps_bilstm_ctc_v
infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="3,32,100" --rec_algorithm="StarNet"
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
...@@ -38,12 +38,12 @@ train_model:./inference/rec_r31_sar_train/best_accuracy
infer_export:tools/export_model.py -c test_tipc/configs/rec_r31_sar/rec_r31_sar.yml -o
infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/dict90.txt --rec_image_shape="3,48,48,160" --rec_algorithm="SAR"
--use_gpu:True
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_r34_vd_none_bilstm_c
infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="3,32,100"
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_r34_vd_none_none_ctc
infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="3,32,100"
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_r34_vd_tps_bilstm_at
infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="3,32,100" --rec_algorithm="RARE" --min_subgraph_size=5
--use_gpu:True|False
--enable_mkldnn:False
--cpu_threads:6
--rec_batch_num:1|6
--use_tensorrt:False
--precision:fp32
--rec_model_dir:
--image_dir:./inference/rec_inference
--save_log_path:./test/output/
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_r34_vd_tps_bilstm_ct ...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_r34_vd_tps_bilstm_ct
infer_quant:False infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="3,32,100" --rec_algorithm="StarNet" inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="3,32,100" --rec_algorithm="StarNet"
--use_gpu:True|False --use_gpu:True|False
--enable_mkldnn:True|False --enable_mkldnn:False
--cpu_threads:1|6 --cpu_threads:6
--rec_batch_num:1|6 --rec_batch_num:1|6
--use_tensorrt:True|False --use_tensorrt:False
--precision:fp32|int8 --precision:fp32
--rec_model_dir: --rec_model_dir:
--image_dir:./inference/rec_inference --image_dir:./inference/rec_inference
--save_log_path:./test/output/ --save_log_path:./test/output/
......
...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_r50_fpn_vd_none_srn/ ...@@ -39,11 +39,11 @@ infer_export:tools/export_model.py -c test_tipc/configs/rec_r50_fpn_vd_none_srn/
infer_quant:False infer_quant:False
inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="1,64,256" --rec_algorithm="SRN" --use_space_char=False --min_subgraph_size=3 inference:tools/infer/predict_rec.py --rec_char_dict_path=./ppocr/utils/ic15_dict.txt --rec_image_shape="1,64,256" --rec_algorithm="SRN" --use_space_char=False --min_subgraph_size=3
--use_gpu:True|False --use_gpu:True|False
--enable_mkldnn:True|False --enable_mkldnn:False
--cpu_threads:1|6 --cpu_threads:6
--rec_batch_num:1|6 --rec_batch_num:1|6
--use_tensorrt:True|False --use_tensorrt:False
--precision:fp32|int8 --precision:fp32
--rec_model_dir: --rec_model_dir:
--image_dir:./inference/rec_inference --image_dir:./inference/rec_inference
--save_log_path:./test/output/ --save_log_path:./test/output/
......
...@@ -44,11 +44,11 @@ if [ ${MODE} = "lite_train_lite_infer" ];then ...@@ -44,11 +44,11 @@ if [ ${MODE} = "lite_train_lite_infer" ];then
# pretrain lite train data # pretrain lite train data
wget -nc -P ./pretrain_models/ https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/MobileNetV3_large_x0_5_pretrained.pdparams --no-check-certificate wget -nc -P ./pretrain_models/ https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/MobileNetV3_large_x0_5_pretrained.pdparams --no-check-certificate
wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/en/det_mv3_db_v2.0_train.tar --no-check-certificate wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/en/det_mv3_db_v2.0_train.tar --no-check-certificate
if [[ ${model_name} =~ "PPOCRv2_det" ]];then if [[ ${model_name} =~ "ch_PP-OCRv2_det" ]];then
wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_distill_train.tar --no-check-certificate wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_distill_train.tar --no-check-certificate
cd ./pretrain_models/ && tar xf ch_PP-OCRv2_det_distill_train.tar && cd ../ cd ./pretrain_models/ && tar xf ch_PP-OCRv2_det_distill_train.tar && cd ../
fi fi
if [[ ${model_name} =~ "PPOCRv3_det" ]];then if [[ ${model_name} =~ "ch_PP-OCRv3_det" ]];then
wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_distill_train.tar --no-check-certificate wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_distill_train.tar --no-check-certificate
cd ./pretrain_models/ && tar xf ch_PP-OCRv3_det_distill_train.tar && cd ../ cd ./pretrain_models/ && tar xf ch_PP-OCRv3_det_distill_train.tar && cd ../
fi fi
...@@ -64,15 +64,15 @@ if [ ${MODE} = "lite_train_lite_infer" ];then ...@@ -64,15 +64,15 @@ if [ ${MODE} = "lite_train_lite_infer" ];then
ln -s ./icdar2015_lite ./icdar2015 ln -s ./icdar2015_lite ./icdar2015
cd ../ cd ../
cd ./inference && tar xf rec_inference.tar && cd ../ cd ./inference && tar xf rec_inference.tar && cd ../
if [ ${model_name} == "ch_PPOCRv2_det" ] || [ ${model_name} == "ch_PPOCRv2_det_PACT" ]; then if [ ${model_name} == "ch_PP-OCRv2_det" ] || [ ${model_name} == "ch_PP-OCRv2_det_PACT" ]; then
wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_det_train.tar --no-check-certificate wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_det_train.tar --no-check-certificate
cd ./pretrain_models/ && tar xf ch_ppocr_server_v2.0_det_train.tar && cd ../ cd ./pretrain_models/ && tar xf ch_ppocr_server_v2.0_det_train.tar && cd ../
fi fi
if [ ${model_name} == "ch_PPOCRv2_rec" ] || [ ${model_name} == "ch_PPOCRv2_rec_PACT" ]; then if [ ${model_name} == "ch_PP-OCRv2_rec" ] || [ ${model_name} == "ch_PP-OCRv2_rec_PACT" ]; then
wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_train.tar --no-check-certificate wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_train.tar --no-check-certificate
cd ./pretrain_models/ && tar xf ch_PP-OCRv2_rec_train.tar && cd ../ cd ./pretrain_models/ && tar xf ch_PP-OCRv2_rec_train.tar && cd ../
fi fi
if [ ${model_name} == "ch_PPOCRv3_rec" ] || [ ${model_name} == "ch_PPOCRv3_rec_PACT" ]; then if [ ${model_name} == "ch_PP-OCRv3_rec" ] || [ ${model_name} == "ch_PP-OCRv3_rec_PACT" ]; then
wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_train.tar --no-check-certificate wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_train.tar --no-check-certificate
cd ./pretrain_models/ && tar xf ch_PP-OCRv3_rec_train.tar && cd ../ cd ./pretrain_models/ && tar xf ch_PP-OCRv3_rec_train.tar && cd ../
fi fi
...@@ -115,11 +115,11 @@ elif [ ${MODE} = "whole_train_whole_infer" ];then ...@@ -115,11 +115,11 @@ elif [ ${MODE} = "whole_train_whole_infer" ];then
wget -nc -P ./train_data/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/icdar2015.tar --no-check-certificate wget -nc -P ./train_data/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/icdar2015.tar --no-check-certificate
wget -nc -P ./train_data/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ic15_data.tar --no-check-certificate wget -nc -P ./train_data/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ic15_data.tar --no-check-certificate
cd ./train_data/ && tar xf icdar2015.tar && tar xf ic15_data.tar && cd ../ cd ./train_data/ && tar xf icdar2015.tar && tar xf ic15_data.tar && cd ../
if [ ${model_name} == "ch_PPOCRv2_det" ]; then if [ ${model_name} == "ch_PP-OCRv2_det" ]; then
wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_distill_train.tar --no-check-certificate wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_distill_train.tar --no-check-certificate
cd ./pretrain_models/ && tar xf ch_PP-OCRv2_det_distill_train.tar && cd ../ cd ./pretrain_models/ && tar xf ch_PP-OCRv2_det_distill_train.tar && cd ../
fi fi
if [ ${model_name} == "ch_PPOCRv3_det" ]; then if [ ${model_name} == "ch_PP-OCRv3_det" ]; then
wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_distill_train.tar --no-check-certificate wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_distill_train.tar --no-check-certificate
cd ./pretrain_models/ && tar xf ch_PP-OCRv3_det_distill_train.tar && cd ../ cd ./pretrain_models/ && tar xf ch_PP-OCRv3_det_distill_train.tar && cd ../
fi fi
...@@ -143,11 +143,11 @@ elif [ ${MODE} = "lite_train_whole_infer" ];then ...@@ -143,11 +143,11 @@ elif [ ${MODE} = "lite_train_whole_infer" ];then
cd ./train_data/ && tar xf icdar2015_infer.tar && tar xf ic15_data.tar cd ./train_data/ && tar xf icdar2015_infer.tar && tar xf ic15_data.tar
ln -s ./icdar2015_infer ./icdar2015 ln -s ./icdar2015_infer ./icdar2015
cd ../ cd ../
if [ ${model_name} == "ch_PPOCRv2_det" ]; then if [ ${model_name} == "ch_PP-OCRv2_det" ]; then
wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_distill_train.tar --no-check-certificate wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_distill_train.tar --no-check-certificate
cd ./pretrain_models/ && tar xf ch_PP-OCRv2_det_distill_train.tar && cd ../ cd ./pretrain_models/ && tar xf ch_PP-OCRv2_det_distill_train.tar && cd ../
fi fi
if [ ${model_name} == "ch_PPOCRv3_det" ]; then if [ ${model_name} == "ch_PP-OCRv3_det" ]; then
wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_distill_train.tar --no-check-certificate wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_distill_train.tar --no-check-certificate
cd ./pretrain_models/ && tar xf ch_PP-OCRv3_det_distill_train.tar && cd ../ cd ./pretrain_models/ && tar xf ch_PP-OCRv3_det_distill_train.tar && cd ../
fi fi
...@@ -195,29 +195,29 @@ elif [ ${MODE} = "whole_infer" ];then ...@@ -195,29 +195,29 @@ elif [ ${MODE} = "whole_infer" ];then
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv3_det_infer.tar && tar xf ch_PP-OCRv3_rec_infer.tar && tar xf ch_det_data_50.tar && cd ../ cd ./inference && tar xf ch_PP-OCRv3_det_infer.tar && tar xf ch_PP-OCRv3_rec_infer.tar && tar xf ch_det_data_50.tar && cd ../
fi fi
if [[ ${model_name} =~ "ch_PPOCRv2_det" ]]; then if [[ ${model_name} =~ "ch_PP-OCRv2_det" ]]; then
eval_model_name="ch_PP-OCRv2_det_infer" eval_model_name="ch_PP-OCRv2_det_infer"
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_infer.tar --no-check-certificate wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_infer.tar --no-check-certificate
cd ./inference && tar xf ${eval_model_name}.tar && tar xf ch_det_data_50.tar && cd ../ cd ./inference && tar xf ${eval_model_name}.tar && tar xf ch_det_data_50.tar && cd ../
fi fi
if [[ ${model_name} =~ "ch_PPOCRv3_det" ]]; then if [[ ${model_name} =~ "ch_PP-OCRv3_det" ]]; then
eval_model_name="ch_PP-OCRv3_det_infer" eval_model_name="ch_PP-OCRv3_det_infer"
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate
cd ./inference && tar xf ${eval_model_name}.tar && tar xf ch_det_data_50.tar && cd ../ cd ./inference && tar xf ${eval_model_name}.tar && tar xf ch_det_data_50.tar && cd ../
fi fi
if [[ ${model_name} =~ "PPOCRv2_ocr_rec" ]]; then if [[ ${model_name} =~ "ch_PP-OCRv2_rec" ]]; then
eval_model_name="ch_PP-OCRv2_rec_infer" eval_model_name="ch_PP-OCRv2_rec_infer"
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_infer.tar --no-check-certificate wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_infer.tar --no-check-certificate
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_slim_quant_infer.tar --no-check-certificate wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_slim_quant_infer.tar --no-check-certificate
cd ./inference && tar xf ${eval_model_name}.tar && tar xf ch_PP-OCRv2_rec_slim_quant_infer.tar && cd ../ cd ./inference && tar xf ${eval_model_name}.tar && tar xf ch_PP-OCRv2_rec_slim_quant_infer.tar && cd ../
fi fi
if [[ ${model_name} =~ "PPOCRv3_ocr_rec" ]]; then if [[ ${model_name} =~ "ch_PP-OCRv3_rec" ]]; then
eval_model_name="ch_PP-OCRv3_rec_infer" eval_model_name="ch_PP-OCRv3_rec_infer"
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_slim_infer.tar --no-check-certificate wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_slim_infer.tar --no-check-certificate
cd ./inference && tar xf ${eval_model_name}.tar && tar xf ch_PP-OCRv3_rec_slim_infer.tar && cd ../ cd ./inference && tar xf ${eval_model_name}.tar && tar xf ch_PP-OCRv3_rec_slim_infer.tar && cd ../
fi fi
if [[ ${model_name} == "ch_PPOCRv3_rec_PACT" ]]; then if [[ ${model_name} == "ch_PP-OCRv3_rec_PACT" ]]; then
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_slim_infer.tar --no-check-certificate wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_slim_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv3_rec_slim_infer.tar && cd ../ cd ./inference && tar xf ch_PP-OCRv3_rec_slim_infer.tar && cd ../
fi fi
...@@ -320,14 +320,14 @@ if [ ${MODE} = "klquant_whole_infer" ]; then ...@@ -320,14 +320,14 @@ if [ ${MODE} = "klquant_whole_infer" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && tar xf ch_det_data_50.tar && cd ../ cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && tar xf ch_det_data_50.tar && cd ../
fi fi
if [ ${model_name} = "PPOCRv2_ocr_rec_kl" ]; then if [ ${model_name} = "ch_PP-OCRv2_rec_KL" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_infer.tar --no-check-certificate
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate
wget -nc -P ./train_data/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ic15_data.tar --no-check-certificate wget -nc -P ./train_data/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ic15_data.tar --no-check-certificate
cd ./train_data/ && tar xf ic15_data.tar && cd ../ cd ./train_data/ && tar xf ic15_data.tar && cd ../
cd ./inference && tar xf rec_inference.tar && tar xf ch_PP-OCRv2_rec_infer.tar && cd ../ cd ./inference && tar xf rec_inference.tar && tar xf ch_PP-OCRv2_rec_infer.tar && cd ../
fi fi
if [ ${model_name} = "PPOCRv3_ocr_rec_kl" ]; then if [ ${model_name} = "ch_PP-OCRv3_rec_KL" ]; then
# TODO check model link # TODO check model link
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate
...@@ -335,12 +335,12 @@ if [ ${MODE} = "klquant_whole_infer" ]; then ...@@ -335,12 +335,12 @@ if [ ${MODE} = "klquant_whole_infer" ]; then
cd ./train_data/ && tar xf ic15_data.tar && cd ../ cd ./train_data/ && tar xf ic15_data.tar && cd ../
cd ./inference && tar xf rec_inference.tar && tar xf ch_PP-OCRv3_rec_infer.tar && cd ../ cd ./inference && tar xf rec_inference.tar && tar xf ch_PP-OCRv3_rec_infer.tar && cd ../
fi fi
if [ ${model_name} = "PPOCRv2_ocr_det_kl" ]; then if [ ${model_name} = "ch_PP-OCRv2_det_KL" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv2_det_infer.tar && tar xf ch_det_data_50.tar && cd ../ cd ./inference && tar xf ch_PP-OCRv2_det_infer.tar && tar xf ch_det_data_50.tar && cd ../
fi fi
if [ ${model_name} = "PPOCRv3_ocr_det_kl" ]; then if [ ${model_name} = "ch_PP-OCRv3_det_KL" ]; then
# TODO check model link # TODO check model link
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate
...@@ -356,28 +356,54 @@ if [ ${MODE} = "klquant_whole_infer" ]; then ...@@ -356,28 +356,54 @@ if [ ${MODE} = "klquant_whole_infer" ]; then
fi fi
if [ ${MODE} = "cpp_infer" ];then if [ ${MODE} = "cpp_infer" ];then
if [ ${model_name} = "ocr_det" ]; then if [ ${model_name} = "ch_ppocr_mobile_v2.0_det" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar --no-check-certificate
cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && tar xf ch_det_data_50.tar && cd ../ cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && tar xf ch_det_data_50.tar && cd ../
elif [ ${model_name} = "ocr_det_v3" ]; then elif [ ${model_name} = "ch_ppocr_mobile_v2.0_rec" ]; then
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_ppocr_mobile_v2.0_rec_infer.tar && tar xf rec_inference.tar && cd ../
elif [ ${model_name} = "ch_ppocr_server_v2.0_det" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_det_infer.tar --no-check-certificate
cd ./inference && tar xf ch_ppocr_server_v2.0_det_infer.tar && tar xf ch_det_data_50.tar && cd ../
elif [ ${model_name} = "ch_ppocr_server_v2.0_rec" ]; then
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_ppocr_server_v2.0_rec_infer.tar && tar xf rec_inference.tar && cd ../
elif [ ${model_name} = "ch_PP-OCRv2_det" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv2_det_infer.tar && tar xf ch_det_data_50.tar && cd ../
elif [ ${model_name} = "ch_PP-OCRv2_rec" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv2_rec_infer.tar && tar xf rec_inference.tar && cd ../
elif [ ${model_name} = "ch_PP-OCRv3_det" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv3_det_infer.tar && tar xf ch_det_data_50.tar && cd ../ cd ./inference && tar xf ch_PP-OCRv3_det_infer.tar && tar xf ch_det_data_50.tar && cd ../
elif [ ${model_name} = "ocr_rec_v3" ]; then elif [ ${model_name} = "ch_PP-OCRv3_rec" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv3_rec_infer.tar && tar xf rec_inference.tar && cd ../ cd ./inference && tar xf ch_PP-OCRv3_rec_infer.tar && tar xf rec_inference.tar && cd ../
elif [ ${model_name} = "ch_ppocr_mobile_v2.0_rec" ]; then elif [ ${model_name} = "ch_ppocr_mobile_v2.0" ]; then
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_ppocr_mobile_v2.0_rec_infer.tar && tar xf rec_inference.tar && cd ../
elif [ ${model_name} = "ocr_system" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_rec_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && tar xf ch_ppocr_mobile_v2.0_rec_infer.tar && tar xf ch_det_data_50.tar && cd ../ cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && tar xf ch_ppocr_mobile_v2.0_rec_infer.tar && tar xf ch_det_data_50.tar && cd ../
elif [ ${model_name} = "ocr_system_v3" ]; then elif [ ${model_name} = "ch_ppocr_server_v2.0" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_ppocr_server_v2.0_det_infer.tar && tar xf ch_ppocr_server_v2.0_rec_infer.tar && tar xf ch_det_data_50.tar && cd ../
elif [ ${model_name} = "ch_PP-OCRv2" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv2_det_infer.tar && tar xf ch_PP-OCRv2_rec_infer.tar && tar xf ch_det_data_50.tar && cd ../
elif [ ${model_name} = "ch_PP-OCRv3" ]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate
...@@ -391,18 +417,30 @@ if [ ${MODE} = "serving_infer" ];then ...@@ -391,18 +417,30 @@ if [ ${MODE} = "serving_infer" ];then
IFS='|' IFS='|'
array=(${python_name_list}) array=(${python_name_list})
python_name=${array[0]} python_name=${array[0]}
${python_name} -m pip install paddle-serving-server-gpu==0.8.3.post101 # ${python_name} -m pip install paddle-serving-server-gpu==0.8.3.post101
${python_name} -m pip install paddle_serving_client==0.8.3 # ${python_name} -m pip install paddle_serving_client==0.8.3
${python_name} -m pip install paddle-serving-app==0.8.3 # ${python_name} -m pip install paddle-serving-app==0.8.3
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar # wget model
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_rec_infer.tar if [[ ${model_name} =~ "ch_ppocr_mobile_v2.0" ]]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_det_infer.tar wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_rec_infer.tar wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_rec_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && tar xf ch_ppocr_mobile_v2.0_rec_infer.tar && cd ../
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar elif [[ ${model_name} =~ "ch_ppocr_server_v2.0" ]]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_ppocr_server_v2.0_det_infer.tar && tar xf ch_ppocr_server_v2.0_rec_infer.tar && cd ../
elif [[ ${model_name} =~ "ch_PP-OCRv2" ]]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv2_det_infer.tar && tar xf ch_PP-OCRv2_rec_infer.tar && cd ../
elif [[ ${model_name} =~ "ch_PP-OCRv3" ]]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv3_det_infer.tar && tar xf ch_PP-OCRv3_rec_infer.tar && cd ../
fi
# wget data
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv3_det_infer.tar && tar xf ch_PP-OCRv3_rec_infer.tar && tar xf rec_inference.tar && cd ../ cd ./inference && tar xf rec_inference.tar && cd ../
cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && tar xf ch_ppocr_mobile_v2.0_rec_infer.tar && tar xf ch_ppocr_server_v2.0_rec_infer.tar && tar xf ch_ppocr_server_v2.0_det_infer.tar && cd ../
fi fi
if [ ${MODE} = "paddle2onnx_infer" ];then if [ ${MODE} = "paddle2onnx_infer" ];then
...@@ -411,16 +449,27 @@ if [ ${MODE} = "paddle2onnx_infer" ];then ...@@ -411,16 +449,27 @@ if [ ${MODE} = "paddle2onnx_infer" ];then
${python_name} -m pip install paddle2onnx ${python_name} -m pip install paddle2onnx
${python_name} -m pip install onnxruntime==1.4.0 ${python_name} -m pip install onnxruntime==1.4.0
# wget model # wget model
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar if [[ ${model_name} =~ "ch_ppocr_mobile_v2.0" ]]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_rec_infer.tar wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_det_infer.tar wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_rec_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_rec_infer.tar cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && tar xf ch_ppocr_mobile_v2.0_rec_infer.tar && cd ../
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar elif [[ ${model_name} =~ "ch_ppocr_server_v2.0" ]]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_server_v2.0_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_ppocr_server_v2.0_det_infer.tar && tar xf ch_ppocr_server_v2.0_rec_infer.tar && cd ../
elif [[ ${model_name} =~ "ch_PP-OCRv2" ]]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv2_det_infer.tar && tar xf ch_PP-OCRv2_rec_infer.tar && cd ../
elif [[ ${model_name} =~ "ch_PP-OCRv3" ]]; then
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate
cd ./inference && tar xf ch_PP-OCRv3_det_infer.tar && tar xf ch_PP-OCRv3_rec_infer.tar && cd ../ cd ./inference && tar xf ch_PP-OCRv3_det_infer.tar && tar xf ch_PP-OCRv3_rec_infer.tar && cd ../
fi
# wget data # wget data
wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar
wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar
cd ./inference && tar xf ch_ppocr_mobile_v2.0_det_infer.tar && tar xf ch_ppocr_mobile_v2.0_rec_infer.tar && tar xf ch_ppocr_server_v2.0_rec_infer.tar && tar xf ch_ppocr_server_v2.0_det_infer.tar && tar xf ch_det_data_50.tar && tar xf rec_inference.tar && cd ../ cd ./inference && tar xf ch_det_data_50.tar && tar xf rec_inference.tar && cd ../
fi fi
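The if/elif branches above select which inference models and test data to download based on `${model_name}` and `${MODE}`. A hedged usage example, assuming the standard TIPC entry point; the config filename below is illustrative and not taken from this diff:

```bash
# Hypothetical invocation: the config path is a placeholder.
bash test_tipc/prepare.sh \
    test_tipc/configs/ch_PP-OCRv2/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt \
    "paddle2onnx_infer"
# prepare.sh reads model_name from the config, matches it against the branches above,
# and downloads only the ch_PP-OCRv2 det/rec inference models plus the shared test data.
```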
...@@ -43,7 +43,7 @@ cpp_cls_value=$(func_parser_value "${lines[18]}") ...@@ -43,7 +43,7 @@ cpp_cls_value=$(func_parser_value "${lines[18]}")
cpp_use_angle_cls_key=$(func_parser_key "${lines[19]}") cpp_use_angle_cls_key=$(func_parser_key "${lines[19]}")
cpp_use_angle_cls_value=$(func_parser_value "${lines[19]}") cpp_use_angle_cls_value=$(func_parser_value "${lines[19]}")
LOG_PATH="./test_tipc/output/cpp_infer/${model_name}" LOG_PATH="./test_tipc/output/${model_name}/cpp_infer"
mkdir -p ${LOG_PATH} mkdir -p ${LOG_PATH}
status_log="${LOG_PATH}/results_cpp.log" status_log="${LOG_PATH}/results_cpp.log"
...@@ -84,7 +84,7 @@ function func_cpp_inference(){ ...@@ -84,7 +84,7 @@ function func_cpp_inference(){
eval $command eval $command
last_status=${PIPESTATUS[0]} last_status=${PIPESTATUS[0]}
eval "cat ${_save_log_path}" eval "cat ${_save_log_path}"
status_check $last_status "${command}" "${status_log}" status_check $last_status "${command}" "${status_log}" "${model_name}"
done done
done done
done done
...@@ -117,7 +117,7 @@ function func_cpp_inference(){ ...@@ -117,7 +117,7 @@ function func_cpp_inference(){
eval $command eval $command
last_status=${PIPESTATUS[0]} last_status=${PIPESTATUS[0]}
eval "cat ${_save_log_path}" eval "cat ${_save_log_path}"
status_check $last_status "${command}" "${status_log}" status_check $last_status "${command}" "${status_log}" "${model_name}"
done done
done done
...@@ -178,6 +178,19 @@ if [ ${use_opencv} = "True" ]; then ...@@ -178,6 +178,19 @@ if [ ${use_opencv} = "True" ]; then
else else
OPENCV_DIR='' OPENCV_DIR=''
fi fi
if [ -d "paddle_inference/" ] ;then
echo "################### download paddle inference skipped ###################"
else
echo "################### download paddle inference ###################"
PADDLEInfer=$3
if [ "" = "$PADDLEInfer" ];then
wget -nc https://paddle-inference-lib.bj.bcebos.com/2.3.0/cxx_c/Linux/GPU/x86-64_gcc8.2_avx_mkl_cuda10.1_cudnn7.6.5_trt6.0.1.5/paddle_inference.tgz --no-check-certificate
else
wget -nc $PADDLEInfer --no-check-certificate
fi
tar zxf paddle_inference.tgz
echo "################### download paddle inference finished ###################"
fi
LIB_DIR=$(pwd)/paddle_inference/ LIB_DIR=$(pwd)/paddle_inference/
CUDA_LIB_DIR=$(dirname `find /usr -name libcudart.so`) CUDA_LIB_DIR=$(dirname `find /usr -name libcudart.so`)
CUDNN_LIB_DIR=$(dirname `find /usr -name libcudnn.so`) CUDNN_LIB_DIR=$(dirname `find /usr -name libcudnn.so`)
......
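The new block above fetches the prebuilt Paddle Inference C++ library unless a `paddle_inference/` directory already exists, and lets the caller override the default tarball URL through the third positional argument (`PADDLEInfer=$3`). A hypothetical invocation; the script path and the first two arguments are placeholders:

```bash
# Placeholder call: only the third argument is meaningful here. When set, it replaces
# the default paddle_inference.tgz URL hard-coded in the block above.
bash test_tipc/test_inference_cpp.sh <config_file.txt> <mode> \
    https://example.com/custom/paddle_inference.tgz
```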
...@@ -11,7 +11,7 @@ python=$(func_parser_value "${lines[2]}") ...@@ -11,7 +11,7 @@ python=$(func_parser_value "${lines[2]}")
# parser params # parser params
dataline=$(awk 'NR==1, NR==14{print}' $FILENAME) dataline=$(awk 'NR==1, NR==17{print}' $FILENAME)
IFS=$'\n' IFS=$'\n'
lines=(${dataline}) lines=(${dataline})
...@@ -19,27 +19,32 @@ lines=(${dataline}) ...@@ -19,27 +19,32 @@ lines=(${dataline})
model_name=$(func_parser_value "${lines[1]}") model_name=$(func_parser_value "${lines[1]}")
python=$(func_parser_value "${lines[2]}") python=$(func_parser_value "${lines[2]}")
padlle2onnx_cmd=$(func_parser_value "${lines[3]}") padlle2onnx_cmd=$(func_parser_value "${lines[3]}")
infer_model_dir_key=$(func_parser_key "${lines[4]}") det_infer_model_dir_key=$(func_parser_key "${lines[4]}")
infer_model_dir_value=$(func_parser_value "${lines[4]}") det_infer_model_dir_value=$(func_parser_value "${lines[4]}")
model_filename_key=$(func_parser_key "${lines[5]}") model_filename_key=$(func_parser_key "${lines[5]}")
model_filename_value=$(func_parser_value "${lines[5]}") model_filename_value=$(func_parser_value "${lines[5]}")
params_filename_key=$(func_parser_key "${lines[6]}") params_filename_key=$(func_parser_key "${lines[6]}")
params_filename_value=$(func_parser_value "${lines[6]}") params_filename_value=$(func_parser_value "${lines[6]}")
save_file_key=$(func_parser_key "${lines[7]}") det_save_file_key=$(func_parser_key "${lines[7]}")
save_file_value=$(func_parser_value "${lines[7]}") det_save_file_value=$(func_parser_value "${lines[7]}")
opset_version_key=$(func_parser_key "${lines[8]}") rec_infer_model_dir_key=$(func_parser_key "${lines[8]}")
opset_version_value=$(func_parser_value "${lines[8]}") rec_infer_model_dir_value=$(func_parser_value "${lines[8]}")
enable_onnx_checker_key=$(func_parser_key "${lines[9]}") rec_save_file_key=$(func_parser_key "${lines[9]}")
enable_onnx_checker_value=$(func_parser_value "${lines[9]}") rec_save_file_value=$(func_parser_value "${lines[9]}")
opset_version_key=$(func_parser_key "${lines[10]}")
opset_version_value=$(func_parser_value "${lines[10]}")
enable_onnx_checker_key=$(func_parser_key "${lines[11]}")
enable_onnx_checker_value=$(func_parser_value "${lines[11]}")
# parser onnx inference # parser onnx inference
inference_py=$(func_parser_value "${lines[10]}") inference_py=$(func_parser_value "${lines[12]}")
use_gpu_key=$(func_parser_key "${lines[11]}") use_gpu_key=$(func_parser_key "${lines[13]}")
use_gpu_list=$(func_parser_value "${lines[11]}") use_gpu_list=$(func_parser_value "${lines[13]}")
det_model_key=$(func_parser_key "${lines[12]}") det_model_key=$(func_parser_key "${lines[14]}")
image_dir_key=$(func_parser_key "${lines[13]}") rec_model_key=$(func_parser_key "${lines[15]}")
image_dir_value=$(func_parser_value "${lines[13]}") image_dir_key=$(func_parser_key "${lines[16]}")
image_dir_value=$(func_parser_value "${lines[16]}")
LOG_PATH="./test_tipc/output/paddle2onnx/${model_name}" LOG_PATH="./test_tipc/output/${model_name}/paddle2onnx"
mkdir -p ${LOG_PATH} mkdir -p ${LOG_PATH}
status_log="${LOG_PATH}/results_paddle2onnx.log" status_log="${LOG_PATH}/results_paddle2onnx.log"
...@@ -49,38 +54,95 @@ function func_paddle2onnx(){ ...@@ -49,38 +54,95 @@ function func_paddle2onnx(){
_script=$1 _script=$1
# paddle2onnx # paddle2onnx
set_dirname=$(func_set_params "${infer_model_dir_key}" "${infer_model_dir_value}") if [ ${model_name} = "ch_PP-OCRv2" ] || [ ${model_name} = "ch_PP-OCRv3" ] || [ ${model_name} = "ch_ppocr_mobile_v2.0" ] || [ ${model_name} = "ch_ppocr_server_v2.0" ]; then
# trans det
set_dirname=$(func_set_params "--model_dir" "${det_infer_model_dir_value}")
set_model_filename=$(func_set_params "${model_filename_key}" "${model_filename_value}") set_model_filename=$(func_set_params "${model_filename_key}" "${model_filename_value}")
set_params_filename=$(func_set_params "${params_filename_key}" "${params_filename_value}") set_params_filename=$(func_set_params "${params_filename_key}" "${params_filename_value}")
set_save_model=$(func_set_params "${save_file_key}" "${save_file_value}") set_save_model=$(func_set_params "--save_file" "${det_save_file_value}")
set_opset_version=$(func_set_params "${opset_version_key}" "${opset_version_value}") set_opset_version=$(func_set_params "${opset_version_key}" "${opset_version_value}")
set_enable_onnx_checker=$(func_set_params "${enable_onnx_checker_key}" "${enable_onnx_checker_value}") set_enable_onnx_checker=$(func_set_params "${enable_onnx_checker_key}" "${enable_onnx_checker_value}")
trans_model_cmd="${padlle2onnx_cmd} ${set_dirname} ${set_model_filename} ${set_params_filename} ${set_save_model} ${set_opset_version} ${set_enable_onnx_checker}" trans_model_cmd="${padlle2onnx_cmd} ${set_dirname} ${set_model_filename} ${set_params_filename} ${set_save_model} ${set_opset_version} ${set_enable_onnx_checker}"
eval $trans_model_cmd eval $trans_model_cmd
last_status=${PIPESTATUS[0]} last_status=${PIPESTATUS[0]}
status_check $last_status "${trans_model_cmd}" "${status_log}" status_check $last_status "${trans_model_cmd}" "${status_log}" "${model_name}"
# trans rec
set_dirname=$(func_set_params "--model_dir" "${rec_infer_model_dir_value}")
set_model_filename=$(func_set_params "${model_filename_key}" "${model_filename_value}")
set_params_filename=$(func_set_params "${params_filename_key}" "${params_filename_value}")
set_save_model=$(func_set_params "--save_file" "${rec_save_file_value}")
set_opset_version=$(func_set_params "${opset_version_key}" "${opset_version_value}")
set_enable_onnx_checker=$(func_set_params "${enable_onnx_checker_key}" "${enable_onnx_checker_value}")
trans_model_cmd="${padlle2onnx_cmd} ${set_dirname} ${set_model_filename} ${set_params_filename} ${set_save_model} ${set_opset_version} ${set_enable_onnx_checker}"
eval $trans_model_cmd
last_status=${PIPESTATUS[0]}
status_check $last_status "${trans_model_cmd}" "${status_log}" "${model_name}"
elif [ ${model_name} = "ch_PP-OCRv2_det" ] || [ ${model_name} = "ch_PP-OCRv3_det" ] || [ ${model_name} = "ch_ppocr_mobile_v2.0_det" ] || [ ${model_name} = "ch_ppocr_server_v2.0_det" ]; then
# trans det
set_dirname=$(func_set_params "--model_dir" "${det_infer_model_dir_value}")
set_model_filename=$(func_set_params "${model_filename_key}" "${model_filename_value}")
set_params_filename=$(func_set_params "${params_filename_key}" "${params_filename_value}")
set_save_model=$(func_set_params "--save_file" "${det_save_file_value}")
set_opset_version=$(func_set_params "${opset_version_key}" "${opset_version_value}")
set_enable_onnx_checker=$(func_set_params "${enable_onnx_checker_key}" "${enable_onnx_checker_value}")
trans_model_cmd="${padlle2onnx_cmd} ${set_dirname} ${set_model_filename} ${set_params_filename} ${set_save_model} ${set_opset_version} ${set_enable_onnx_checker}"
eval $trans_model_cmd
last_status=${PIPESTATUS[0]}
status_check $last_status "${trans_model_cmd}" "${status_log}" "${model_name}"
elif [ ${model_name} = "ch_PP-OCRv2_rec" ] || [ ${model_name} = "ch_PP-OCRv3_rec" ] || [ ${model_name} = "ch_ppocr_mobile_v2.0_rec" ] || [ ${model_name} = "ch_ppocr_server_v2.0_rec" ]; then
# trans rec
set_dirname=$(func_set_params "--model_dir" "${rec_infer_model_dir_value}")
set_model_filename=$(func_set_params "${model_filename_key}" "${model_filename_value}")
set_params_filename=$(func_set_params "${params_filename_key}" "${params_filename_value}")
set_save_model=$(func_set_params "--save_file" "${rec_save_file_value}")
set_opset_version=$(func_set_params "${opset_version_key}" "${opset_version_value}")
set_enable_onnx_checker=$(func_set_params "${enable_onnx_checker_key}" "${enable_onnx_checker_value}")
trans_model_cmd="${padlle2onnx_cmd} ${set_dirname} ${set_model_filename} ${set_params_filename} ${set_save_model} ${set_opset_version} ${set_enable_onnx_checker}"
eval $trans_model_cmd
last_status=${PIPESTATUS[0]}
status_check $last_status "${trans_model_cmd}" "${status_log}" "${model_name}"
fi
# python inference # python inference
for use_gpu in ${use_gpu_list[*]}; do for use_gpu in ${use_gpu_list[*]}; do
if [ ${use_gpu} = "False" ] || [ ${use_gpu} = "cpu" ]; then if [ ${use_gpu} = "False" ] || [ ${use_gpu} = "cpu" ]; then
_save_log_path="${LOG_PATH}/paddle2onnx_infer_cpu.log" _save_log_path="${LOG_PATH}/paddle2onnx_infer_cpu.log"
set_gpu=$(func_set_params "${use_gpu_key}" "${use_gpu}") set_gpu=$(func_set_params "${use_gpu_key}" "${use_gpu}")
set_model_dir=$(func_set_params "${det_model_key}" "${save_file_value}")
set_img_dir=$(func_set_params "${image_dir_key}" "${image_dir_value}") set_img_dir=$(func_set_params "${image_dir_key}" "${image_dir_value}")
infer_model_cmd="${python} ${inference_py} ${set_gpu} ${set_img_dir} ${set_model_dir} --use_onnx=True > ${_save_log_path} 2>&1 " if [ ${model_name} = "ch_PP-OCRv2" ] || [ ${model_name} = "ch_PP-OCRv3" ] || [ ${model_name} = "ch_ppocr_mobile_v2.0" ] || [ ${model_name} = "ch_ppocr_server_v2.0" ]; then
set_det_model_dir=$(func_set_params "${det_model_key}" "${det_save_file_value}")
set_rec_model_dir=$(func_set_params "${rec_model_key}" "${rec_save_file_value}")
infer_model_cmd="${python} ${inference_py} ${set_gpu} ${set_img_dir} ${set_det_model_dir} ${set_rec_model_dir} --use_onnx=True > ${_save_log_path} 2>&1 "
elif [ ${model_name} = "ch_PP-OCRv2_det" ] || [ ${model_name} = "ch_PP-OCRv3_det" ] || [ ${model_name} = "ch_ppocr_mobile_v2.0_det" ] || [ ${model_name} = "ch_ppocr_server_v2.0_det" ]; then
set_det_model_dir=$(func_set_params "${det_model_key}" "${det_save_file_value}")
infer_model_cmd="${python} ${inference_py} ${set_gpu} ${set_img_dir} ${set_det_model_dir} --use_onnx=True > ${_save_log_path} 2>&1 "
elif [ ${model_name} = "ch_PP-OCRv2_rec" ] || [ ${model_name} = "ch_PP-OCRv3_rec" ] || [ ${model_name} = "ch_ppocr_mobile_v2.0_rec" ] || [ ${model_name} = "ch_ppocr_server_v2.0_rec" ]; then
set_rec_model_dir=$(func_set_params "${rec_model_key}" "${rec_save_file_value}")
infer_model_cmd="${python} ${inference_py} ${set_gpu} ${set_img_dir} ${set_rec_model_dir} --use_onnx=True > ${_save_log_path} 2>&1 "
fi
eval $infer_model_cmd eval $infer_model_cmd
last_status=${PIPESTATUS[0]} last_status=${PIPESTATUS[0]}
eval "cat ${_save_log_path}" eval "cat ${_save_log_path}"
status_check $last_status "${infer_model_cmd}" "${status_log}" status_check $last_status "${infer_model_cmd}" "${status_log}" "${model_name}"
elif [ ${use_gpu} = "True" ] || [ ${use_gpu} = "gpu" ]; then elif [ ${use_gpu} = "True" ] || [ ${use_gpu} = "gpu" ]; then
_save_log_path="${LOG_PATH}/paddle2onnx_infer_gpu.log" _save_log_path="${LOG_PATH}/paddle2onnx_infer_gpu.log"
set_gpu=$(func_set_params "${use_gpu_key}" "${use_gpu}") set_gpu=$(func_set_params "${use_gpu_key}" "${use_gpu}")
set_model_dir=$(func_set_params "${det_model_key}" "${save_file_value}")
set_img_dir=$(func_set_params "${image_dir_key}" "${image_dir_value}") set_img_dir=$(func_set_params "${image_dir_key}" "${image_dir_value}")
infer_model_cmd="${python} ${inference_py} ${set_gpu} ${set_img_dir} ${set_model_dir} --use_onnx=True > ${_save_log_path} 2>&1 " if [ ${model_name} = "ch_PP-OCRv2" ] || [ ${model_name} = "ch_PP-OCRv3" ] || [ ${model_name} = "ch_ppocr_mobile_v2.0" ] || [ ${model_name} = "ch_ppocr_server_v2.0" ]; then
set_det_model_dir=$(func_set_params "${det_model_key}" "${det_save_file_value}")
set_rec_model_dir=$(func_set_params "${rec_model_key}" "${rec_save_file_value}")
infer_model_cmd="${python} ${inference_py} ${set_gpu} ${set_img_dir} ${set_det_model_dir} ${set_rec_model_dir} --use_onnx=True > ${_save_log_path} 2>&1 "
elif [ ${model_name} = "ch_PP-OCRv2_det" ] || [ ${model_name} = "ch_PP-OCRv3_det" ] || [ ${model_name} = "ch_ppocr_mobile_v2.0_det" ] || [ ${model_name} = "ch_ppocr_server_v2.0_det" ]; then
set_det_model_dir=$(func_set_params "${det_model_key}" "${det_save_file_value}")
infer_model_cmd="${python} ${inference_py} ${set_gpu} ${set_img_dir} ${set_det_model_dir} --use_onnx=True > ${_save_log_path} 2>&1 "
elif [ ${model_name} = "ch_PP-OCRv2_rec" ] || [ ${model_name} = "ch_PP-OCRv3_rec" ] || [ ${model_name} = "ch_ppocr_mobile_v2.0_rec" ] || [ ${model_name} = "ch_ppocr_server_v2.0_rec" ]; then
set_rec_model_dir=$(func_set_params "${rec_model_key}" "${rec_save_file_value}")
infer_model_cmd="${python} ${inference_py} ${set_gpu} ${set_img_dir} ${set_rec_model_dir} --use_onnx=True > ${_save_log_path} 2>&1 "
fi
eval $infer_model_cmd eval $infer_model_cmd
last_status=${PIPESTATUS[0]} last_status=${PIPESTATUS[0]}
eval "cat ${_save_log_path}" eval "cat ${_save_log_path}"
status_check $last_status "${infer_model_cmd}" "${status_log}" status_check $last_status "${infer_model_cmd}" "${status_log}" "${model_name}"
else else
echo "Does not support hardware other than CPU and GPU Currently!" echo "Does not support hardware other than CPU and GPU Currently!"
fi fi
......
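Once the parsed keys and values are substituted, `trans_model_cmd` expands to a plain `paddle2onnx` call. An illustrative expansion for the detection branch; the directories are placeholders rather than values taken from this diff:

```bash
# Illustrative expansion of trans_model_cmd for a detection model.
paddle2onnx \
    --model_dir=./inference/SOME_det_infer/ \
    --model_filename=inference.pdmodel \
    --params_filename=inference.pdiparams \
    --save_file=./inference/SOME_det_onnx/model.onnx \
    --opset_version=10 \
    --enable_onnx_checker=True
```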
...@@ -142,9 +142,9 @@ function func_inference(){ ...@@ -142,9 +142,9 @@ function func_inference(){
for use_gpu in ${use_gpu_list[*]}; do for use_gpu in ${use_gpu_list[*]}; do
if [ ${use_gpu} = "False" ] || [ ${use_gpu} = "cpu" ]; then if [ ${use_gpu} = "False" ] || [ ${use_gpu} = "cpu" ]; then
for use_mkldnn in ${use_mkldnn_list[*]}; do for use_mkldnn in ${use_mkldnn_list[*]}; do
if [ ${use_mkldnn} = "False" ] && [ ${_flag_quant} = "True" ]; then # if [ ${use_mkldnn} = "False" ] && [ ${_flag_quant} = "True" ]; then
continue # continue
fi # fi
for threads in ${cpu_threads_list[*]}; do for threads in ${cpu_threads_list[*]}; do
for batch_size in ${batch_size_list[*]}; do for batch_size in ${batch_size_list[*]}; do
for precision in ${precision_list[*]}; do for precision in ${precision_list[*]}; do
...@@ -169,7 +169,7 @@ function func_inference(){ ...@@ -169,7 +169,7 @@ function func_inference(){
eval $command eval $command
last_status=${PIPESTATUS[0]} last_status=${PIPESTATUS[0]}
eval "cat ${_save_log_path}" eval "cat ${_save_log_path}"
status_check $last_status "${command}" "${status_log}" status_check $last_status "${command}" "${status_log}" "${model_name}"
done done
done done
done done
...@@ -200,7 +200,7 @@ function func_inference(){ ...@@ -200,7 +200,7 @@ function func_inference(){
eval $command eval $command
last_status=${PIPESTATUS[0]} last_status=${PIPESTATUS[0]}
eval "cat ${_save_log_path}" eval "cat ${_save_log_path}"
status_check $last_status "${command}" "${status_log}" status_check $last_status "${command}" "${status_log}" "${model_name}"
done done
done done
...@@ -240,7 +240,7 @@ if [ ${MODE} = "whole_infer" ] || [ ${MODE} = "klquant_whole_infer" ]; then ...@@ -240,7 +240,7 @@ if [ ${MODE} = "whole_infer" ] || [ ${MODE} = "klquant_whole_infer" ]; then
echo $export_cmd echo $export_cmd
eval $export_cmd eval $export_cmd
status_export=$? status_export=$?
status_check $status_export "${export_cmd}" "${status_log}" status_check $status_export "${export_cmd}" "${status_log}" "${model_name}"
else else
save_infer_dir=${infer_model} save_infer_dir=${infer_model}
fi fi
...@@ -337,7 +337,7 @@ else ...@@ -337,7 +337,7 @@ else
fi fi
# run train # run train
eval $cmd eval $cmd
status_check $? "${cmd}" "${status_log}" status_check $? "${cmd}" "${status_log}" "${model_name}"
set_eval_pretrain=$(func_set_params "${pretrain_model_key}" "${save_log}/${train_model_name}") set_eval_pretrain=$(func_set_params "${pretrain_model_key}" "${save_log}/${train_model_name}")
...@@ -347,7 +347,7 @@ else ...@@ -347,7 +347,7 @@ else
set_eval_params1=$(func_set_params "${eval_key1}" "${eval_value1}") set_eval_params1=$(func_set_params "${eval_key1}" "${eval_value1}")
eval_cmd="${python} ${eval_py} ${set_eval_pretrain} ${set_use_gpu} ${set_eval_params1}" eval_cmd="${python} ${eval_py} ${set_eval_pretrain} ${set_use_gpu} ${set_eval_params1}"
eval $eval_cmd eval $eval_cmd
status_check $? "${eval_cmd}" "${status_log}" status_check $? "${eval_cmd}" "${status_log}" "${model_name}"
fi fi
# run export model # run export model
if [ ${run_export} != "null" ]; then if [ ${run_export} != "null" ]; then
...@@ -357,7 +357,7 @@ else ...@@ -357,7 +357,7 @@ else
set_save_infer_key=$(func_set_params "${save_infer_key}" "${save_infer_path}") set_save_infer_key=$(func_set_params "${save_infer_key}" "${save_infer_path}")
export_cmd="${python} ${run_export} ${set_export_weight} ${set_save_infer_key}" export_cmd="${python} ${run_export} ${set_export_weight} ${set_save_infer_key}"
eval $export_cmd eval $export_cmd
status_check $? "${export_cmd}" "${status_log}" status_check $? "${export_cmd}" "${status_log}" "${model_name}"
#run inference #run inference
eval $env eval $env
......
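Most hunks in this change append `"${model_name}"` as a fourth argument to `status_check`. A minimal sketch of a compatible implementation, assuming the helper only tags the pass/fail line with the model name; the real function is defined in the shared TIPC helper script and is not part of this diff:

```bash
# Sketch only: status_check(exit_code, command, log_file, model_name).
function status_check(){
    last_status=$1    # exit code of the command that was just run
    run_command=$2    # the command string, recorded for debugging
    run_log=$3        # path of the results_*.log file
    model_name=$4     # fourth argument added throughout this diff
    if [ ${last_status} -eq 0 ]; then
        echo -e "\033[33m Run successfully with command - ${model_name} - ${run_command}! \033[0m" | tee -a ${run_log}
    else
        echo -e "\033[33m Run failed with command - ${model_name} - ${run_command}! \033[0m" | tee -a ${run_log}
    fi
}
```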