diff --git a/deploy/hubserving/ocr_rec/params.py b/deploy/hubserving/ocr_rec/params.py
index 58a8bc119e2a54ad78446bd616eeb7a9089a6084..4c7666cf670a158f9365f9f05f81fef9bf40d44c 100644
--- a/deploy/hubserving/ocr_rec/params.py
+++ b/deploy/hubserving/ocr_rec/params.py
@@ -38,6 +38,14 @@ def read_params():
     cfg.rec_char_dict_path = "./ppocr/utils/ppocr_keys_v1.txt"
     cfg.use_space_char = True
 
+    #params for text classifier
+    cfg.use_angle_cls = True
+    cfg.cls_model_dir = "./inference/ch_ppocr_mobile_v1.1_cls_infer/"
+    cfg.cls_image_shape = "3, 48, 192"
+    cfg.label_list = ['0', '180']
+    cfg.cls_batch_num = 30
+    cfg.cls_thresh = 0.9
+
     cfg.use_zero_copy_run = False
 
     return cfg
diff --git a/deploy/hubserving/ocr_system/params.py b/deploy/hubserving/ocr_system/params.py
index d83fe692dca7c94c7225a1aa26e782765e665bdd..da96c9cf59a9d17cb0631cdb3aa5a22b49ebf71b 100644
--- a/deploy/hubserving/ocr_system/params.py
+++ b/deploy/hubserving/ocr_system/params.py
@@ -39,11 +39,12 @@ def read_params():
     cfg.use_space_char = True
 
     #params for text classifier
-    cfg.use_angle_cls = False
-    cfg.cls_model_dir = "./inference/ch_ppocr_mobile-v1.1.cls_infer/"
+    cfg.use_angle_cls = True
+    cfg.cls_model_dir = "./inference/ch_ppocr_mobile_v1.1_cls_infer/"
     cfg.cls_image_shape = "3, 48, 192"
     cfg.label_list = ['0', '180']
     cfg.cls_batch_num = 30
+    cfg.cls_thresh = 0.9
 
     cfg.use_zero_copy_run = False
 
diff --git a/deploy/hubserving/readme.md b/deploy/hubserving/readme.md
index 5d29b432ba3d4c098872431c9b5fde13f553eee0..c7068d91b42f1e809190c0e44bc6ac97443c9348 100644
--- a/deploy/hubserving/readme.md
+++ b/deploy/hubserving/readme.md
@@ -38,8 +38,12 @@ SET PYTHONPATH=.
 ```
 
 ### 2. 下载推理模型
-安装服务模块前,需要准备推理模型并放到正确路径。默认使用的是v1.1版的超轻量模型,默认检测模型路径为:
-`./inference/ch_ppocr_mobile_v1.1_det_infer/`,识别模型路径为:`./inference/ch_ppocr_mobile_v1.1_rec_infer/`。
+安装服务模块前,需要准备推理模型并放到正确路径。默认使用的是v1.1版的超轻量模型,默认模型路径为:
+```
+检测模型:./inference/ch_ppocr_mobile_v1.1_det_infer/
+识别模型:./inference/ch_ppocr_mobile_v1.1_rec_infer/
+方向分类器:./inference/ch_ppocr_mobile_v1.1_cls_infer/
+```
 
 **模型路径可在`params.py`中查看和修改。** 更多模型可以从PaddleOCR提供的[模型库](../../doc/doc_ch/models_list.md)下载,也可以替换成自己训练转换好的模型。
 
@@ -173,7 +177,7 @@ hub serving start -c deploy/hubserving/ocr_system/config.json
 ```hub serving stop --port/-p XXXX```
 
 - 2、 到相应的`module.py`和`params.py`等文件中根据实际需求修改代码。
-例如,如果需要替换部署服务所用模型,则需要到`params.py`中修改模型路径参数`det_model_dir`和`rec_model_dir`,当然,同时可能还需要修改其他相关参数,请根据实际情况修改调试。 **强烈建议修改后先直接运行`module.py`调试,能正确运行预测后再启动服务测试。**
+例如,如果需要替换部署服务所用模型,则需要到`params.py`中修改模型路径参数`det_model_dir`和`rec_model_dir`,如果需要关闭文本方向分类器,则将参数`use_angle_cls`置为`False`,当然,同时可能还需要修改其他相关参数,请根据实际情况修改调试。 **强烈建议修改后先直接运行`module.py`调试,能正确运行预测后再启动服务测试。**
 
 - 3、 卸载旧服务包
 ```hub uninstall ocr_system```
diff --git a/deploy/hubserving/readme_en.md b/deploy/hubserving/readme_en.md
index efef1cda6dd5a91d6ad2f7db27061418fa24e105..48b519ae24199d3778dcf2d2a1dea785aa0503a4 100644
--- a/deploy/hubserving/readme_en.md
+++ b/deploy/hubserving/readme_en.md
@@ -39,7 +39,12 @@ SET PYTHONPATH=.
 ```
 
 ### 2. Download inference model
-Before installing the service module, you need to prepare the inference model and put it in the correct path. By default, the ultra lightweight model of v1.1 is used, and the default detection model path is: `./inference/ch_ppocr_mobile_v1.1_det_infer/`, the default recognition model path is: `./inference/ch_ppocr_mobile_v1.1_rec_infer/`.
+Before installing the service module, you need to prepare the inference model and put it in the correct path. By default, the ultra lightweight model of v1.1 is used, and the default model path is:
+```
+detection model: ./inference/ch_ppocr_mobile_v1.1_det_infer/
+recognition model: ./inference/ch_ppocr_mobile_v1.1_rec_infer/
+text direction classifier: ./inference/ch_ppocr_mobile_v1.1_cls_infer/
+```
 
 **The model path can be found and modified in `params.py`.** More models provided by PaddleOCR can be obtained from the [model library](../../doc/doc_en/models_list_en.md). You can also use models trained by yourself.
 
@@ -180,7 +185,7 @@ If you need to modify the service logic, the following steps are generally requi
 hub serving stop --port/-p XXXX
 ```
 - 2. Modify the code in the corresponding files, like `module.py` and `params.py`, according to the actual needs.
-For example, if you need to replace the model used by the deployed service, you need to modify model path parameters `det_model_dir` and `rec_model_dir` in `params.py`. Of course, other related parameters may need to be modified at the same time. Please modify and debug according to the actual situation. It is suggested to run `module.py` directly for debugging after modification before starting the service test.
+For example, if you need to replace the model used by the deployed service, you need to modify model path parameters `det_model_dir` and `rec_model_dir` in `params.py`. If you want to turn off the text direction classifier, set the parameter `use_angle_cls` to `False`. Of course, other related parameters may need to be modified at the same time. Please modify and debug according to the actual situation. It is suggested to run `module.py` directly for debugging after modification before starting the service test.
 - 3. Uninstall old service module
 ```shell
 hub uninstall ocr_system
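For context, all of the classifier-related fields added above hang off the config object returned by `read_params()` in each module's `params.py`. The sketch below is not the real module code: it uses a plain `SimpleNamespace` as a stand-in for whatever config object `read_params()` actually builds, and only illustrates the defaults introduced by this patch plus the `use_angle_cls = False` override that the updated READMEs describe for deployments that do not need the text direction classifier.

```python
from types import SimpleNamespace  # illustrative stand-in for the real cfg object built in params.py


def classifier_defaults():
    """Collect the text-direction-classifier defaults added by this patch."""
    cfg = SimpleNamespace()

    # params for text classifier (mirrors the lines added to params.py)
    cfg.use_angle_cls = True
    cfg.cls_model_dir = "./inference/ch_ppocr_mobile_v1.1_cls_infer/"
    cfg.cls_image_shape = "3, 48, 192"
    cfg.label_list = ['0', '180']
    cfg.cls_batch_num = 30
    cfg.cls_thresh = 0.9
    return cfg


if __name__ == "__main__":
    cfg = classifier_defaults()
    # To deploy without orientation classification, flip the switch as the READMEs suggest:
    cfg.use_angle_cls = False
    print(cfg)
```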