weixin_41840029 / PaddleOCR
Forked from PaddlePaddle / PaddleOCR
Commit c01e526f
Authored on Mar 10, 2022 by tink2123

fix tipc serving doc

Parent: 96e5cd51

6 changed files with 72 additions and 19 deletions (+72 −19)
- deploy/pdserving/README.md (+6 −6)
- deploy/pdserving/README_CN.md (+7 −7)
- deploy/pdserving/config.yml (+2 −2)
- deploy/pdserving/web_service_det.py (+52 −1)
- deploy/pdserving/web_service_rec.py (+3 −1)
- test_tipc/docs/test_serving.md (+2 −2)
deploy/pdserving/README.md

````diff
@@ -78,27 +78,27 @@ Then, you can use installed paddle_serving_client tool to convert inference mode
 python3 -m paddle_serving_client.convert --dirname ./ch_PP-OCRv2_det_infer/ \
                                          --model_filename inference.pdmodel \
                                          --params_filename inference.pdiparams \
-                                         --serving_server ./ppocrv2_det_serving/ \
-                                         --serving_client ./ppocrv2_det_client/
+                                         --serving_server ./ppocr_det_mobile_2.0_serving/ \
+                                         --serving_client ./ppocr_det_mobile_2.0_client/

 # Recognition model conversion
 python3 -m paddle_serving_client.convert --dirname ./ch_PP-OCRv2_rec_infer/ \
                                          --model_filename inference.pdmodel \
                                          --params_filename inference.pdiparams \
-                                         --serving_server ./ppocrv2_rec_serving/ \
-                                         --serving_client ./ppocrv2_rec_client/
+                                         --serving_server ./ppocr_rec_mobile_2.0_serving/ \
+                                         --serving_client ./ppocr_rec_mobile_2.0_client/
 ```
 After the detection model is converted, there will be additional folders of `ppocr_det_mobile_2.0_serving` and `ppocr_det_mobile_2.0_client` in the current folder, with the following format:
 ```
-|- ppocrv2_det_serving/
+|- ppocr_det_mobile_2.0_serving/
   |- __model__
   |- __params__
   |- serving_server_conf.prototxt
   |- serving_server_conf.stream.prototxt
-|- ppocrv2_det_client
+|- ppocr_det_mobile_2.0_client
   |- serving_client_conf.prototxt
   |- serving_client_conf.stream.prototxt
````
deploy/pdserving/README_CN.md

````diff
@@ -75,26 +75,26 @@ wget https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_rec_infer.tar
 python3 -m paddle_serving_client.convert --dirname ./ch_PP-OCRv2_det_infer/ \
                                          --model_filename inference.pdmodel \
                                          --params_filename inference.pdiparams \
-                                         --serving_server ./ppocrv2_det_serving/ \
-                                         --serving_client ./ppocrv2_det_client/
+                                         --serving_server ./ppocr_det_mobile_2.0_serving/ \
+                                         --serving_client ./ppocr_det_mobile_2.0_client/

 # Convert the recognition model
 python3 -m paddle_serving_client.convert --dirname ./ch_PP-OCRv2_rec_infer/ \
                                          --model_filename inference.pdmodel \
                                          --params_filename inference.pdiparams \
-                                         --serving_server ./ppocrv2_rec_serving/ \
-                                         --serving_client ./ppocrv2_rec_client/
+                                         --serving_server ./ppocr_rec_mobile_2.0_serving/ \
+                                         --serving_client ./ppocr_rec_mobile_2.0_client/
 ```
-After the detection model is converted, the folders `ppocrv2_det_serving` and `ppocrv2_det_client` will appear in the current directory, with the following layout:
+After the detection model is converted, the folders `ppocr_det_mobile_2.0_serving` and `ppocr_det_mobile_2.0_client` will appear in the current directory, with the following layout:
 ```
-|- ppocrv2_det_serving/
+|- ppocr_det_mobile_2.0_serving/
   |- __model__
   |- __params__
   |- serving_server_conf.prototxt
   |- serving_server_conf.stream.prototxt
-|- ppocrv2_det_client
+|- ppocr_det_mobile_2.0_client
   |- serving_client_conf.prototxt
   |- serving_client_conf.stream.prototxt
````
deploy/pdserving/config.yml

```diff
@@ -34,7 +34,7 @@ op:
         client_type: local_predictor
         # det model path
-        model_config: ./ppocrv2_det_serving
+        model_config: ./ppocr_det_mobile_2.0_serving
         # Fetch result list; uses the alias_name of fetch_var in client_config
         fetch_list: ["save_infer_model/scale_0.tmp_1"]
@@ -60,7 +60,7 @@ op:
         client_type: local_predictor
         # rec model path
-        model_config: ./ppocrv2_rec_serving
+        model_config: ./ppocr_rec_mobile_2.0_serving
         # Fetch result list; uses the alias_name of fetch_var in client_config
         fetch_list: ["save_infer_model/scale_0.tmp_1"]
```
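A minimal sketch (assuming PyYAML is installed) of how the two updated `model_config` keys read back once `config.yml` is loaded; the exact nesting under `op/<name>/local_service_conf` is assumed from the diff context, not quoted from the full file.

```python
# Hypothetical fragment mirroring the changed config.yml keys; only the
# two model_config values below come from this commit's diff.
import yaml

snippet = """
op:
    det:
        local_service_conf:
            client_type: local_predictor
            model_config: ./ppocr_det_mobile_2.0_serving
    rec:
        local_service_conf:
            client_type: local_predictor
            model_config: ./ppocr_rec_mobile_2.0_serving
"""

cfg = yaml.safe_load(snippet)
# Each op resolves its serving model directory from its local_service_conf.
print(cfg["op"]["det"]["local_service_conf"]["model_config"])
print(cfg["op"]["rec"]["local_service_conf"]["model_config"])
```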
deploy/pdserving/web_service_det.py

```diff
@@ -23,8 +23,58 @@ from paddle_serving_app.reader import Sequential, ResizeByFactor
 from paddle_serving_app.reader import Div, Normalize, Transpose
 from paddle_serving_app.reader import DBPostProcess, FilterBoxes, GetRotateCropImage, SortedBoxes
+import yaml
+from argparse import ArgumentParser, RawDescriptionHelpFormatter

 _LOGGER = logging.getLogger()


+class ArgsParser(ArgumentParser):
+    def __init__(self):
+        super(ArgsParser, self).__init__(
+            formatter_class=RawDescriptionHelpFormatter)
+        self.add_argument("-c", "--config", help="configuration file to use")
+        self.add_argument(
+            "-o", "--opt", nargs='+', help="set configuration options")
+
+    def parse_args(self, argv=None):
+        args = super(ArgsParser, self).parse_args(argv)
+        assert args.config is not None, \
+            "Please specify --config=configure_file_path."
+        args.conf_dict = self._parse_opt(args.opt, args.config)
+        return args
+
+    def _parse_helper(self, v):
+        if v.isnumeric():
+            if "." in v:
+                v = float(v)
+            else:
+                v = int(v)
+        elif v == "True" or v == "False":
+            v = (v == "True")
+        return v
+
+    def _parse_opt(self, opts, conf_path):
+        f = open(conf_path)
+        config = yaml.load(f, Loader=yaml.Loader)
+        if not opts:
+            return config
+        for s in opts:
+            s = s.strip()
+            k, v = s.split('=')
+            v = self._parse_helper(v)
+            print(k, v, type(v))
+            cur = config
+            parent = cur
+            for kk in k.split("."):
+                if kk not in cur:
+                    cur[kk] = {}
+                    parent = cur
+                    cur = cur[kk]
+                else:
+                    parent = cur
+                    cur = cur[kk]
+            parent[k.split(".")[-1]] = v
+        return config
+
+
 class DetOp(Op):
     def init_op(self):
@@ -73,5 +123,6 @@ class OcrService(WebService):
 uci_service = OcrService(name="ocr")
-uci_service.prepare_pipeline_config("config.yml")
+FLAGS = ArgsParser().parse_args()
+uci_service.prepare_pipeline_config(yml_dict=FLAGS.conf_dict)
 uci_service.run_service()
```
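The core of the new `ArgsParser._parse_opt` is a dotted-key override: each `-o section.key=value` option walks the nested config dict and replaces the leaf. A minimal standalone sketch of that walk (the dict below is a hypothetical fragment, not the full `config.yml`, and value coercion is omitted, so values stay strings):

```python
def apply_override(config, opt):
    """Set config[a][b]...[z] = value for an override "a.b...z=value"."""
    k, v = opt.strip().split("=")
    cur = config
    parent = cur
    for kk in k.split("."):
        if kk not in cur:
            cur[kk] = {}          # create intermediate levels on demand
        parent = cur
        cur = cur[kk]
    parent[k.split(".")[-1]] = v  # replace the leaf under its parent dict
    return config

cfg = {"op": {"det": {"local_service_conf": {"model_config": "./old_dir"}}}}
apply_override(cfg, "op.det.local_service_conf.model_config=./ppocr_det_mobile_2.0_serving")
print(cfg["op"]["det"]["local_service_conf"]["model_config"])
```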
deploy/pdserving/web_service_rec.py

```diff
@@ -21,6 +21,7 @@ import base64
 from ocr_reader import OCRReader, DetResizeForTest
 from paddle_serving_app.reader import Sequential, ResizeByFactor
 from paddle_serving_app.reader import Div, Normalize, Transpose
+from web_service_det import ArgsParser

 _LOGGER = logging.getLogger()
@@ -82,5 +83,6 @@ class OcrService(WebService):
 uci_service = OcrService(name="ocr")
-uci_service.prepare_pipeline_config("config.yml")
+FLAGS = ArgsParser().parse_args()
+uci_service.prepare_pipeline_config(yml_dict=FLAGS.conf_dict)
 uci_service.run_service()
```
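Both services now coerce `-o key=value` strings through the shared `_parse_helper`. A sketch of that coercion (re-implemented here for illustration): purely-numeric strings become `int`, `"True"`/`"False"` become `bool`, and everything else stays a string.

```python
def parse_value(v):
    if v.isnumeric():
        # str.isnumeric() is False for strings containing ".", so values
        # such as "0.5" fall through this branch and remain strings
        return float(v) if "." in v else int(v)
    if v in ("True", "False"):
        return v == "True"
    return v

print(parse_value("8080"), parse_value("False"), parse_value("0.5"))
```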
test_tipc/docs/test_serving.md

````diff
@@ -20,10 +20,10 @@ The main program of the PaddleServing prediction test is `test_serving.sh`, which can test
 First run `prepare.sh` to prepare the data and models, then run `test_serving.sh` to test; log files with the suffix `serving_infer_*.log` are generated under the ```test_tipc/output``` directory.

 ```shell
-bash test_tipc/prepare.sh ./test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt "serving_infer"
+bash test_tipc/prepare.sh ./test_tipc/configs/ch_ppocr_mobile_v2.0_det/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt "serving_infer"
 # Usage:
-bash test_tipc/test_serving.sh ./test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt
+bash test_tipc/test_serving.sh ./test_tipc/configs/ch_ppocr_mobile_v2.0_det/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt
 ```

 #### Run results
````