PaddlePaddle / PaddleDetection, commit 195ebcf0 (unverified)
Authored on Jun 20, 2022 by shangliang Xu; committed via GitHub on Jun 20, 2022.
[TIPC] add onnx_infer shell, test=document_fix (#6225)
Parent: 997fec86
Showing 27 changed files with 838 additions and 87 deletions (+838 -87).
deploy/third_engine/onnx/infer.py  +11 -24
deploy/third_engine/onnx/preprocess.py  +75 -0
test_tipc/README.md  +1 -1
test_tipc/configs/keypoint/tinypose_128x96_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/mask_rcnn/mask_rcnn_r50_fpn_1x_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/picodet/picodet_l_640_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/picodet/picodet_lcnet_1_5x_416_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/picodet/picodet_m_416_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/picodet/picodet_mobilenetv3_large_1x_416_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/picodet/picodet_r18_640_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/picodet/picodet_s_320_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/picodet/picodet_shufflenetv2_1x_416_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyolo/ppyolo_mbv3_large_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyolo/ppyolo_mbv3_small_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyolo/ppyolo_r18vd_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyolo/ppyolo_r50vd_dcn_1x_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyolo/ppyolo_tiny_650e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyolo/ppyolov2_r101vd_dcn_365e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyolo/ppyolov2_r50vd_dcn_365e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyoloe/ppyoloe_crn_l_300e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyoloe/ppyoloe_crn_m_300e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyoloe/ppyoloe_crn_s_300e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/ppyoloe/ppyoloe_crn_x_300e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/configs/yolov3/yolov3_darknet53_270e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +30 -0
test_tipc/docs/test_paddle2onnx.md  +47 -0
test_tipc/prepare.sh  +3 -3
test_tipc/test_paddle2onnx.sh  +71 -59
deploy/third_engine/onnx/infer.py
@@ -23,25 +23,9 @@ from preprocess import Compose

The SUPPORT_MODELS set is reflowed into a compact literal and gains an 'HRNet' entry:

 # Global dictionary
 SUPPORT_MODELS = {
     'YOLO', 'RCNN', 'SSD', 'Face', 'FCOS', 'SOLOv2', 'TTFNet', 'S2ANet', 'JDE',
     'FairMOT', 'DeepSORT', 'GFL', 'PicoDet', 'CenterNet', 'TOOD', 'RetinaNet',
     'StrongBaseline', 'STGCN', 'YOLOX', 'HRNet'
 }

 parser = argparse.ArgumentParser(description=__doc__)

@@ -142,6 +126,9 @@ def predict_image(infer_config, predictor, img_list):

predict_image() now prints a banner and, for HRNet (keypoint) models, dumps the raw output instead of iterating over detection boxes:

         outputs = predictor.run(output_names=None, input_feed=inputs)
         print("ONNXRuntime predict: ")
         if infer_config.arch in ["HRNet"]:
             print(np.array(outputs[0]))
         else:
             bboxes = np.array(outputs[0])
             for bbox in bboxes:
                 if bbox[0] > -1 and bbox[1] > infer_config.draw_threshold:
...
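For readers following the TIPC flow, the ONNX Runtime call pattern that infer.py relies on is small enough to sketch in isolation. The snippet below is an illustration only, not part of the diff; the "model.onnx" path and the input names mentioned in the comments are assumptions about a typical PaddleDetection export.

```python
# Minimal sketch of the onnxruntime usage mirrored by infer.py (illustration only).
import onnxruntime as ort

# Session over a model exported by paddle2onnx (assumed path).
sess = ort.InferenceSession("model.onnx")

# Inspect the graph inputs; PaddleDetection exports typically expect an
# 'image' tensor (NCHW, float32) and, for most detectors, a 'scale_factor'.
for inp in sess.get_inputs():
    print(inp.name, inp.shape, inp.type)

# A preprocessed image would then be fed exactly as predict_image() does:
#   outputs = sess.run(None, {"image": img_nchw, "scale_factor": scale})
# For detectors, np.array(outputs[0]) holds rows of
# [class_id, score, x1, y1, x2, y2], which the draw_threshold check filters.
```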
deploy/third_engine/onnx/preprocess.py
@@ -399,6 +399,81 @@ class WarpAffine(object):

The hunk appends keypoint preprocessing helpers after WarpAffine:

         return inp, im_info


 # keypoint preprocess
 def get_warp_matrix(theta, size_input, size_dst, size_target):
     """This code is based on
         https://github.com/open-mmlab/mmpose/blob/master/mmpose/core/post_processing/post_transforms.py
         Calculate the transformation matrix under the constraint of unbiased.
     Paper ref: Huang et al. The Devil is in the Details: Delving into Unbiased
     Data Processing for Human Pose Estimation (CVPR 2020).

     Args:
         theta (float): Rotation angle in degrees.
         size_input (np.ndarray): Size of input image [w, h].
         size_dst (np.ndarray): Size of output image [w, h].
         size_target (np.ndarray): Size of ROI in input plane [w, h].

     Returns:
         matrix (np.ndarray): A matrix for transformation.
     """
     theta = np.deg2rad(theta)
     matrix = np.zeros((2, 3), dtype=np.float32)
     scale_x = size_dst[0] / size_target[0]
     scale_y = size_dst[1] / size_target[1]
     matrix[0, 0] = np.cos(theta) * scale_x
     matrix[0, 1] = -np.sin(theta) * scale_x
     matrix[0, 2] = scale_x * (-0.5 * size_input[0] * np.cos(theta) +
                               0.5 * size_input[1] * np.sin(theta) +
                               0.5 * size_target[0])
     matrix[1, 0] = np.sin(theta) * scale_y
     matrix[1, 1] = np.cos(theta) * scale_y
     matrix[1, 2] = scale_y * (-0.5 * size_input[0] * np.sin(theta) -
                               0.5 * size_input[1] * np.cos(theta) +
                               0.5 * size_target[1])
     return matrix


 class TopDownEvalAffine(object):
     """apply affine transform to image and coords

     Args:
         trainsize (list): [w, h], the standard size used to train
         use_udp (bool): whether to use Unbiased Data Processing.
         records(dict): the dict contained the image and coords

     Returns:
         records (dict): contain the image and coords after tranformed
     """

     def __init__(self, trainsize, use_udp=False):
         self.trainsize = trainsize
         self.use_udp = use_udp

     def __call__(self, image, im_info):
         rot = 0
         imshape = im_info['im_shape'][::-1]
         center = im_info['center'] if 'center' in im_info else imshape / 2.
         scale = im_info['scale'] if 'scale' in im_info else imshape
         if self.use_udp:
             trans = get_warp_matrix(
                 rot, center * 2.0,
                 [self.trainsize[0] - 1.0, self.trainsize[1] - 1.0], scale)
             image = cv2.warpAffine(
                 image,
                 trans, (int(self.trainsize[0]), int(self.trainsize[1])),
                 flags=cv2.INTER_LINEAR)
         else:
             trans = get_affine_transform(center, scale, rot, self.trainsize)
             image = cv2.warpAffine(
                 image,
                 trans, (int(self.trainsize[0]), int(self.trainsize[1])),
                 flags=cv2.INTER_LINEAR)
         return image, im_info


 class Compose:
     def __init__(self, transforms):
         self.transforms = []
...
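Because the new keypoint config (tinypose_128x96) exercises this path with ./demo/hrnet_demo.jpg, a small usage sketch of TopDownEvalAffine may help. It is an illustration only: it assumes the snippet is run from deploy/third_engine/onnx/ so that preprocess.py is importable, and it builds a minimal im_info dict by hand.

```python
# Illustration only: resize a demo image to the TinyPose training size with UDP.
import cv2
import numpy as np
from preprocess import TopDownEvalAffine  # the class added in this commit

img = cv2.imread("../../../demo/hrnet_demo.jpg")  # demo image used by the keypoint TIPC config (assumed path)
im_info = {"im_shape": np.array(img.shape[:2], dtype=np.float32)}  # [h, w]; center/scale fall back to image defaults

affine = TopDownEvalAffine(trainsize=[96, 128], use_udp=True)  # [w, h] for the 128x96 model
warped, _ = affine(img, im_info)
print(warped.shape)  # (128, 96, 3)
```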
test_tipc/README.md
@@ -109,4 +109,4 @@ bash test_tipc/test_train_inference_python.sh ./test_tipc/configs/yolov3/yolov3_

 - [test_inference_cpp usage](docs/test_inference_cpp.md): tests C++-based model inference.
 - [test_serving usage](docs/test_serving.md): tests service deployment based on Paddle Serving.
 - [test_lite_arm_cpu_cpp usage](./): tests Paddle-Lite C++ inference deployment on ARM CPU.
-- [test_paddle2onnx usage](./): tests Paddle2ONNX model conversion and verifies its correctness.
+- [test_paddle2onnx usage](docs/test_paddle2onnx.md): tests Paddle2ONNX model conversion and verifies its correctness.
test_tipc/configs/keypoint/tinypose_128x96_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:tinypose_128x96
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/keypoint/tinypose_128x96.pdparams
norm_export:tools/export_model.py -c configs/keypoint/tiny_pose/tinypose_128x96.yml -o
quant_export:tools/export_model.py -c configs/keypoint/tiny_pose/tinypose_128x96.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/keypoint/tiny_pose/tinypose_128x96.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/keypoint/tiny_pose/yolov3_darknet53_270e_coco.yml --slim_config configs/slim/post_quant/tinypose_128x96_ptq.yml -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/hrnet_demo.jpg
infer_param1:null
\ No newline at end of file
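These colon-separated entries are consumed by the shell helpers used later in test_paddle2onnx.sh (func_parser_key, func_parser_value, func_set_params). As a rough illustration of the idea only, here is a Python sketch, not the TIPC implementation, of how a config line maps to a command-line argument:

```python
# Hypothetical sketch of how a "key:value" TIPC config line becomes a CLI flag.
def parse_line(line):
    key, _, value = line.partition(":")    # split on the first ':' only
    return key.strip(), value.strip()

def set_param(key, value):
    if key.startswith("##") or value in ("null", ""):
        return ""                          # separators and null entries expand to nothing
    if key.startswith("--"):
        return f"{key}={value}"            # e.g. "--opset_version:11" -> "--opset_version=11"
    return value                           # plain entries (e.g. "cmd:paddle2onnx") keep just the value

print(set_param(*parse_line("--opset_version:11")))   # --opset_version=11
print(set_param(*parse_line("--model_dir:null")))     # (empty, filled in at runtime)
```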
test_tipc/configs/mask_rcnn/mask_rcnn_r50_fpn_1x_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:mask_rcnn_r50_fpn_1x_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/mask_rcnn_r50_fpn_1x_coco.pdparams
norm_export:tools/export_model.py -c configs/mask_rcnn/mask_rcnn_r50_fpn_1x_coco.yml -o
quant_export:tools/export_model.py -c configs/mask_rcnn/mask_rcnn_r50_fpn_1x_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/mask_rcnn/mask_rcnn_r50_fpn_1x_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_onnx:True
kl_quant_export:tools/post_quant.py -c configs/mask_rcnn/yolov3_darknet53_270e_coco.yml --slim_config configs/slim/post_quant/mask_rcnn_r50_fpn_1x_coco_ptq.yml -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:16
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/picodet/picodet_l_640_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:picodet_l_640_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/picodet_l_640_coco.pdparams
norm_export:tools/export_model.py -c configs/picodet/legacy_model/picodet_l_640_coco.yml -o
quant_export:tools/export_model.py -c configs/picodet/legacy_model/picodet_l_640_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/picodet/legacy_model/picodet_l_640_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/picodet/legacy_model/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/picodet/picodet_lcnet_1_5x_416_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:picodet_lcnet_1_5x_416_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/picodet_lcnet_1_5x_416_coco.pdparams
norm_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_lcnet_1_5x_416_coco.yml -o
quant_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_lcnet_1_5x_416_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_lcnet_1_5x_416_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/picodet/legacy_model/more_config/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/picodet/picodet_m_416_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:picodet_m_416_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/picodet_m_416_coco.pdparams
norm_export:tools/export_model.py -c configs/picodet/legacy_model/picodet_m_416_coco.yml -o
quant_export:tools/export_model.py -c configs/picodet/legacy_model/picodet_m_416_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/picodet/legacy_model/picodet_m_416_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/picodet/legacy_model/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/picodet/picodet_mobilenetv3_large_1x_416_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:picodet_mobilenetv3_large_1x_416_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/picodet_mobilenetv3_large_1x_416_coco.pdparams
norm_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_mobilenetv3_large_1x_416_coco.yml -o
quant_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_mobilenetv3_large_1x_416_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_mobilenetv3_large_1x_416_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/picodet/legacy_model/more_config/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/picodet/picodet_r18_640_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:picodet_r18_640_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/picodet_r18_640_coco.pdparams
norm_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_r18_640_coco.yml -o
quant_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_r18_640_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_r18_640_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/picodet/legacy_model/more_config/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/picodet/picodet_s_320_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:picodet_s_320_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/picodet_s_320_coco.pdparams
norm_export:tools/export_model.py -c configs/picodet/legacy_model/picodet_s_320_coco.yml -o
quant_export:tools/export_model.py -c configs/picodet/legacy_model/picodet_s_320_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/picodet/legacy_model/picodet_s_320_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/picodet/legacy_model/yolov3_darknet53_270e_coco.yml --slim_config configs/slim/post_quant/picodet_s_ptq.yml -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/picodet/picodet_shufflenetv2_1x_416_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:picodet_shufflenetv2_1x_416_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/picodet_shufflenetv2_1x_416_coco.pdparams
norm_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_shufflenetv2_1x_416_coco.yml -o
quant_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_shufflenetv2_1x_416_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/picodet/legacy_model/more_config/picodet_shufflenetv2_1x_416_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/picodet/legacy_model/more_config/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyolo/ppyolo_mbv3_large_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyolo_mbv3_large_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyolo_mbv3_large_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyolo/ppyolo_mbv3_large_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyolo/ppyolo_mbv3_large_coco.yml --slim_config configs/slim/quant/ppyolo_mbv3_large_qat.yml -o
fpgm_export:tools/export_model.py -c configs/ppyolo/ppyolo_mbv3_large_coco.yml --slim_config configs/slim/prune/ppyolo_mbv3_large_prune_fpgm.yml -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyolo/yolov3_darknet53_270e_coco.yml --slim_config configs/slim/post_quant/ppyolo_mbv3_large_ptq.yml -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyolo/ppyolo_mbv3_small_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyolo_mbv3_small_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyolo_mbv3_small_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyolo/ppyolo_mbv3_small_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyolo/ppyolo_mbv3_small_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/ppyolo/ppyolo_mbv3_small_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyolo/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyolo/ppyolo_r18vd_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyolo_r18vd_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyolo_r18vd_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyolo/ppyolo_r18vd_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyolo/ppyolo_r18vd_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/ppyolo/ppyolo_r18vd_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyolo/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyolo/ppyolo_r50vd_dcn_1x_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyolo_r50vd_dcn_1x_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyolo_r50vd_dcn_1x_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyolo/ppyolo_r50vd_dcn_1x_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyolo/ppyolo_r50vd_dcn_1x_coco.yml --slim_config configs/slim/quant/ppyolo_r50vd_qat_pact.yml -o
fpgm_export:tools/export_model.py -c configs/ppyolo/ppyolo_r50vd_dcn_1x_coco.yml --slim_config configs/slim/prune/ppyolo_r50vd_prune_fpgm.yml -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyolo/yolov3_darknet53_270e_coco.yml --slim_config configs/slim/post_quant/ppyolo_r50vd_dcn_ptq.yml -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyolo/ppyolo_tiny_650e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyolo_tiny_650e_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyolo_tiny_650e_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyolo/ppyolo_tiny_650e_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyolo/ppyolo_tiny_650e_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/ppyolo/ppyolo_tiny_650e_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyolo/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyolo/ppyolov2_r101vd_dcn_365e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyolov2_r101vd_dcn_365e_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyolov2_r101vd_dcn_365e_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyolo/ppyolov2_r101vd_dcn_365e_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyolo/ppyolov2_r101vd_dcn_365e_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/ppyolo/ppyolov2_r101vd_dcn_365e_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyolo/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyolo/ppyolov2_r50vd_dcn_365e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyolov2_r50vd_dcn_365e_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyolov2_r50vd_dcn_365e_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyolo/ppyolov2_r50vd_dcn_365e_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyolo/ppyolov2_r50vd_dcn_365e_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/ppyolo/ppyolov2_r50vd_dcn_365e_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyolo/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyoloe/ppyoloe_crn_l_300e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyoloe_crn_l_300e_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyoloe_crn_l_300e_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_l_300e_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_l_300e_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_l_300e_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyoloe/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyoloe/ppyoloe_crn_m_300e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyoloe_crn_m_300e_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyoloe_crn_m_300e_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_m_300e_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_m_300e_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_m_300e_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyoloe/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyoloe/ppyoloe_crn_s_300e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyoloe_crn_s_300e_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyoloe_crn_s_300e_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_s_300e_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_s_300e_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_s_300e_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyoloe/yolov3_darknet53_270e_coco.yml --slim_config configs/slim/post_quant/ppyoloe_crn_s_300e_coco_ptq.yml -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/ppyoloe/ppyoloe_crn_x_300e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:ppyoloe_crn_x_300e_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/ppyoloe_crn_x_300e_coco.pdparams
norm_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_x_300e_coco.yml -o
quant_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_x_300e_coco.yml --slim_config _template_pact -o
fpgm_export:tools/export_model.py -c configs/ppyoloe/ppyoloe_crn_x_300e_coco.yml --slim_config _template_fpgm -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/ppyoloe/yolov3_darknet53_270e_coco.yml --slim_config _template_kl_quant -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/configs/yolov3/yolov3_darknet53_270e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
0 → 100644
===========================paddle2onnx_params===========================
model_name:yolov3_darknet53_270e_coco
python:python3.7
filename:null
##
--output_dir:./output_inference
weights:https://paddledet.bj.bcebos.com/models/yolov3_darknet53_270e_coco.pdparams
norm_export:tools/export_model.py -c configs/yolov3/yolov3_darknet53_270e_coco.yml -o
quant_export:tools/export_model.py -c configs/yolov3/yolov3_darknet53_270e_coco.yml --slim_config configs/slim/quant/yolov3_darknet_qat.yml -o
fpgm_export:tools/export_model.py -c configs/yolov3/yolov3_darknet53_270e_coco.yml --slim_config configs/slim/prune/yolov3_darknet_prune_fpgm.yml -o
distill_export:null
export1:null
export_param:null
kl_quant_export:tools/post_quant.py -c configs/yolov3/yolov3_darknet53_270e_coco.yml --slim_config configs/slim/post_quant/yolov3_darknet53_ptq.yml -o
##
infer_mode:norm
infer_quant:False
cmd:paddle2onnx
--model_dir:null
--model_filename:model.pdmodel
--params_filename:model.pdiparams
--save_file:model.onnx
--opset_version:11
--enable_onnx_checker:True
paddle2onnx_param1:null
infer_py:./deploy/third_engine/onnx/infer.py
--infer_cfg:null
--onnx_file:null
--image_file:./demo/000000014439.jpg
infer_param1:null
\ No newline at end of file
test_tipc/docs/test_paddle2onnx.md
0 → 100644
# Paddle2ONNX prediction test

The main entry point for the Paddle2ONNX prediction test is `test_paddle2onnx.sh`, which tests Paddle2ONNX model conversion and verifies its correctness.

## 1. Test summary

Depending on whether quantization is used during training, the tested models fall into `normal models` and `quantized models`. The Paddle2ONNX prediction support for the two categories is summarized below:

| Model type | Device |
| ---- | ---- |
| Normal model | GPU |
| Normal model | CPU |
| Quantized model | GPU |
| Quantized model | CPU |

## 2. Test procedure

### 2.1 Functional test

First run `prepare.sh` to prepare the data and model, then run `test_paddle2onnx.sh` for the test. Log files with the `paddle2onnx_infer_*.log` suffix are generated under the `test_tipc/output` directory.

```shell
bash test_tipc/prepare.sh ./test_tipc/configs/yolov3/yolov3_darknet53_270e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt "paddle2onnx_infer"

# Usage:
bash test_tipc/test_paddle2onnx.sh ./test_tipc/configs/yolov3/yolov3_darknet53_270e_coco_model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
```

#### Results

The outcome of each test is written to `test_tipc/output/results_paddle2onnx.log`.

On success it prints:

```
Run successfully with command - yolov3_darknet53_270e_coco - paddle2onnx --model_dir=./output_inference/yolov3_darknet53_270e_coco --model_filename=model.pdmodel --params_filename=model.pdiparams --save_file=./output_inference/yolov3_darknet53_270e_coco/model.onnx --opset_version=11 --enable_onnx_checker=True !
Run successfully with command - yolov3_darknet53_270e_coco - python3.7 ./deploy/third_engine/onnx/infer.py --infer_cfg=./output_inference/yolov3_darknet53_270e_coco/infer_cfg.yml --onnx_file=./output_inference/yolov3_darknet53_270e_coco/model.onnx --image_file=./demo/000000014439.jpg > ./test_tipc/output/paddle2onnx_infer_cpu.log 2>&1 !
```

On failure it prints:

```
Run failed with command - yolov3_darknet53_270e_coco - paddle2onnx --model_dir=./output_inference/yolov3_darknet53_270e_coco --model_filename=model.pdmodel --params_filename=model.pdiparams --save_file=./output_inference/yolov3_darknet53_270e_coco/model.onnx --opset_version=11 --enable_onnx_checker=True !
...
```

## 3. Further reading

This document covers the functional test only. For a more detailed guide to Paddle2ONNX-based prediction, see [Paddle2ONNX](https://github.com/PaddlePaddle/Paddle2ONNX).
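Since the conversion command above passes --enable_onnx_checker=True, paddle2onnx already validates the exported graph; the file can also be checked by hand. The following is a sketch under the assumption that the paths shown in the success log exist locally:

```python
# Optional manual check of the exported model (illustration only).
import onnx

model = onnx.load("./output_inference/yolov3_darknet53_270e_coco/model.onnx")
onnx.checker.check_model(model)       # raises if the graph is malformed
print(model.opset_import[0].version)  # should report opset 11 for this config
```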
test_tipc/prepare.sh
...
@@ -76,9 +76,9 @@ elif [ ${MODE} = "benchmark_train" ];then
     ls ./
     cd ../../
 elif [ ${MODE} = "paddle2onnx_infer" ];then
-    # set paddle2onnx_infer enve
-    ${python} -m pip install install paddle2onnx
-    ${python} -m pip install onnxruntime==1.10.0
+    # install paddle2onnx
+    ${python} -m pip install paddle2onnx
+    ${python} -m pip install onnx onnxruntime
 elif [ ${MODE} = "serving_infer" ];then
     unset https_proxy http_proxy
 else
...
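After prepare.sh has set up the environment in paddle2onnx_infer mode, a quick way to confirm the now-unpinned packages were picked up is to print their versions; this assumes both packages expose the usual __version__ attribute.

```python
# Post-install sanity check (illustration only).
import onnx
import onnxruntime

print("onnx:", onnx.__version__)
print("onnxruntime:", onnxruntime.__version__)  # previously pinned to 1.10.0, now unpinned
```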
test_tipc/test_paddle2onnx.sh
...
@@ -24,70 +24,79 @@ fpgm_export=$(func_parser_value "${lines[9]}")
 distill_export=$(func_parser_value "${lines[10]}")
 export_key1=$(func_parser_key "${lines[11]}")
 export_value1=$(func_parser_value "${lines[11]}")
-export_key2=$(func_parser_key "${lines[12]}")
-export_value2=$(func_parser_value "${lines[12]}")
+export_param_key=$(func_parser_key "${lines[12]}")
+export_param_value=$(func_parser_value "${lines[12]}")
 kl_quant_export=$(func_parser_value "${lines[13]}")

-# parser paddle2onnx
-padlle2onnx_cmd=$(func_parser_value "${lines[15]}")
-infer_model_dir_key=$(func_parser_key "${lines[16]}")
-infer_model_dir_value=$(func_parser_value "${lines[16]}")
-model_filename_key=$(func_parser_key "${lines[17]}")
-model_filename_value=$(func_parser_value "${lines[17]}")
-params_filename_key=$(func_parser_key "${lines[18]}")
-params_filename_value=$(func_parser_value "${lines[18]}")
-save_file_key=$(func_parser_key "${lines[19]}")
-save_file_value=$(func_parser_value "${lines[19]}")
-opset_version_key=$(func_parser_key "${lines[20]}")
-opset_version_value=$(func_parser_value "${lines[20]}")
+# parser paddle2onnx params
+infer_mode_list=$(func_parser_value "${lines[15]}")
+infer_is_quant_list=$(func_parser_value "${lines[16]}")
+padlle2onnx_cmd=$(func_parser_value "${lines[17]}")
+model_dir_key=$(func_parser_key "${lines[18]}")
+model_filename_key=$(func_parser_key "${lines[19]}")
+model_filename_value=$(func_parser_value "${lines[19]}")
+params_filename_key=$(func_parser_key "${lines[20]}")
+params_filename_value=$(func_parser_value "${lines[20]}")
+save_file_key=$(func_parser_key "${lines[21]}")
+save_file_value=$(func_parser_value "${lines[21]}")
+opset_version_key=$(func_parser_key "${lines[22]}")
+opset_version_value=$(func_parser_value "${lines[22]}")
+enable_onnx_checker_key=$(func_parser_key "${lines[23]}")
+enable_onnx_checker_value=$(func_parser_value "${lines[23]}")
+paddle2onnx_params1_key=$(func_parser_key "${lines[24]}")
+paddle2onnx_params1_value=$(func_parser_value "${lines[24]}")
 # parser onnx inference
-inference_py=$(func_parser_value "${lines[22]}")
-model_file_key=$(func_parser_key "${lines[23]}")
-model_file_value=$(func_parser_value "${lines[23]}")
-img_fold_key=$(func_parser_key "${lines[24]}")
-img_fold_value=$(func_parser_value "${lines[24]}")
-results_fold_key=$(func_parser_key "${lines[25]}")
-results_fold_value=$(func_parser_value "${lines[25]}")
-onnx_infer_mode_list=$(func_parser_value "${lines[26]}")
+inference_py=$(func_parser_value "${lines[25]}")
+infer_cfg_key=$(func_parser_key "${lines[26]}")
+onnx_file_key=$(func_parser_key "${lines[27]}")
+infer_image_key=$(func_parser_key "${lines[28]}")
+infer_image_value=$(func_parser_value "${lines[28]}")
+infer_param1_key=$(func_parser_key "${lines[29]}")
+infer_param1_value=$(func_parser_value "${lines[29]}")

 LOG_PATH="./test_tipc/output"
 mkdir -p ${LOG_PATH}
 status_log="${LOG_PATH}/results_paddle2onnx.log"

-function func_paddle2onnx(){
+function func_paddle2onnx_inference(){
     IFS='|'
-    _script=$1
+    _python=$1
+    _log_path=$2
+    _export_model_dir=$3

     # paddle2onnx
-    echo "################### run onnx export ###################"
+    echo "################### run paddle2onnx ###################"
     _save_log_path="${LOG_PATH}/paddle2onnx_infer_cpu.log"
-    set_dirname=$(func_set_params "${infer_model_dir_key}" "${infer_model_dir_value}")
+    set_dirname=$(func_set_params "${model_dir_key}" "${_export_model_dir}")
     set_model_filename=$(func_set_params "${model_filename_key}" "${model_filename_value}")
     set_params_filename=$(func_set_params "${params_filename_key}" "${params_filename_value}")
-    set_save_model=$(func_set_params "${save_file_key}" "${save_file_value}")
+    set_save_model=$(func_set_params "${save_file_key}" "${_export_model_dir}/${save_file_value}")
     set_opset_version=$(func_set_params "${opset_version_key}" "${opset_version_value}")
-    trans_model_cmd="${padlle2onnx_cmd} ${set_dirname} ${set_model_filename} ${set_params_filename} ${set_save_model} ${set_opset_version}"
+    set_enable_onnx_checker=$(func_set_params "${enable_onnx_checker_key}" "${enable_onnx_checker_value}")
+    set_paddle2onnx_params1=$(func_set_params "${paddle2onnx_params1_key}" "${paddle2onnx_params1_value}")
+    trans_model_cmd="${padlle2onnx_cmd} ${set_dirname} ${set_model_filename} ${set_params_filename} ${set_save_model} ${set_opset_version} ${set_enable_onnx_checker} ${set_paddle2onnx_params1}"
     eval $trans_model_cmd
     last_status=${PIPESTATUS[0]}
-    status_check $last_status "${trans_model_cmd}" "${status_log}"
+    status_check $last_status "${trans_model_cmd}" "${status_log}" "${model_name}"

     # python inference
-    echo "################### run infer ###################"
-    cd ./deploy/third_engine/demo_onnxruntime/
-    model_file=$(func_set_params "${model_file_key}" "${model_file_value}")
-    img_fold=$(func_set_params "${img_fold_key}" "${img_fold_value}")
-    results_fold=$(func_set_params "${results_fold_key}" "${results_fold_value}")
-    infer_model_cmd="${python} ${inference_py} ${model_file} ${img_fold} ${results_fold}"
+    echo "################### run onnx infer ###################"
+    set_infer_cfg=$(func_set_params "${infer_cfg_key}" "${_export_model_dir}/infer_cfg.yml")
+    set_onnx_file=$(func_set_params "${onnx_file_key}" "${_export_model_dir}/${save_file_value}")
+    set_infer_image_file=$(func_set_params "${infer_image_key}" "${infer_image_value}")
+    set_infer_param1=$(func_set_params "${infer_param1_key}" "${infer_param1_value}")
+    infer_model_cmd="${python} ${inference_py} ${set_infer_cfg} ${set_onnx_file} ${set_infer_image_file} ${set_infer_param1} > ${_save_log_path} 2>&1 "
     eval $infer_model_cmd
     last_status=${PIPESTATUS[0]}
-    status_check $last_status "${infer_model_cmd}" "${status_log}"
+    status_check $last_status "${infer_model_cmd}" "${status_log}" "${model_name}"
 }

 export Count=0
 IFS="|"
+echo "################### run paddle export ###################"
-for infer_mode in ${onnx_infer_mode_list[*]}; do
+for infer_mode in ${infer_mode_list[*]}; do
+    if [ ${infer_mode} != "null" ]; then
     # run export
     case ${infer_mode} in
         norm) run_export=${norm_export} ;;
...
@@ -97,16 +106,19 @@ for infer_mode in ${onnx_infer_mode_list[*]}; do
         kl_quant) run_export=${kl_quant_export} ;;
         *) echo "Undefined infer_mode!"; exit 1;
     esac
     if [ ${run_export} = "null" ]; then
         continue
     fi
     set_export_weight=$(func_set_params "${export_weight_key}" "${export_weight_value}")
     set_save_export_dir=$(func_set_params "${save_export_key}" "${save_export_value}")
     set_filename=$(func_set_params "${filename_key}" "${model_name}")
-    export_cmd="${python} ${run_export} ${set_export_weight} ${set_filename} ${set_save_export_dir} "
+    set_export_param=$(func_set_params "${export_param_key}" "${export_param_value}")
+    export_cmd="${python} ${run_export} ${set_export_weight} ${set_filename} ${set_export_param} ${set_save_export_dir} "
     echo $export_cmd
     eval $export_cmd
     status_export=$?
-    status_check $status_export "${export_cmd}" "${status_log}"
+    status_check $status_export "${export_cmd}" "${status_log}" "${model_name}"
+    fi

     #run inference
     export_model_dir="${save_export_value}/${model_name}"
     func_paddle2onnx_inference "${python}" "${LOG_PATH}" "${export_model_dir}"
     Count=$(($Count + 1))
 done
-func_paddle2onnx
\ No newline at end of file