PaddlePaddle / PaddleClas
Commit 17401161
Written on Jun 01, 2022 by HydrogenSulfate

move model url to config

Parent: f2b20cff

Showing 24 changed files with 108 additions and 233 deletions (+108 -233)
test_tipc/config/MobileNetV3/MobileNetV3_large_x1_0_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +1 -0
test_tipc/config/PP-ShiTu/PPShiTu_general_rec_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +1 -0
test_tipc/config/PP-ShiTu/PPShiTu_mainbody_det_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +1 -0
test_tipc/config/PPHGNet/PPHGNet_small_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +0 -0
test_tipc/config/PPHGNet/PPHGNet_tiny_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +0 -0
test_tipc/config/PPLCNet/PPLCNet_x0_25_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +0 -0
test_tipc/config/PPLCNet/PPLCNet_x0_35_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +16 -0
test_tipc/config/PPLCNet/PPLCNet_x0_35_paddle2onnx_infer_python.txt  +0 -15
test_tipc/config/PPLCNet/PPLCNet_x0_5_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +16 -0
test_tipc/config/PPLCNet/PPLCNet_x0_5_paddle2onnx_infer_python.txt  +0 -15
test_tipc/config/PPLCNet/PPLCNet_x0_75_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +1 -0
test_tipc/config/PPLCNet/PPLCNet_x1_0_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +1 -0
test_tipc/config/PPLCNet/PPLCNet_x1_5_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +1 -0
test_tipc/config/PPLCNet/PPLCNet_x2_0_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +16 -0
test_tipc/config/PPLCNet/PPLCNet_x2_5_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +1 -0
test_tipc/config/PPLCNetV2/PPLCNetV2_base_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +16 -0
test_tipc/config/ResNet/ResNet50_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +1 -0
test_tipc/config/ResNet/ResNet50_vd_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +2 -0
test_tipc/config/ResNet/ResNet50_vd_paddle2onnx_infer_python.txt  +0 -15
test_tipc/config/SwinTransformer/SwinTransformer_tiny_patch4_window7_224_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt  +1 -0
test_tipc/config/SwinTransformer/SwinTransformer_tiny_patch4_window7_224_paddle2onnx_infer_python.txt  +0 -15
test_tipc/docs/test_paddle2onnx.md  +12 -9
test_tipc/prepare.sh  +10 -153
test_tipc/test_paddle2onnx.sh  +11 -11
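Taken together, the change is mechanical: every paddle2onnx TIPC config gains an `inference_model_url:` line pointing at the inference tarball for its model, and `prepare.sh` downloads whatever that line names instead of keeping one hard-coded `wget` block per model. A minimal sketch of that flow, assuming only that config entries are `key:value` pairs split on the first colon (the real scripts go through the `func_parser_value` helper):

```shell
# Illustrative sketch, not the repository script: pull the model URL out of a
# TIPC config and fetch the inference model the way the new prepare.sh does.
config_file=./test_tipc/config/ResNet/ResNet50_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt

# In the configs touched by this commit, inference_model_url sits on line 11,
# right after --enable_onnx_checker.
inference_model_url=$(sed -n '11p' "${config_file}" | cut -d ':' -f 2-)
tar_name=${inference_model_url##*/}   # e.g. ResNet50_infer.tar

cd deploy && mkdir -p models && cd models
wget -nc "${inference_model_url}"
tar xf "${tar_name}"
cd ../../
```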
test_tipc/config/MobileNetV3/MobileNetV3_large_x1_0_paddle2onnx_infer_python.txt → test_tipc/config/MobileNetV3/MobileNetV3_large_x1_0_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
...
...
@@ -8,6 +8,7 @@ python:python3.7
--save_file:./deploy/models/MobileNetV3_large_x1_0_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/MobileNetV3_large_x1_0_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/MobileNetV3_large_x1_0_infer
...
...
test_tipc/config/PP-ShiTu/PPShiTu_general_rec_paddle2onnx_infer_python.txt → test_tipc/config/PP-ShiTu/PPShiTu_general_rec_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
...
...
@@ -8,6 +8,7 @@ python:python3.7
--save_file:./deploy/models/general_PPLCNet_x2_5_lite_v1.0_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/models/inference/general_PPLCNet_x2_5_lite_v1.0_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/general_PPLCNet_x2_5_lite_v1.0_infer
...
...
test_tipc/config/PP-ShiTu/PPShiTu_mainbody_det_paddle2onnx_infer_python.txt → test_tipc/config/PP-ShiTu/PPShiTu_mainbody_det_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
...
...
@@ -8,6 +8,7 @@ python:python3.7
--save_file:./deploy/models/picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/models/inference/picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer
...
...
test_tipc/config/PPHGNet/PPHGNet_small_paddle2onnx_infer_python.txt → test_tipc/config/PPHGNet/PPHGNet_small_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
File moved.
test_tipc/config/PPHGNet/PPHGNet_tiny_paddle2onnx_infer_python.txt → test_tipc/config/PPHGNet/PPHGNet_tiny_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
File moved.
test_tipc/config/PPLCNet/PPLCNet_x0_25_paddle2onnx_infer_python.txt → test_tipc/config/PPLCNet/PPLCNet_x0_25_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
File moved.
test_tipc/config/PPLCNet/PPLCNet_x0_35_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (new file, mode 0 → 100644)
===========================paddle2onnx_params===========================
model_name:PPLCNet_x0_25
python:python3.7
2onnx: paddle2onnx
--model_dir:./deploy/models/PPLCNet_x0_25_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./deploy/models/PPLCNet_x0_25_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_25_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/PPLCNet_x0_25_infer
Global.use_gpu:False
-c:configs/inference_cls.yaml
\ No newline at end of file
test_tipc/config/PPLCNet/PPLCNet_x0_35_paddle2onnx_infer_python.txt (deleted, mode 100644 → 0)
===========================paddle2onnx_params===========================
model_name:PPLCNet_x0_35
python:python3.7
2onnx: paddle2onnx
--model_dir:./deploy/models/PPLCNet_x0_35_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./deploy/models/PPLCNet_x0_35_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/PPLCNet_x0_35_infer
Global.use_gpu:False
-c:configs/inference_cls.yaml
\ No newline at end of file
test_tipc/config/PPLCNet/PPLCNet_x0_5_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (new file, mode 0 → 100644)
===========================paddle2onnx_params===========================
model_name:PP-ShiTu_mainbody_det
python:python3.7
2onnx: paddle2onnx
--model_dir:./deploy/models/picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./deploy/models/picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/models/inference/picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer
Global.use_gpu:False
-c:configs/inference_cls.yaml
\ No newline at end of file
test_tipc/config/PPLCNet/PPLCNet_x0_5_paddle2onnx_infer_python.txt (deleted, mode 100644 → 0)
===========================paddle2onnx_params===========================
model_name:PPLCNet_x0_5
python:python3.7
2onnx: paddle2onnx
--model_dir:./deploy/models/PPLCNet_x0_5_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./deploy/models/PPLCNet_x0_5_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/PPLCNet_x0_5_infer
Global.use_gpu:False
-c:configs/inference_cls.yaml
\ No newline at end of file
test_tipc/config/PPLCNet/PPLCNet_x0_75_paddle2onnx_infer_python.txt → test_tipc/config/PPLCNet/PPLCNet_x0_75_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
...
...
@@ -8,6 +8,7 @@ python:python3.7
--save_file:./deploy/models/PPLCNet_x0_75_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_75_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/PPLCNet_x0_75_infer
...
...
test_tipc/config/PPLCNet/PPLCNet_x1_0_paddle2onnx_infer_python.txt → test_tipc/config/PPLCNet/PPLCNet_x1_0_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
...
...
@@ -8,6 +8,7 @@ python:python3.7
--save_file:./deploy/models/PPLCNet_x1_0_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x1_0_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/PPLCNet_x1_0_infer
...
...
test_tipc/config/PPLCNet/PPLCNet_x1_5_paddle2onnx_infer_python.txt → test_tipc/config/PPLCNet/PPLCNet_x1_5_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
...
...
@@ -8,6 +8,7 @@ python:python3.7
--save_file:./deploy/models/PPLCNet_x1_5_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x1_5_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/PPLCNet_x1_5_infer
...
...
test_tipc/config/PPLCNet/PPLCNet_x2_0_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (new file, mode 0 → 100644)
===========================paddle2onnx_params===========================
model_name:PPLCNet_x2_0
python:python3.7
2onnx: paddle2onnx
--model_dir:./deploy/models/PPLCNet_x2_0_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./deploy/models/PPLCNet_x2_0_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x2_0_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/PPLCNet_x2_0_infer
Global.use_gpu:False
-c:configs/inference_cls.yaml
\ No newline at end of file
test_tipc/config/PPLCNet/PPLCNet_x2_5_paddle2onnx_infer_python.txt → test_tipc/config/PPLCNet/PPLCNet_x2_5_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
...
...
@@ -8,6 +8,7 @@ python:python3.7
--save_file:./deploy/models/PPLCNet_x2_5_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x2_5_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/PPLCNet_x2_5_infer
...
...
test_tipc/config/PPLCNetV2/PPLCNetV2_base_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (new file, mode 0 → 100644)
===========================paddle2onnx_params===========================
model_name:PPLCNetV2_base
python:python3.7
2onnx: paddle2onnx
--model_dir:./deploy/models/PPLCNetV2_base_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./deploy/models/PPLCNetV2_base_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNetV2_base_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/PPLCNetV2_base_infer
Global.use_gpu:False
-c:configs/inference_cls.yaml
\ No newline at end of file
test_tipc/config/ResNet/ResNet50_paddle2onnx_infer_python.txt → test_tipc/config/ResNet/ResNet50_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
...
...
@@ -8,6 +8,7 @@ python:python3.7
--save_file:./deploy/models/ResNet50_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/ResNet50_infer
...
...
test_tipc/config/ResNet/ResNet50_vd_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
...
...
@@ -8,7 +8,9 @@ python:python3.7
--save_file:./deploy/models/ResNet50_vd_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_vd_infer.tar
inference: python/predict_cls.py -c configs/inference_cls.yaml
Global.use_onnx:True
Global.inference_model_dir:models/ResNet50_vd_infer/
Global.use_gpu:False
-c:configs/inference_cls.yaml
test_tipc/config/ResNet/ResNet50_vd_paddle2onnx_infer_python.txt (deleted, mode 100644 → 0)
===========================paddle2onnx_params===========================
model_name:ResNet50_vd
python:python3.7
2onnx: paddle2onnx
--model_dir:./deploy/models/ResNet50_vd_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./deploy/models/ResNet50_vd_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/ResNet50_vd_infer
Global.use_gpu:False
-c:configs/inference_cls.yaml
\ No newline at end of file
test_tipc/config/PPLCNet/PPLCNet_x2_0_paddle2onnx_infer_python.txt → test_tipc/config/SwinTransformer/SwinTransformer_tiny_patch4_window7_224_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
...
...
@@ -8,6 +8,7 @@ python:python3.7
--save_file:./deploy/models/SwinTransformer_tiny_patch4_window7_224_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/SwinTransformer_tiny_patch4_window7_224_infer.tar
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/SwinTransformer_tiny_patch4_window7_224_infer
...
...
test_tipc/config/SwinTransformer/SwinTransformer_tiny_patch4_window7_224_paddle2onnx_infer_python.txt (deleted, mode 100644 → 0)
===========================paddle2onnx_params===========================
model_name:SwinTransformer_tiny_patch4_window7_224
python:python3.7
2onnx: paddle2onnx
--model_dir:./deploy/models/SwinTransformer_tiny_patch4_window7_224_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./deploy/models/SwinTransformer_tiny_patch4_window7_224_infer/inference.onnx
--opset_version:10
--enable_onnx_checker:True
inference:./python/predict_cls.py
Global.use_onnx:True
Global.inference_model_dir:./models/SwinTransformer_tiny_patch4_window7_224_infer
Global.use_gpu:False
-c:configs/inference_cls.yaml
\ No newline at end of file
test_tipc/docs/test_paddle2onnx.md
...
...
@@ -10,36 +10,39 @@ The main program for the PaddleServing prediction test is `test_paddle2onnx.sh`, which can test…
| ---- | ---- |
| normal model | GPU |
| normal model | CPU |
| quantized model | GPU |
| quantized model | CPU |
## 2. Test workflow
The following uses the paddle2onnx test of the `ResNet50` model as an example.
### 2.1 Functional test
First run `prepare.sh` to prepare the data and model, then run `test_paddle2onnx.sh` for the test; log files ending in `paddle2onnx_infer_*.log` are finally generated under the `test_tipc/output` directory.
First run `prepare.sh` to prepare the data and model, then run `test_paddle2onnx.sh` for the test; log files ending in `paddle2onnx_infer_*.log` are finally generated under the `test_tipc/output/ResNet50` directory.
The test commands and results below use PPHGNet_small as an example.
```shell
bash test_tipc/prepare.sh ./test_tipc/config/PPHGNet/PPHGNet_small_paddle2onnx_infer_python.txt paddle2onnx_infer
bash test_tipc/prepare.sh ./test_tipc/config/ResNet/ResNet50_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt paddle2onnx_infer
# Usage:
bash test_tipc/test_paddle2onnx.sh ./test_tipc/config/PPHGNet/PPHGNet_small_paddle2onnx_infer_python.txt
bash test_tipc/test_paddle2onnx.sh ./test_tipc/config/ResNet/ResNet50_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
```
#### Results
The status of each test is printed in `test_tipc/output/results_paddle2onnx.log`:
The status of each test is printed in `./test_tipc/output/ResNet50/results_paddle2onnx.log`:
On success, the output looks like:
```
Run successfully with command - paddle2onnx --model_dir=./deploy/models/PPHGNet_tiny_infer/ --model_filename=inference.pdmodel --params_filename=inference.pdiparams --save_file=./deploy/models/PPHGNet_tiny_infer/inference.onnx --opset_version=10 --enable_onnx_checker=True!
Run successfully with command - cd deploy && python3.7 ./python/predict_cls.py -o Global.inference_model_dir=./models/PPHGNet_tiny_infer -o Global.use_onnx=True -o Global.use_gpu=False -c=configs/inference_cls.yaml > ../test_tipc/output/paddle2onnx_infer_cpu.log 2>&1 && cd ../!
Run successfully with command - paddle2onnx --model_dir=./deploy/models/ResNet50_infer/ --model_filename=inference.pdmodel --params_filename=inference.pdiparams --save_file=./deploy/models/ResNet50_infer/inference.onnx --opset_version=10 --enable_onnx_checker=True!
Run successfully with command - cd deploy && python3.7 ./python/predict_cls.py -o Global.inference_model_dir=./models/ResNet50_infer -o Global.use_onnx=True -o Global.use_gpu=False -c=configs/inference_cls.yaml > ../test_tipc/output/ResNet50/paddle2onnx_infer_cpu.log 2>&1 && cd ../!
```
On failure, the output looks like:
```
Run failed with command - paddle2onnx --model_dir=./deploy/models/PPHGNet_tiny_infer/ --model_filename=inference.pdmodel --params_filename=inference.pdiparams --save_file=./deploy/models/PPHGNet_tiny_infer/inference.onnx --opset_version=10 --enable_onnx_checker=True!
Run failed with command - paddle2onnx --model_dir=./deploy/models/ResNet50_infer/ --model_filename=inference.pdmodel --params_filename=inference.pdiparams --save_file=./deploy/models/ResNet50_infer/inference.onnx --opset_version=10 --enable_onnx_checker=True!
Run failed with command - cd deploy && python3.7 ./python/predict_cls.py -o Global.inference_model_dir=./models/ResNet50_infer -o Global.use_onnx=True -o Global.use_gpu=False -c=configs/inference_cls.yaml > ../test_tipc/output/ResNet50/paddle2onnx_infer_cpu.log 2>&1 && cd ../!
...
```
...
...
test_tipc/prepare.sh
...
...
@@ -174,161 +174,18 @@ fi
if [ ${MODE} = "paddle2onnx_infer" ]; then
    # prepare paddle2onnx env
    python_name=$(func_parser_value "${lines[2]}")
    inference_model_url=$(func_parser_value "${lines[10]}")
    tar_name=${inference_model_url##*/}
    ${python_name} -m pip install install paddle2onnx
    ${python_name} -m pip install onnxruntime
    if [ ${model_name} == "ResNet50" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_infer.tar
        tar xf ResNet50_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "ResNet50_vd" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_vd_infer.tar
        tar xf ResNet50_vd_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "MobileNetV3_large_x1_0" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/MobileNetV3_large_x1_0_infer.tar
        tar xf MobileNetV3_large_x1_0_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "SwinTransformer_tiny_patch4_window7_224" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/SwinTransformer_tiny_patch4_window7_224_infer.tar
        tar xf SwinTransformer_tiny_patch4_window7_224_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPLCNet_x0_25" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_25_infer.tar
        tar xf PPLCNet_x0_25_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPLCNet_x0_35" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_35_infer.tar
        tar xf PPLCNet_x0_35_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPLCNet_x0_5" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_5_infer.tar
        tar xf PPLCNet_x0_5_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPLCNet_x0_75" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_75_infer.tar
        tar xf PPLCNet_x0_75_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPLCNet_x1_0" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x1_0_infer.tar
        tar xf PPLCNet_x1_0_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPLCNet_x1_5" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x1_5_infer.tar
        tar xf PPLCNet_x1_5_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPLCNet_x2_0" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x2_0_infer.tar
        tar xf PPLCNet_x2_0_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPLCNet_x2_5" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x2_5_infer.tar
        tar xf PPLCNet_x2_5_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PP-ShiTu_general_rec" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/models/inference/general_PPLCNet_x2_5_lite_v1.0_infer.tar
        tar xf general_PPLCNet_x2_5_lite_v1.0_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PP-ShiTu_mainbody_det" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/models/inference/picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer.tar
        tar xf picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPLCNetV2_base" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNetV2_base_infer.tar
        tar xf PPLCNetV2_base_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPHGNet_tiny" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPHGNet_tiny_infer.tar
        tar xf PPHGNet_tiny_infer.tar
        cd ../../
    fi
    if [ ${model_name} == "PPHGNet_small" ]; then
        # wget model
        cd deploy
        mkdir models
        cd models
        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPHGNet_small_infer.tar
        tar xf PPHGNet_small_infer.tar
        cd ../../
    fi
    cd deploy
    mkdir models
    cd models
    wget -nc ${inference_model_url}
    tar xf ${tar_name}
    cd ../../
fi
...
...
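The only slightly opaque idiom in the rewritten branch above is `${inference_model_url##*/}`: the `##*/` expansion strips the longest prefix ending in `/`, leaving just the tarball name. A quick, self-contained check (illustrative values only):

```shell
# "##*/" removes the longest prefix matching "*/", i.e. everything up to and
# including the last slash, so only the final path component remains.
inference_model_url=https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_infer.tar
tar_name=${inference_model_url##*/}
echo "${tar_name}"   # prints: ResNet50_infer.tar
```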
test_tipc/test_paddle2onnx.sh
...
...
@@ -11,7 +11,7 @@ python=$(func_parser_value "${lines[2]}")
# parser params
dataline=$(awk 'NR==1, NR==15{print}' $FILENAME)
dataline=$(awk 'NR==1, NR==16{print}' $FILENAME)
IFS=$'\n'
lines=(${dataline})
...
...
@@ -32,17 +32,17 @@ opset_version_value=$(func_parser_value "${lines[8]}")
enable_onnx_checker_key=$(func_parser_key "${lines[9]}")
enable_onnx_checker_value=$(func_parser_value "${lines[9]}")
# parser onnx inference
inference_py=$(func_parser_value "${lines[10]}")
use_onnx_key=$(func_parser_key "${lines[11]}")
use_onnx_value=$(func_parser_value "${lines[11]}")
inference_model_dir_key=$(func_parser_key "${lines[12]}")
inference_model_dir_value=$(func_parser_value "${lines[12]}")
inference_hardware_key=$(func_parser_key "${lines[13]}")
inference_hardware_value=$(func_parser_value "${lines[13]}")
inference_config_key=$(func_parser_key "${lines[14]}")
inference_config_value=$(func_parser_value "${lines[14]}")
inference_py=$(func_parser_value "${lines[11]}")
use_onnx_key=$(func_parser_key "${lines[12]}")
use_onnx_value=$(func_parser_value "${lines[12]}")
inference_model_dir_key=$(func_parser_key "${lines[13]}")
inference_model_dir_value=$(func_parser_value "${lines[13]}")
inference_hardware_key=$(func_parser_key "${lines[14]}")
inference_hardware_value=$(func_parser_value "${lines[14]}")
inference_config_key=$(func_parser_key "${lines[15]}")
inference_config_value=$(func_parser_value "${lines[15]}")
LOG_PATH="./test_tipc/output"
LOG_PATH="./test_tipc/output/${model_name}"
mkdir -p ./test_tipc/output
status_log="${LOG_PATH}/results_paddle2onnx.log"
...
...
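The index shifts in this file follow directly from the config change: `awk` now reads 16 lines instead of 15, and because `inference_model_url` occupies line 11 of each config, every ONNX-inference setting moves down one slot in the zero-indexed `lines` array, while `LOG_PATH` gains a per-model subdirectory. A small sketch of that mapping, with a stand-in parser (the real `func_parser_key`/`func_parser_value` helpers are defined elsewhere in test_tipc):

```shell
#!/bin/bash
# Stand-in for func_parser_value: return everything after the first ':' of a
# "key:value" config line. Illustrative only.
func_parser_value() { echo "${1#*:}"; }

FILENAME=./test_tipc/config/ResNet/ResNet50_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt

# 16 lines now, because of the extra inference_model_url entry.
dataline=$(awk 'NR==1, NR==16{print}' "$FILENAME")
IFS=$'\n'
lines=(${dataline})

# Zero-indexed layout after this commit:
#   lines[10] -> inference_model_url:...        (new; consumed by prepare.sh)
#   lines[11] -> inference:...                  (previously lines[10])
#   lines[15] -> -c:configs/inference_cls.yaml  (previously lines[14])
inference_model_url=$(func_parser_value "${lines[10]}")
inference_py=$(func_parser_value "${lines[11]}")
echo "model url:    ${inference_model_url}"
echo "inference py: ${inference_py}"
```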