PaddlePaddle / PaddleClas
Commit 55a066ed
Authored on Jun 10, 2022 by HydrogenSulfate

refine prepare.sh and test_paddle2onnx.sh

Parent: 17055798

Showing 2 changed files with 4 additions and 4 deletions (+4 -4):

    test_tipc/prepare.sh           +2 -2
    test_tipc/test_paddle2onnx.sh  +2 -2
test_tipc/prepare.sh

@@ -200,7 +200,7 @@ fi
 if [[ ${MODE} = "serving_infer" ]]; then
     # prepare serving env
     python_name=$(func_parser_value "${lines[2]}")
-    ${python_name} -m pip install install paddle-serving-server-gpu==0.7.0.post102
+    ${python_name} -m pip install paddle-serving-server-gpu==0.7.0.post102
     ${python_name} -m pip install paddle_serving_client==0.7.0
     ${python_name} -m pip install paddle-serving-app==0.7.0
     if [[ ${model_name} =~ "ShiTu" ]]; then
@@ -231,7 +231,7 @@ if [[ ${MODE} = "paddle2onnx_infer" ]]; then
     inference_model_url=$(func_parser_value "${lines[10]}")
     tar_name=${inference_model_url##*/}
-    ${python_name} -m pip install install paddle2onnx
+    ${python_name} -m pip install paddle2onnx
     ${python_name} -m pip install onnxruntime
     cd deploy
     mkdir models
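Both hunks in prepare.sh remove a duplicated "install" token: with the old lines, pip treats the extra positional word as one more requirement to resolve, so it would try to install a package literally named "install" in addition to the intended dependency. The sketch below only illustrates how the fixed paddle2onnx branch is typically exercised; it assumes the usual TIPC calling convention of passing a config file and a MODE string to prepare.sh, and the config path is a placeholder rather than something taken from this commit.

# Hedged usage sketch (not part of this commit): prepare.sh reads a TIPC config
# file plus a MODE argument; when MODE is "paddle2onnx_infer" it runs the hunk
# above, installing paddle2onnx and onnxruntime before fetching models.
# The config path is illustrative -- substitute a real file under test_tipc/config/.
bash test_tipc/prepare.sh test_tipc/config/YourModel/paddle2onnx_infer_python.txt paddle2onnx_infer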
test_tipc/test_paddle2onnx.sh

@@ -55,7 +55,7 @@ function func_paddle2onnx(){
     trans_model_cmd="${padlle2onnx_cmd} ${set_dirname} ${set_model_filename} ${set_params_filename} ${set_save_model} ${set_opset_version} ${set_enable_onnx_checker}"
     eval $trans_model_cmd
     last_status=${PIPESTATUS[0]}
-    status_check $last_status "${trans_model_cmd}" "${status_log}" "${model_name}" "${model_name}"
+    status_check $last_status "${trans_model_cmd}" "${status_log}" "${model_name}"
     # python inference
     set_model_dir=$(func_set_params "${inference_model_dir_key}" "${inference_model_dir_value}")
@@ -64,7 +64,7 @@ function func_paddle2onnx(){
     set_inference_config=$(func_set_params "${inference_config_key}" "${inference_config_value}")
     infer_model_cmd="cd deploy && ${python} ${inference_py} -o ${set_model_dir} -o ${set_use_onnx} -o ${set_hardware} ${set_inference_config} > ${_save_log_path} 2>&1 && cd ../"
     eval $infer_model_cmd
-    status_check $last_status "${infer_model_cmd}" "${status_log}" "${model_name}" "${model_name}"
+    status_check $last_status "${infer_model_cmd}" "${status_log}" "${model_name}"
 }
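Both status_check calls previously passed "${model_name}" twice; the commit drops the duplicate so the helper receives exactly the four arguments the fixed lines imply: the exit status, the command that was run, the results log, and the model name. The stand-in below only illustrates that calling convention; it is an assumption for demonstration, not PaddleClas's real status_check, which is defined elsewhere under test_tipc/ and is not shown in this diff.

#!/usr/bin/env bash
# Illustrative stand-in for status_check, matching the four-argument call sites
# above. This is an assumed sketch, not the repository's actual helper.
status_check() {
    local last_status=$1   # exit code of the evaluated command
    local run_command=$2   # command string, echoed for traceability
    local run_log=$3       # results log collected by the test harness
    local model_name=$4    # model under test
    if [ "$last_status" -eq 0 ]; then
        echo "Run successfully with command - ${model_name} - ${run_command}!" | tee -a "$run_log"
    else
        echo "Run failed with command - ${model_name} - ${run_command}!" | tee -a "$run_log"
    fi
}

# Example: a failing command is reported once against the model name.
log_file=$(mktemp)
false
status_check $? "false" "$log_file" "demo_model"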