PaddlePaddle / PaddleClas

Commit 6a1acf76
Authored May 31, 2022 by HydrogenSulfate
Commit message: debug
Parent: b7cceab7
3 changed files with 353 additions and 50 deletions (+353 -50):

- test_tipc/docs/test_inference_cpp.md (+4 -4)
- test_tipc/prepare.sh (+340 -45)
- test_tipc/test_inference_cpp.sh (+9 -1)
test_tipc/docs/test_inference_cpp.md

````diff
@@ -234,7 +234,7 @@ make -j
 * Run the following command to automatically complete everything required in the environment preparation above:
 ```shell
-bash test_tipc/prepare.sh test_tipc/configs/ResNet50/ResNet50_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt cpp_infer
+bash test_tipc/prepare.sh test_tipc/config/ResNet/ResNet50_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt cpp_infer
 ```

 ### 2.3 Functional testing
@@ -248,14 +248,14 @@ bash test_tipc/test_inference_cpp.sh ${your_params_file}
 Taking the `Linux GPU/CPU C++ inference test` of `ResNet50` as an example, the command is as follows:
 ```shell
-bash test_tipc/test_inference_cpp.sh test_tipc/configs/ResNet50/ResNet50_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt
+bash test_tipc/test_inference_cpp.sh test_tipc/config/ResNet/ResNet50_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt
 ```
 Output like the following indicates that the command ran successfully:
 ```shell
-Run successfully with command - ./deploy/cpp/build/clas_system -c ./deploy/configs/inference_cls.yaml > ./test_tipc/output/ResNet50/infer_cpp/infer_cpp_use_gpu.log 2>&1 !
+Run successfully with command - ./deploy/cpp/build/clas_system -c ./deploy/config/inference_cls.yaml > ./test_tipc/output/ResNet50/infer_cpp/infer_cpp_use_gpu.log 2>&1 !
-Run successfully with command - ./deploy/cpp/build/clas_system -c ./deploy/configs/inference_cls.yaml > ./test_tipc/output/ResNet50/infer_cpp/infer_cpp_use_cpu.log 2>&1 !
+Run successfully with command - ./deploy/cpp/build/clas_system -c ./deploy/config/inference_cls.yaml > ./test_tipc/output/ResNet50/infer_cpp/infer_cpp_use_cpu.log 2>&1 !
 ```
 The results are printed in the final log, as shown below.
````
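The example log lines above rely on `> logfile 2>&1`, which sends both stdout and stderr of the test command into one log file. A minimal standalone sketch of that redirection (temporary file names here are illustrative, not from the repo):

```shell
# Both streams land in the same log file when stderr is redirected
# to stdout *after* stdout has been pointed at the file.
log=$(mktemp)
{ echo "to stdout"; echo "to stderr" >&2; } > "$log" 2>&1
grep -c "^to " "$log"   # both lines were captured
rm -f "$log"
```

The order matters: `2>&1 > log` would instead send stderr to the original stdout and only stdout to the file.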
test_tipc/prepare.sh

```diff
@@ -44,46 +44,207 @@ function func_get_url_file_name(){
 model_name=$(func_parser_value "${lines[1]}")
 if [ ${MODE} = "cpp_infer" ];then
-    cpp_type=$(func_parser_value "${lines[2]}")
-    cls_inference_model_dir=$(func_parser_value "${lines[3]}")
-    det_inference_model_dir=$(func_parser_value "${lines[4]}")
-    cls_inference_url=$(func_parser_value "${lines[5]}")
-    det_inference_url=$(func_parser_value "${lines[6]}")
-    if [[ $cpp_type == "cls" ]];then
-        eval "wget -nc $cls_inference_url"
-        tar xf "${model_name}_inference.tar"
-        eval "mv inference $cls_inference_model_dir"
-        cd dataset
-        rm -rf ILSVRC2012
-        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/data/whole_chain/whole_chain_infer.tar
-        tar xf whole_chain_infer.tar
-        ln -s whole_chain_infer ILSVRC2012
-        cd ..
-    elif [[ $cpp_type == "shitu" ]];then
-        eval "wget -nc $cls_inference_url"
-        tar_name=$(func_get_url_file_name "$cls_inference_url")
-        model_dir=${tar_name%.*}
-        eval "tar xf ${tar_name}"
-        eval "mv ${model_dir} ${cls_inference_model_dir}"
-        eval "wget -nc $det_inference_url"
-        tar_name=$(func_get_url_file_name "$det_inference_url")
-        model_dir=${tar_name%.*}
-        eval "tar xf ${tar_name}"
-        eval "mv ${model_dir} ${det_inference_model_dir}"
-        cd dataset
-        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/data/drink_dataset_v1.0.tar
-        tar -xf drink_dataset_v1.0.tar
-    else
-        echo "Wrong cpp type in config file in line 3. only support cls, shitu"
-    fi
-    exit 0
-else
-    echo "use wrong config file"
-    exit 1
-fi
+    if [[ $FILENAME == *infer_cpp_linux_gpu_cpu.txt ]];then
+        if [ -d "./deploy/cpp/opencv-3.4.7/opencv3/" ] && [ $(md5sum ./deploy/cpp/opencv-3.4.7.tar.gz | awk -F ' ' '{print $1}') = "faa2b5950f8bee3f03118e600c74746a" ];then
+            echo "################### build opencv skipped ###################"
+        else
+            echo "################### build opencv ###################"
+            rm -rf ./deploy/cpp/opencv-3.4.7.tar.gz ./deploy/cpp/opencv-3.4.7/
+            pushd ./deploy/cpp/
+            wget -nc https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/opencv-3.4.7.tar.gz
+            tar -xf opencv-3.4.7.tar.gz
+            cd opencv-3.4.7/
+            install_path=$(pwd)/opencv3
+            rm -rf build
+            mkdir build
+            cd build
+            cmake .. \
+                -DCMAKE_INSTALL_PREFIX=${install_path} \
+                -DCMAKE_BUILD_TYPE=Release \
+                -DBUILD_SHARED_LIBS=OFF \
+                -DWITH_IPP=OFF \
+                -DBUILD_IPP_IW=OFF \
+                -DWITH_LAPACK=OFF \
+                -DWITH_EIGEN=OFF \
+                -DCMAKE_INSTALL_LIBDIR=lib64 \
+                -DWITH_ZLIB=ON \
+                -DBUILD_ZLIB=ON \
+                -DWITH_JPEG=ON \
+                -DBUILD_JPEG=ON \
+                -DWITH_PNG=ON \
+                -DBUILD_PNG=ON \
+                -DWITH_TIFF=ON \
+                -DBUILD_TIFF=ON
+            make -j
+            make install
+            cd ../../
+            popd
+            echo "################### build opencv finished ###################"
+        fi
+        set_OPENCV_DIR_cmd="sed -i '1s#OPENCV_DIR=.*#OPENCV_DIR=../opencv-3.4.7/opencv3/#' './deploy/cpp/tools/build.sh'"
+        eval ${set_OPENCV_DIR_cmd}
+        if [ -d "./deploy/cpp/paddle_inference/" ];then
+            echo "################### build paddle inference lib skipped ###################"
+        else
+            pushd ./deploy/cpp/
+            wget https://paddle-inference-lib.bj.bcebos.com/2.1.1-gpu-cuda10.2-cudnn8.1-mkl-gcc8.2/paddle_inference.tgz
+            tar -xvf paddle_inference.tgz
+            echo "################### build paddle inference lib finished ###################"
+        fi
+        set_LIB_DIR_cmd="sed -i '2s#LIB_DIR=.*#LIB_DIR=../paddle_inference/#' './deploy/cpp/tools/build.sh'"
+        # echo ${set_LIB_DIR_cmd}
+        eval ${set_LIB_DIR_cmd}
+        # exit
+        if [ -d "./deploy/cpp/build/" ];then
+            echo "################### build cpp inference skipped ###################"
+        else
+            pushd ./deploy/cpp/
+            bash tools/build.sh
+            popd
+            echo "################### build cpp inference finished ###################"
+        fi
+        if [ ${model_name} == "ResNet50" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_infer.tar
+            tar xf ResNet50_infer.tar
+            cd ../../
+        elif [ ${model_name} == "ResNet50_vd" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_vd_infer.tar
+            tar xf ResNet50_vd_infer.tar
+            cd ../../
+        elif [ ${model_name} == "MobileNetV3_large_x1_0" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/MobileNetV3_large_x1_0_infer.tar
+            tar xf MobileNetV3_large_x1_0_infer.tar
+            cd ../../
+        elif [ ${model_name} == "SwinTransformer_tiny_patch4_window7_224" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/SwinTransformer_tiny_patch4_window7_224_infer.tar
+            tar xf SwinTransformer_tiny_patch4_window7_224_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPLCNet_x0_25" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_25_infer.tar
+            tar xf PPLCNet_x0_25_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPLCNet_x0_35" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_35_infer.tar
+            tar xf PPLCNet_x0_35_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPLCNet_x0_5" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_5_infer.tar
+            tar xf PPLCNet_x0_5_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPLCNet_x0_75" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_75_infer.tar
+            tar xf PPLCNet_x0_75_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPLCNet_x1_0" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x1_0_infer.tar
+            tar xf PPLCNet_x1_0_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPLCNet_x1_5" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x1_5_infer.tar
+            tar xf PPLCNet_x1_5_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPLCNet_x2_0" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x2_0_infer.tar
+            tar xf PPLCNet_x2_0_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPLCNet_x2_5" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x2_5_infer.tar
+            tar xf PPLCNet_x2_5_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PP-ShiTu_general_rec" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/models/inference/general_PPLCNet_x2_5_lite_v1.0_infer.tar
+            tar xf general_PPLCNet_x2_5_lite_v1.0_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PP-ShiTu_mainbody_det" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/models/inference/picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer.tar
+            tar xf picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPLCNetV2_base" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNetV2_base_infer.tar
+            tar xf PPLCNetV2_base_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPHGNet_tiny" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPHGNet_tiny_infer.tar
+            tar xf PPHGNet_tiny_infer.tar
+            cd ../../
+        elif [ ${model_name} == "PPHGNet_small" ];then
+            # wget model
+            cd deploy
+            mkdir models
+            cd models
+            wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPHGNet_small_infer.tar
+            tar xf PPHGNet_small_infer.tar
+            cd ../../
+        else
+            echo "Not added into TIPC yet."
+        fi
+    fi
+fi
 model_name=$(func_parser_value "${lines[1]}")
```
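The removed `shitu` branch above derives the model directory from the downloaded tarball name with `${tar_name%.*}`, the POSIX shell expansion that strips the shortest trailing `.suffix`. A standalone sketch using a tarball name from this same script:

```shell
# '%.*' removes the shortest suffix matching '.*' (here: '.tar'),
# turning the archive name into the extracted directory name.
tar_name="general_PPLCNet_x2_5_lite_v1.0_infer.tar"
model_dir=${tar_name%.*}
echo "$model_dir"   # general_PPLCNet_x2_5_lite_v1.0_infer
```

Note that `%%.*` (longest match) would instead strip everything from the first dot, which is why the script uses the single-`%` form.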
```diff
@@ -176,11 +337,145 @@ if [ ${MODE} = "paddle2onnx_infer" ];then
     python_name=$(func_parser_value "${lines[2]}")
     ${python_name} -m pip install install paddle2onnx
     ${python_name} -m pip install onnxruntime
+    if [ ${model_name} == "ResNet50" ];then
         # wget model
-    cd deploy && mkdir models && cd models
-    wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_vd_infer.tar && tar xf ResNet50_vd_infer.tar
-    cd ../../
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_infer.tar
+        tar xf ResNet50_infer.tar
+        cd ../../
+    elif [ ${model_name} == "ResNet50_vd" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_vd_infer.tar
+        tar xf ResNet50_vd_infer.tar
+        cd ../../
+    elif [ ${model_name} == "MobileNetV3_large_x1_0" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/MobileNetV3_large_x1_0_infer.tar
+        tar xf MobileNetV3_large_x1_0_infer.tar
+        cd ../../
+    elif [ ${model_name} == "SwinTransformer_tiny_patch4_window7_224" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/SwinTransformer_tiny_patch4_window7_224_infer.tar
+        tar xf SwinTransformer_tiny_patch4_window7_224_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPLCNet_x0_25" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_25_infer.tar
+        tar xf PPLCNet_x0_25_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPLCNet_x0_35" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_35_infer.tar
+        tar xf PPLCNet_x0_35_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPLCNet_x0_5" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_5_infer.tar
+        tar xf PPLCNet_x0_5_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPLCNet_x0_75" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x0_75_infer.tar
+        tar xf PPLCNet_x0_75_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPLCNet_x1_0" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x1_0_infer.tar
+        tar xf PPLCNet_x1_0_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPLCNet_x1_5" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x1_5_infer.tar
+        tar xf PPLCNet_x1_5_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPLCNet_x2_0" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x2_0_infer.tar
+        tar xf PPLCNet_x2_0_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPLCNet_x2_5" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x2_5_infer.tar
+        tar xf PPLCNet_x2_5_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PP-ShiTu_general_rec" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/models/inference/general_PPLCNet_x2_5_lite_v1.0_infer.tar
+        tar xf general_PPLCNet_x2_5_lite_v1.0_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PP-ShiTu_mainbody_det" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/models/inference/picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer.tar
+        tar xf picodet_PPLCNet_x2_5_mainbody_lite_v1.0_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPLCNetV2_base" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNetV2_base_infer.tar
+        tar xf PPLCNetV2_base_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPHGNet_tiny" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPHGNet_tiny_infer.tar
+        tar xf PPHGNet_tiny_infer.tar
+        cd ../../
+    elif [ ${model_name} == "PPHGNet_small" ];then
+        # wget model
+        cd deploy
+        mkdir models
+        cd models
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPHGNet_small_infer.tar
+        tar xf PPHGNet_small_infer.tar
+        cd ../../
+    else
+        echo "Not added into TIPC yet."
+    fi
+fi
 if [ ${MODE} = "benchmark_train" ];then
```
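The new cpp_infer path in prepare.sh skips rebuilding OpenCV when the cached `opencv-3.4.7.tar.gz` still has the expected md5. A hypothetical standalone sketch of that checksum-guarded skip (the temp directory and stand-in archive are illustrative, not files from the repo):

```shell
# Skip the expensive step when the cached archive's md5 matches the
# expected value; fall through to a re-download otherwise.
workdir=$(mktemp -d)
printf 'archive-bytes\n' > "$workdir/pkg.tar.gz"   # stand-in for opencv-3.4.7.tar.gz
expected=$(md5sum "$workdir/pkg.tar.gz" | awk -F ' ' '{print $1}')
if [ "$(md5sum "$workdir/pkg.tar.gz" | awk -F ' ' '{print $1}')" = "$expected" ];then
    echo "download skipped"
else
    echo "re-download"
fi
rm -rf "$workdir"
```

Hashing the tarball rather than testing only for the extracted directory means a truncated or corrupted download still triggers a clean rebuild.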
test_tipc/test_inference_cpp.sh

```diff
@@ -37,6 +37,7 @@ status_log="${LOG_PATH}/results_infer_cpp.log"
 line_inference_model_dir=3
 line_use_gpu=5
+line_infer_imgs=2

 function func_infer_cpp(){
     # inference cpp
     IFS='|'
@@ -49,12 +50,19 @@ function func_infer_cpp(){
     # run infer cpp
     inference_cpp_cmd="./deploy/cpp/build/clas_system"
     inference_cpp_cfg="./deploy/configs/inference_cls.yaml"
     set_model_name_cmd="sed -i '${line_inference_model_dir}s#: .*#: ./deploy/models/${model_name}_infer#' '${inference_cpp_cfg}'"
-    set_use_gpu_cmd="sed -i '${line_use_gpu}s#: .*#: ${use_gpu}#' '${inference_cpp_cfg}'"
     eval $set_model_name_cmd
+    set_infer_imgs_cmd="sed -i '${line_infer_imgs}s#: .*#: ./deploy/images/ILSVRC2012_val_00000010.jpeg#' '${inference_cpp_cfg}'"
+    eval $set_infer_imgs_cmd
+    set_use_gpu_cmd="sed -i '${line_use_gpu}s#: .*#: ${use_gpu}#' '${inference_cpp_cfg}'"
     eval $set_use_gpu_cmd
     infer_cpp_full_cmd="${inference_cpp_cmd} -c ${inference_cpp_cfg} > ${_save_log_path} 2>&1 "
     eval $infer_cpp_full_cmd
     last_status=${PIPESTATUS[0]}
     status_check $last_status "${infer_cpp_full_cmd}" "${status_log}" "${model_name}"
 done
```
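The `sed -i 'Ns#: .*#: value#'` commands above patch a single numbered line of the YAML config, replacing everything after the key's `: ` while leaving the key itself untouched. A minimal sketch with a two-line stand-in config (the file contents and the `demo.jpeg` path are illustrative, not the repo's actual `inference_cls.yaml`):

```shell
# The leading '2' restricts the substitution to line 2; '#' is used as the
# s-command delimiter so the paths need no escaping.
cfg=$(mktemp)
printf 'inference_model_dir: ./old\ninfer_imgs: ./old.jpeg\n' > "$cfg"
sed -i '2s#: .*#: ./deploy/images/demo.jpeg#' "$cfg"
cat "$cfg"   # line 1 unchanged; line 2 now points at the new image
rm -f "$cfg"
```

Choosing `#` as the delimiter is what lets the replacement contain `/`-heavy paths without backslash escaping.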