PaddlePaddle / PaddleClas
Commit e7d9ba58
Authored Jun 24, 2022 by HydrogenSulfate

update config and script

Parent: 6ecaaba9
Showing 9 changed files with 62 additions and 82 deletions (+62 −82)
test_tipc/config/GeneralRecognition/GeneralRecognition_PPLCNet_x2_5_train_ptq_infer_python.txt (+2 −2)
test_tipc/config/MobileNetV3/MobileNetV3_large_x1_0_train_ptq_infer_python.txt (+3 −3)
test_tipc/config/PPHGNet/PPHGNet_small_train_ptq_infer_python.txt (+8 −8)
test_tipc/config/PPLCNet/PPLCNet_x1_0_train_ptq_infer_python.txt (+3 −3)
test_tipc/config/PPLCNetV2/PPLCNetV2_base_train_ptq_infer_python.txt (+8 −8)
test_tipc/config/ResNet/ResNet50_vd_train_ptq_infer_python.txt (+3 −3)
test_tipc/config/SwinTransformer/SwinTransformer_tiny_patch4_window7_224_train_ptq_infer_python.txt (+3 −3)
test_tipc/prepare.sh (+25 −10)
test_tipc/test_train_inference_python.sh (+7 −42)
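The seven `.txt` files above are TIPC test configs: one `key:value` entry per line, split on the first colon only, so values that themselves contain colons (such as the `pretrained_model_url` entries) stay intact. A minimal parsing sketch; the file name, keys, and helper below are illustrative, not part of the commit:

```shell
# Sketch: read a TIPC-style "key:value" config line, splitting on the
# FIRST colon only so URL values survive. Illustrative file and keys.
cat > demo_config.txt <<'EOF'
infer_export:True
pretrained_model_url:https://example.com/model_infer.tar
EOF

get_value() {
    # ${line#*:} strips everything up to and including the first ':'
    local line
    line=$(grep "^$1:" demo_config.txt)
    echo "${line#*:}"
}

get_value infer_export           # -> True
get_value pretrained_model_url   # -> https://example.com/model_infer.tar
```

Splitting with `${line#*:}` rather than `cut -d: -f2` is what keeps the `https://...` value whole.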
test_tipc/config/GeneralRecognition/GeneralRecognition_PPLCNet_x2_5_train_ptq_infer_python.txt

@@ -34,7 +34,7 @@ distill_export:null
 kl_quant:deploy/slim/quant_post_static.py -c ppcls/configs/GeneralRecognition/GeneralRecognition_PPLCNet_x2_5.yaml -o Global.save_inference_dir=./general_PPLCNet_x2_5_lite_v1.0_infer
 export2:null
 pretrained_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/rec/models/inference/general_PPLCNet_x2_5_lite_v1.0_infer.tar
-infer_model:../inference/
+infer_model:./general_PPLCNet_x2_5_lite_v1.0_infer/
 infer_export:True
 infer_quant:Fasle
 inference:python/predict_rec.py -c configs/inference_rec.yaml
@@ -47,7 +47,7 @@ inference:python/predict_rec.py -c configs/inference_rec.yaml
 -o Global.rec_inference_model_dir:../inference
 -o Global.infer_imgs:../dataset/Aliproduct/demo_test/
 -o Global.save_log_path:null
--o Global.benchmark:True
+-o Global.benchmark:False
 null:null
 null:null
 ===========================infer_benchmark_params==========================
test_tipc/config/MobileNetV3/MobileNetV3_large_x1_0_train_ptq_infer_python.txt

@@ -34,7 +34,7 @@ distill_export:null
 kl_quant:deploy/slim/quant_post_static.py -c ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x1_0.yaml -o Global.save_inference_dir=./MobileNetV3_large_x1_0_infer
 export2:null
 pretrained_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/MobileNetV3_large_x1_0_infer.tar
-infer_model:../inference/
+infer_model:./MobileNetV3_large_x1_0_infer/
 infer_export:True
 infer_quant:Fasle
 inference:python/predict_cls.py -c configs/inference_cls.yaml
@@ -45,9 +45,9 @@ inference:python/predict_cls.py -c configs/inference_cls.yaml
 -o Global.use_tensorrt:False
 -o Global.use_fp16:False
 -o Global.inference_model_dir:../inference
--o Global.infer_imgs:../dataset/ILSVRC2012/val
+-o Global.infer_imgs:../deploy/images/ImageNet/ILSVRC2012_val_00000010.jpeg
 -o Global.save_log_path:null
--o Global.benchmark:True
+-o Global.benchmark:False
 null:null
 null:null
 ===========================train_benchmark_params==========================
test_tipc/config/PPHGNet/PPHGNet_small_train_ptq_infer_python.txt

@@ -34,20 +34,20 @@ distill_export:null
 kl_quant:deploy/slim/quant_post_static.py -c ppcls/configs/ImageNet/PPHGNet/PPHGNet_small.yaml -o Global.save_inference_dir=./PPHGNet_small_infer
 export2:null
 pretrained_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPHGNet_small_infer.tar
-infer_model:../inference/
+infer_model:./PPHGNet_small_infer/
 infer_export:True
 infer_quant:Fasle
 inference:python/predict_cls.py -c configs/inference_cls.yaml -o PreProcess.transform_ops.0.ResizeImage.resize_short=236
 -o Global.use_gpu:True|False
--o Global.enable_mkldnn:True|False
--o Global.cpu_num_threads:1|6
--o Global.batch_size:1|16
--o Global.use_tensorrt:True|False
--o Global.use_fp16:True|False
+-o Global.enable_mkldnn:False
+-o Global.cpu_num_threads:1
+-o Global.batch_size:1
+-o Global.use_tensorrt:False
+-o Global.use_fp16:False
 -o Global.inference_model_dir:../inference
--o Global.infer_imgs:../dataset/ILSVRC2012/val
+-o Global.infer_imgs:../deploy/images/ImageNet/ILSVRC2012_val_00000010.jpeg
 -o Global.save_log_path:null
--o Global.benchmark:True
+-o Global.benchmark:False
 null:null
 ===========================infer_benchmark_params==========================
 random_infer_input:[{float32,[3,224,224]}]
test_tipc/config/PPLCNet/PPLCNet_x1_0_train_ptq_infer_python.txt

@@ -34,7 +34,7 @@ distill_export:null
 kl_quant:deploy/slim/quant_post_static.py -c ppcls/configs/ImageNet/PPLCNet/PPLCNet_x1_0.yaml -o Global.save_inference_dir=./PPLCNet_x1_0_infer
 export2:null
 pretrained_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNet_x1_0_infer.tar
-infer_model:../inference/
+infer_model:./PPLCNet_x1_0_infer/
 infer_export:True
 infer_quant:Fasle
 inference:python/predict_cls.py -c configs/inference_cls.yaml
@@ -45,9 +45,9 @@ inference:python/predict_cls.py -c configs/inference_cls.yaml
 -o Global.use_tensorrt:False
 -o Global.use_fp16:False
 -o Global.inference_model_dir:../inference
--o Global.infer_imgs:../dataset/ILSVRC2012/val
+-o Global.infer_imgs:../deploy/images/ImageNet/ILSVRC2012_val_00000010.jpeg
 -o Global.save_log_path:null
--o Global.benchmark:True
+-o Global.benchmark:False
 null:null
 ===========================infer_benchmark_params==========================
 random_infer_input:[{float32,[3,224,224]}]
\ No newline at end of file
test_tipc/config/PPLCNetV2/PPLCNetV2_base_train_ptq_infer_python.txt

@@ -34,20 +34,20 @@ distill_export:null
 kl_quant:deploy/slim/quant_post_static.py -c ppcls/configs/ImageNet/PPLCNetV2/PPLCNetV2_base.yaml -o Global.save_inference_dir=./PPLCNetV2_base_infer
 export2:null
 pretrained_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/PPLCNetV2_base_infer.tar
-infer_model:../inference/
+infer_model:./PPLCNetV2_base_infer/
 infer_export:True
 infer_quant:Fasle
 inference:python/predict_cls.py -c configs/inference_cls.yaml
 -o Global.use_gpu:True|False
--o Global.enable_mkldnn:True|False
--o Global.cpu_num_threads:1|6
--o Global.batch_size:1|16
--o Global.use_tensorrt:True|False
--o Global.use_fp16:True|False
+-o Global.enable_mkldnn:False
+-o Global.cpu_num_threads:1
+-o Global.batch_size:1
+-o Global.use_tensorrt:False
+-o Global.use_fp16:False
 -o Global.inference_model_dir:../inference
--o Global.infer_imgs:../dataset/ILSVRC2012/val
+-o Global.infer_imgs:../deploy/images/ImageNet/ILSVRC2012_val_00000010.jpeg
 -o Global.save_log_path:null
--o Global.benchmark:True
+-o Global.benchmark:False
 null:null
 ===========================infer_benchmark_params==========================
 random_infer_input:[{float32,[3,224,224]}]
test_tipc/config/ResNet/ResNet50_vd_train_ptq_infer_python.txt

@@ -34,7 +34,7 @@ distill_export:null
 kl_quant:deploy/slim/quant_post_static.py -c ppcls/configs/ImageNet/ResNet/ResNet50_vd.yaml -o Global.save_inference_dir=./ResNet50_vd_infer
 export2:null
 pretrained_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/ResNet50_vd_infer.tar
-infer_model:../inference/
+infer_model:./ResNet50_vd_infer/
 infer_export:True
 infer_quant:Fasle
 inference:python/predict_cls.py -c configs/inference_cls.yaml
@@ -45,9 +45,9 @@ inference:python/predict_cls.py -c configs/inference_cls.yaml
 -o Global.use_tensorrt:False
 -o Global.use_fp16:False
 -o Global.inference_model_dir:../inference
--o Global.infer_imgs:../dataset/ILSVRC2012/val
+-o Global.infer_imgs:../deploy/images/ImageNet/ILSVRC2012_val_00000010.jpeg
 -o Global.save_log_path:null
--o Global.benchmark:True
+-o Global.benchmark:False
 null:null
 null:null
 ===========================train_benchmark_params==========================
test_tipc/config/SwinTransformer/SwinTransformer_tiny_patch4_window7_224_train_ptq_infer_python.txt

@@ -34,7 +34,7 @@ distill_export:null
 kl_quant:deploy/slim/quant_post_static.py -c ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_tiny_patch4_window7_224.yaml -o Global.save_inference_dir=./SwinTransformer_tiny_patch4_window7_224_infer
 export2:null
 pretrained_model_url:https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/SwinTransformer_tiny_patch4_window7_224_infer.tar
-infer_model:../inference/
+infer_model:./SwinTransformer_tiny_patch4_window7_224_infer/
 infer_export:True
 infer_quant:Fasle
 inference:python/predict_cls.py -c configs/inference_cls.yaml
@@ -45,9 +45,9 @@ inference:python/predict_cls.py -c configs/inference_cls.yaml
 -o Global.use_tensorrt:False
 -o Global.use_fp16:False
 -o Global.inference_model_dir:../inference
--o Global.infer_imgs:../dataset/ILSVRC2012/val
+-o Global.infer_imgs:../deploy/images/ImageNet/ILSVRC2012_val_00000010.jpeg
 -o Global.save_log_path:null
--o Global.benchmark:True
+-o Global.benchmark:False
 null:null
 null:null
 ===========================train_benchmark_params==========================
test_tipc/prepare.sh

@@ -171,17 +171,32 @@ if [[ ${MODE} = "lite_train_lite_infer" ]] || [[ ${MODE} = "lite_train_whole_inf
     mv val.txt val_list.txt
     cp -r train/* val/
     cd ../../
-elif [[ ${MODE} = "whole_infer" ]] || [[ ${MODE} = "klquant_whole_infer" ]]; then
+elif [[ ${MODE} = "whole_infer" ]]; then
     # download data
-    cd dataset
-    rm -rf ILSVRC2012
-    wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/data/whole_chain/whole_chain_infer.tar
-    tar xf whole_chain_infer.tar
-    ln -s whole_chain_infer ILSVRC2012
-    cd ILSVRC2012
-    mv val.txt val_list.txt
-    ln -s val_list.txt train_list.txt
-    cd ../../
+    if [[ ${model_name} =~ "GeneralRecognition" ]]; then
+        cd dataset
+        rm -rf Aliproduct
+        rm -rf train_reg_all_data.txt
+        rm -rf demo_train
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/data/whole_chain/tipc_shitu_demo_data.tar --no-check-certificate
+        tar -xf tipc_shitu_demo_data.tar
+        ln -s tipc_shitu_demo_data Aliproduct
+        ln -s tipc_shitu_demo_data/demo_train.txt train_reg_all_data.txt
+        ln -s tipc_shitu_demo_data/demo_train demo_train
+        cd tipc_shitu_demo_data
+        ln -s demo_test.txt val_list.txt
+        cd ../../
+    else
+        cd dataset
+        rm -rf ILSVRC2012
+        wget -nc https://paddle-imagenet-models-name.bj.bcebos.com/data/whole_chain/whole_chain_infer.tar
+        tar xf whole_chain_infer.tar
+        ln -s whole_chain_infer ILSVRC2012
+        cd ILSVRC2012
+        mv val.txt val_list.txt
+        ln -s val_list.txt train_list.txt
+        cd ../../
+    fi
     # download inference or pretrained model
     eval "wget -nc $model_url_value"
     if [[ ${model_url_value} =~ ".tar" ]]; then
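The prepare.sh branch above downloads an archive once (`wget -nc`), extracts it, and then exposes the data under the fixed names the configs expect (`Aliproduct`, `val_list.txt`, ...) via symlinks, so re-runs are cheap and the archive name never leaks into the configs. A minimal local sketch of that pattern, with illustrative paths and no network access:

```shell
# Sketch of prepare.sh's symlink pattern: extract once, expose the data
# under the stable name the configs read. All paths here are illustrative.
set -e
workdir=$(mktemp -d)
cd "$workdir"

mkdir -p tipc_demo_data                 # stands in for the extracted tar
echo "img1.jpg 0" > tipc_demo_data/demo_test.txt

rm -rf Aliproduct                       # idempotent: safe to re-run
ln -s tipc_demo_data Aliproduct         # configs always read ./Aliproduct

# val_list.txt is just an alias for the demo annotation file
ln -s demo_test.txt tipc_demo_data/val_list.txt

cat Aliproduct/val_list.txt             # -> img1.jpg 0
```

Because only symlinks change, swapping in a different demo archive later requires no edits to the config files.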
test_tipc/test_train_inference_python.sh
@@ -110,9 +110,6 @@ function func_inference() {
     for use_gpu in ${use_gpu_list[*]}; do
         if [ ${use_gpu} = "False" ] || [ ${use_gpu} = "cpu" ]; then
             for use_mkldnn in ${use_mkldnn_list[*]}; do
-                if [ ${use_mkldnn} = "False" ] && [ ${_flag_quant} = "True" ]; then
-                    continue
-                fi
                 for threads in ${cpu_threads_list[*]}; do
                     for batch_size in ${batch_size_list[*]}; do
                         _save_log_path="${_log_path}/infer_cpu_usemkldnn_${use_mkldnn}_threads_${threads}_batchsize_${batch_size}.log"
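The nested loops in func_inference derive one log file name per parameter combination, with the sweep values coming from `|`-separated lists in the config. A self-contained sketch of that naming scheme, with illustrative values:

```shell
# Sketch of func_inference's CPU log naming: one log file per
# (use_mkldnn, threads, batch_size) combination. Illustrative values.
_log_path="./test_tipc/output"
use_mkldnn_list="True|False"
cpu_threads_list="1|6"
batch_size_list="1"

log_files=()
IFS="|"   # TIPC configs separate sweep values with '|'
for use_mkldnn in ${use_mkldnn_list}; do
    for threads in ${cpu_threads_list}; do
        for batch_size in ${batch_size_list}; do
            log_files+=("${_log_path}/infer_cpu_usemkldnn_${use_mkldnn}_threads_${threads}_batchsize_${batch_size}.log")
        done
    done
done
unset IFS
printf '%s\n' "${log_files[@]}"   # 2 x 2 x 1 = 4 log paths
```

Encoding the sweep parameters into the file name is what lets the benchmark tooling later group results without parsing log contents.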
@@ -136,9 +133,6 @@ function func_inference() {
                 if [ ${precision} = "True" ] && [ ${use_trt} = "False" ]; then
                     continue
                 fi
-                if [[ ${use_trt} = "False" || ${precision} =~ "int8" ]] && [ ${_flag_quant} = "True" ]; then
-                    continue
-                fi
                 for batch_size in ${batch_size_list[*]}; do
                     _save_log_path="${_log_path}/infer_gpu_usetrt_${use_trt}_precision_${precision}_batchsize_${batch_size}.log"
                     set_infer_data=$(func_set_params "${image_dir_key}" "${_img_dir}")
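The `set_infer_data=$(func_set_params ...)` call above turns a config key/value pair into a command-line fragment. A simplified reimplementation of that behavior (not the exact helper from test_tipc/common_func.sh): empty or `null` entries collapse to a blank so they do not pollute the predict command.

```shell
# Sketch (simplified) of the func_set_params behavior used above:
# join key and value as "key=value"; drop "null"/empty entries.
func_set_params() {
    local key=$1 value=$2
    if [ "$key" = "null" ] || [ "$value" = "null" ] || [ -z "$value" ]; then
        echo " "
    else
        echo "${key}=${value}"
    fi
}

func_set_params "-o Global.infer_imgs" "../deploy/images/ImageNet/ILSVRC2012_val_00000010.jpeg"
# -> -o Global.infer_imgs=../deploy/images/ImageNet/ILSVRC2012_val_00000010.jpeg
```

This is why the config files can encode `key:value` pairs line by line: the test driver reassembles them into `-o key=value` overrides at run time.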
@@ -161,35 +155,6 @@ function func_inference() {
     done
 }

-# if [[ ${MODE} = "whole_infer" ]] || [[ ${MODE} = "klquant_whole_infer" ]]; then
-#     IFS="|"
-#     infer_export_flag=(${infer_export_flag})
-#     if [ ${infer_export_flag} != "null" ] && [ ${infer_export_flag} != "False" ]; then
-#         rm -rf ${infer_model_dir_list/..\//}
-#         export_cmd="${python} ${norm_export} -o Global.pretrained_model=${model_name}_pretrained -o Global.save_inference_dir=${infer_model_dir_list/..\//}"
-#         eval $export_cmd
-#     fi
-# fi
-
-# if [[ ${MODE} = "whole_infer" ]]; then
-#     GPUID=$3
-#     if [ ${#GPUID} -le 0 ]; then
-#         env=" "
-#     else
-#         env="export CUDA_VISIBLE_DEVICES=${GPUID}"
-#     fi
-#     # set CUDA_VISIBLE_DEVICES
-#     eval $env
-#     export Count=0
-#     cd deploy
-#     for infer_model in ${infer_model_dir_list[*]}; do
-#         # run inference
-#         is_quant=${infer_quant_flag[Count]}
-#         echo "is_quant: ${is_quant}"
-#         func_inference "${python}" "${inference_py}" "${infer_model}" "../${LOG_PATH}" "${infer_img_dir}" ${is_quant}
-#         Count=$(($Count + 1))
-#     done
-#     cd ..

 if [[ ${MODE} = "whole_infer" ]]; then
     # for kl_quant
@@ -200,13 +165,13 @@ if [[ ${MODE} = "whole_infer" ]]; then
         eval $command
         last_status=${PIPESTATUS[0]}
         status_check $last_status "${command}" "${status_log}" "${model_name}"
-        # cd inference/quant_post_static_model
-        # ln -s __model__ inference.pdmodel
-        # ln -s __params__ inference.pdiparams
-        # cd ../../deploy
-        # is_quant=True
-        # func_inference "${python}" "${inference_py}" "${infer_model_dir_list}/quant_post_static_model" "../${LOG_PATH}" "${infer_img_dir}" ${is_quant}
-        # cd ..
+        cd ${infer_model_dir_list}/quant_post_static_model
+        ln -s __model__ inference.pdmodel
+        ln -s __params__ inference.pdiparams
+        cd ../../deploy
+        is_quant=True
+        func_inference "${python}" "${inference_py}" "../${infer_model_dir_list}/quant_post_static_model" "../${LOG_PATH}" "${infer_img_dir}" ${is_quant}
+        cd ..
     fi
 else
     IFS="|"
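The kl_quant path above records `last_status=${PIPESTATUS[0]}` immediately after `eval $command`. A minimal illustration of why (assuming a bash shell): in a pipeline, `$?` reports only the last stage's exit status, while `PIPESTATUS[0]` preserves the first stage's.

```shell
# Sketch: why the script reads ${PIPESTATUS[0]} rather than $?.
# In `cmd | tee log`, $? is tee's status; PIPESTATUS[0] is cmd's.
false | tee /dev/null
last_status=${PIPESTATUS[0]}   # exit status of `false`, i.e. 1
echo "first stage exited with ${last_status}"
```

Note that `PIPESTATUS` is reset by every subsequent command, so it must be captured on the very next line, exactly as the script does before calling `status_check`.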