PaddlePaddle / PaddleOCR
Commit 300c679b
Authored Jun 01, 2022 by andyjpaddle

update_tipc_cpp_infer

Parent: d3837d68
Showing 8 changed files with 14 additions and 14 deletions (+14 -14)
test_tipc/configs/ch_PP-OCRv2/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt            +1 -1
test_tipc/configs/ch_PP-OCRv2_rec/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt        +1 -1
test_tipc/configs/ch_ppocr_mobile_v2.0/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt   +1 -1
test_tipc/configs/ch_ppocr_mobile_v2.0_rec/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt   +1 -1
test_tipc/configs/ch_ppocr_server_v2.0/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt   +1 -1
test_tipc/configs/ch_ppocr_server_v2.0_rec/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt   +1 -1
test_tipc/prepare.sh                 +4 -6
test_tipc/test_inference_cpp.sh      +4 -2
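The six config files above all live under test_tipc and parameterize the C++ inference check this commit touches. A minimal run sketch, assuming the usual two-step test_tipc flow (prepare, then test); the mode string "cpp_infer" and the argument order follow test_tipc conventions and are not part of this diff:

    # Hedged sketch of a test_tipc cpp_infer run against one of the configs touched here.
    CONFIG=test_tipc/configs/ch_PP-OCRv2/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt

    # Download the inference models / test data and other prerequisites for this mode.
    bash test_tipc/prepare.sh ${CONFIG} cpp_infer

    # Build and run the C++ demo; the optional second argument is the GPU id
    # (it feeds the GPUID=$2 line visible in the test_inference_cpp.sh diff below).
    bash test_tipc/test_inference_cpp.sh ${CONFIG} 0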
test_tipc/configs/ch_PP-OCRv2/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt

@@ -3,7 +3,7 @@ model_name:ch_PP-OCRv2
 use_opencv:True
 infer_model:./inference/ch_PP-OCRv2_det_infer/
 infer_quant:False
-inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
+inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt --rec_img_h=32
 --use_gpu:True|False
 --enable_mkldnn:False
 --cpu_threads:6
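The only change in each of these configs is appending --rec_img_h=32 to the inference command: the v2-era recognizers expect 32-pixel-high inputs (PP-OCRv3 moved to 48), so the C++ ppocr binary is now told the recognition height explicitly. Roughly, the test runner takes the part after "inference:" as the base command and turns the "--key:value|value" lines below it into extra flags. The expansion below is purely illustrative; the model dir, image dir, and CPU settings are example choices, not values fixed by this config:

    # Illustrative expansion of the config line above into one concrete command.
    ./deploy/cpp_infer/build/ppocr \
        --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt \
        --rec_img_h=32 \
        --det_model_dir=./inference/ch_PP-OCRv2_det_infer/ \
        --use_gpu=False \
        --enable_mkldnn=False \
        --cpu_threads=6 \
        --image_dir=./doc/imgs/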
test_tipc/configs/ch_PP-OCRv2_rec/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt

@@ -3,7 +3,7 @@ model_name:ch_PP-OCRv2_rec
 use_opencv:True
 infer_model:./inference/ch_PP-OCRv2_rec_infer/
 infer_quant:False
-inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
+inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt --rec_img_h=32
 --use_gpu:True|False
 --enable_mkldnn:False
 --cpu_threads:6
test_tipc/configs/ch_ppocr_mobile_v2.0/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt

@@ -3,7 +3,7 @@ model_name:ch_ppocr_mobile_v2.0
 use_opencv:True
 infer_model:./inference/ch_ppocr_mobile_v2.0_det_infer/
 infer_quant:False
-inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
+inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt --rec_img_h=32
 --use_gpu:True|False
 --enable_mkldnn:False
 --cpu_threads:6
test_tipc/configs/ch_ppocr_mobile_v2.0_rec/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt

@@ -3,7 +3,7 @@ model_name:ch_ppocr_mobile_v2.0_rec
 use_opencv:True
 infer_model:./inference/ch_ppocr_mobile_v2.0_rec_infer/
 infer_quant:False
-inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
+inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt --rec_img_h=32
 --use_gpu:True|False
 --enable_mkldnn:False
 --cpu_threads:6
test_tipc/configs/ch_ppocr_server_v2.0/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt

@@ -3,7 +3,7 @@ model_name:ch_ppocr_server_v2.0
 use_opencv:True
 infer_model:./inference/ch_ppocr_server_v2.0_det_infer/
 infer_quant:False
-inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
+inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt --rec_img_h=32
 --use_gpu:True|False
 --enable_mkldnn:False
 --cpu_threads:6
test_tipc/configs/ch_ppocr_server_v2.0_rec/model_linux_gpu_normal_normal_infer_cpp_linux_gpu_cpu.txt

@@ -3,7 +3,7 @@ model_name:ch_ppocr_server_v2.0_rec
 use_opencv:True
 infer_model:./inference/ch_ppocr_server_v2.0_rec_infer/
 infer_quant:False
-inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt
+inference:./deploy/cpp_infer/build/ppocr --rec_char_dict_path=./ppocr/utils/ppocr_keys_v1.txt --rec_img_h=32
 --use_gpu:True|False
 --enable_mkldnn:False
 --cpu_threads:6
test_tipc/prepare.sh

@@ -328,7 +328,6 @@ if [ ${MODE} = "klquant_whole_infer" ]; then
         cd ./inference && tar xf rec_inference.tar && tar xf ch_PP-OCRv2_rec_infer.tar && cd ../
     fi
     if [ ${model_name} = "ch_PP-OCRv3_rec_KL" ]; then
-        # TODO check model link
         wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_rec_infer.tar --no-check-certificate
         wget -nc -P ./inference/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/rec_inference.tar --no-check-certificate
         wget -nc -P ./train_data/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ic15_data.tar --no-check-certificate
@@ -341,7 +340,6 @@ if [ ${MODE} = "klquant_whole_infer" ]; then
         cd ./inference && tar xf ch_PP-OCRv2_det_infer.tar && tar xf ch_det_data_50.tar && cd ../
     fi
     if [ ${model_name} = "ch_PP-OCRv3_det_KL" ]; then
-        # TODO check model link
         wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/test/ch_det_data_50.tar --no-check-certificate
         wget -nc -P ./inference https://paddleocr.bj.bcebos.com/PP-OCRv3/chinese/ch_PP-OCRv3_det_infer.tar --no-check-certificate
         cd ./inference && tar xf ch_PP-OCRv3_det_infer.tar && tar xf ch_det_data_50.tar && cd ../
@@ -417,9 +415,9 @@ if [ ${MODE} = "serving_infer" ];then
     IFS='|'
     array=(${python_name_list})
     python_name=${array[0]}
-    # ${python_name} -m pip install paddle-serving-server-gpu==0.8.3.post101
-    # ${python_name} -m pip install paddle_serving_client==0.8.3
-    # ${python_name} -m pip install paddle-serving-app==0.8.3
+    ${python_name} -m pip install paddle-serving-server-gpu==0.8.3.post101
+    ${python_name} -m pip install paddle_serving_client==0.8.3
+    ${python_name} -m pip install paddle-serving-app==0.8.3
     # wget model
     if [[ ${model_name} =~ "ch_ppocr_mobile_v2.0" ]]; then
         wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar --no-check-certificate
@@ -447,7 +445,7 @@ if [ ${MODE} = "paddle2onnx_infer" ];then
     # prepare serving env
     python_name=$(func_parser_value "${lines[2]}")
     ${python_name} -m pip install paddle2onnx
-    ${python_name} -m pip install onnxruntime==1.4.0
+    ${python_name} -m pip install onnxruntime
     # wget model
     if [[ ${model_name} =~ "ch_ppocr_mobile_v2.0" ]]; then
         wget -nc -P ./inference https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_det_infer.tar --no-check-certificate
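In prepare.sh the stale "# TODO check model link" markers are dropped, the paddle-serving pip installs for serving_infer are active again, and onnxruntime for paddle2onnx_infer is no longer pinned to 1.4.0. A quick, hedged way to confirm what those install lines leave behind (not part of the commit; substitute whatever interpreter ${python_name} resolves to):

    # Check the serving / onnx packages installed by the lines re-enabled above.
    python3 -m pip show paddle-serving-server-gpu paddle-serving-client paddle-serving-app
    python3 -m pip show paddle2onnx onnxruntime   # onnxruntime now floats to the latest release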
test_tipc/test_inference_cpp.sh

@@ -189,6 +189,9 @@ else
         wget -nc $PADDLEInfer --no-check-certificate
     fi
     tar zxf paddle_inference.tgz
+    if [ ! -d "paddle_inference" ]; then
+        ln -s paddle_inference_install_dir paddle_inference
+    fi
     echo "################### download paddle inference finished ###################"
 fi
 LIB_DIR=$(pwd)/paddle_inference/
@@ -218,11 +221,10 @@ echo "################### build PaddleOCR demo finished ###################"
 # set cuda device
 GPUID=$2
 if [ ${#GPUID} -le 0 ];then
-    env=" "
+    env="export CUDA_VISIBLE_DEVICES=0"
 else
     env="export CUDA_VISIBLE_DEVICES=${GPUID}"
 fi
-set CUDA_VISIBLE_DEVICES
 eval $env
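In test_inference_cpp.sh the new symlink covers the case where the Paddle Inference tarball unpacks as paddle_inference_install_dir rather than paddle_inference, so LIB_DIR stays valid either way, and the no-op "set CUDA_VISIBLE_DEVICES" is replaced by always exporting a device (GPU 0 when no id is passed). For context, a rough sketch of how LIB_DIR is typically consumed when building the C++ demo; the cmake option names follow deploy/cpp_infer build conventions and should be read as assumptions, not as part of this diff:

    # Hedged sketch: build deploy/cpp_infer against the library dir prepared above.
    # WITH_GPU can be switched to OFF for a CPU-only build.
    LIB_DIR=$(pwd)/paddle_inference/
    cd deploy/cpp_infer
    mkdir -p build && cd build
    cmake .. \
        -DPADDLE_LIB=${LIB_DIR} \
        -DWITH_MKL=ON \
        -DWITH_GPU=ON
    make -j$(nproc)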