Commit 7a178866
change cfg_dir to cfg_file

Authored on Aug 28, 2020 by syyxsxx
Parent: d1866a90

Showing 18 changed files with 95 additions and 117 deletions.
- deploy/openvino/demo/classifier.cpp (+4, -4)
- deploy/openvino/demo/detector.cpp (+4, -4)
- deploy/openvino/demo/segmenter.cpp (+4, -4)
- deploy/openvino/include/paddlex/paddlex.h (+3, -3)
- deploy/openvino/python/demo.py (+11, -20)
- deploy/openvino/scripts/install_third-party.sh (+1, -1)
- deploy/openvino/src/paddlex.cpp (+4, -4)
- deploy/raspberry/demo/classifier.cpp (+4, -4)
- deploy/raspberry/demo/detector.cpp (+4, -4)
- deploy/raspberry/demo/segmenter.cpp (+4, -4)
- deploy/raspberry/include/paddlex/paddlex.h (+3, -3)
- deploy/raspberry/python/demo.py (+10, -21)
- deploy/raspberry/src/paddlex.cpp (+4, -4)
- docs/deploy/openvino/linux.md (+11, -11)
- docs/deploy/openvino/python.md (+4, -6)
- docs/deploy/openvino/windows.md (+5, -5)
- docs/deploy/raspberry/Raspberry.md (+10, -10)
- docs/deploy/raspberry/python.md (+5, -5)
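The file list shows the change is a mechanical rename of `cfg_dir` to `cfg_file` across gflags definitions, `FLAGS_cfg_dir` symbols, argparse options, and docs. As an illustration only (the helper name and regex below are mine, not part of the commit), such a rename can be sketched as a single substitution:

```python
import re

def rename_cfg_flag(source: str) -> str:
    """Hypothetical helper: rename the cfg_dir flag to cfg_file.

    The trailing word boundary keeps unrelated identifiers intact while
    still matching prefixed symbols such as FLAGS_cfg_dir and --cfg_dir.
    """
    return re.sub(r"cfg_dir\b", "cfg_file", source)

old = 'DEFINE_string(cfg_dir, "", "Path of PaddelX model yml file");'
print(rename_cfg_flag(old))
# -> DEFINE_string(cfg_file, "", "Path of PaddelX model yml file");
```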
deploy/openvino/demo/classifier.cpp

```diff
@@ -22,7 +22,7 @@
 #include "include/paddlex/paddlex.h"

 DEFINE_string(model_dir, "", "Path of inference model");
-DEFINE_string(cfg_dir, "", "Path of PaddelX model yml file");
+DEFINE_string(cfg_file, "", "Path of PaddelX model yml file");
 DEFINE_string(device, "CPU", "Device name");
 DEFINE_string(image, "", "Path of test image file");
 DEFINE_string(image_list, "", "Path of test image list file");
@@ -35,8 +35,8 @@ int main(int argc, char** argv) {
     std::cerr << "--model_dir need to be defined" << std::endl;
     return -1;
   }
-  if (FLAGS_cfg_dir == "") {
-    std::cerr << "--cfg_dir need to be defined" << std::endl;
+  if (FLAGS_cfg_file == "") {
+    std::cerr << "--cfg_file need to be defined" << std::endl;
     return -1;
   }
   if (FLAGS_image == "" & FLAGS_image_list == "") {
@@ -46,7 +46,7 @@ int main(int argc, char** argv) {
   // load the model
   PaddleX::Model model;
-  model.Init(FLAGS_model_dir, FLAGS_cfg_dir, FLAGS_device);
+  model.Init(FLAGS_model_dir, FLAGS_cfg_file, FLAGS_device);
   // run prediction
   if (FLAGS_image_list != "") {
```
deploy/openvino/demo/detector.cpp

```diff
@@ -29,7 +29,7 @@
 using namespace std::chrono;  // NOLINT

 DEFINE_string(model_dir, "", "Path of openvino model xml file");
-DEFINE_string(cfg_dir, "", "Path of PaddleX model yaml file");
+DEFINE_string(cfg_file, "", "Path of PaddleX model yaml file");
 DEFINE_string(image, "", "Path of test image file");
 DEFINE_string(image_list, "", "Path of test image list file");
 DEFINE_string(device, "CPU", "Device name");
@@ -45,8 +45,8 @@ int main(int argc, char** argv) {
     std::cerr << "--model_dir need to be defined" << std::endl;
     return -1;
   }
-  if (FLAGS_cfg_dir == "") {
-    std::cerr << "--cfg_dir need to be defined" << std::endl;
+  if (FLAGS_cfg_file == "") {
+    std::cerr << "--cfg_file need to be defined" << std::endl;
     return -1;
   }
   if (FLAGS_image == "" & FLAGS_image_list == "") {
@@ -56,7 +56,7 @@ int main(int argc, char** argv) {
   //
   PaddleX::Model model;
-  model.Init(FLAGS_model_dir, FLAGS_cfg_dir, FLAGS_device);
+  model.Init(FLAGS_model_dir, FLAGS_cfg_file, FLAGS_device);
   int imgs = 1;
   auto colormap = PaddleX::GenerateColorMap(model.labels.size());
```
deploy/openvino/demo/segmenter.cpp

```diff
@@ -25,7 +25,7 @@
 DEFINE_string(model_dir, "", "Path of openvino model xml file");
-DEFINE_string(cfg_dir, "", "Path of PaddleX model yaml file");
+DEFINE_string(cfg_file, "", "Path of PaddleX model yaml file");
 DEFINE_string(image, "", "Path of test image file");
 DEFINE_string(image_list, "", "Path of test image list file");
 DEFINE_string(device, "CPU", "Device name");
@@ -39,8 +39,8 @@ int main(int argc, char** argv) {
     std::cerr << "--model_dir need to be defined" << std::endl;
     return -1;
   }
-  if (FLAGS_cfg_dir == "") {
-    std::cerr << "--cfg_dir need to be defined" << std::endl;
+  if (FLAGS_cfg_file == "") {
+    std::cerr << "--cfg_file need to be defined" << std::endl;
     return -1;
   }
   if (FLAGS_image == "" & FLAGS_image_list == "") {
@@ -51,7 +51,7 @@ int main(int argc, char** argv) {
   //
   std::cout << "init start" << std::endl;
   PaddleX::Model model;
-  model.Init(FLAGS_model_dir, FLAGS_cfg_dir, FLAGS_device);
+  model.Init(FLAGS_model_dir, FLAGS_cfg_file, FLAGS_device);
   std::cout << "init done" << std::endl;
   int imgs = 1;
   auto colormap = PaddleX::GenerateColorMap(model.labels.size());
```
deploy/openvino/include/paddlex/paddlex.h

```diff
@@ -39,13 +39,13 @@ namespace PaddleX {
 class Model {
  public:
   void Init(const std::string& model_dir,
-            const std::string& cfg_dir,
+            const std::string& cfg_file,
             std::string device) {
-    create_predictor(model_dir, cfg_dir, device);
+    create_predictor(model_dir, cfg_file, device);
   }

   void create_predictor(const std::string& model_dir,
-                        const std::string& cfg_dir,
+                        const std::string& cfg_file,
                         std::string device);

   bool load_config(const std::string& model_dir);
```
deploy/openvino/python/demo.py

```diff
@@ -32,30 +32,20 @@ def arg_parser():
         type=str,
         default='CPU',
         help="Specify the target device to infer on:[CPU, GPU, FPGA, HDDL, MYRIAD,HETERO]"
         "Default value is CPU")
     parser.add_argument(
-        "--img",
-        "-i",
-        type=str,
-        default=None,
-        help="path to an image files")
+        "--img", "-i", type=str, default=None, help="path to an image files")
     parser.add_argument(
-        "--img_list",
-        "-l",
-        type=str,
-        default=None,
-        help="Path to a imglist")
+        "--img_list", "-l", type=str, default=None, help="Path to a imglist")
     parser.add_argument(
-        "--cfg_dir",
+        "--cfg_file",
         "-c",
         type=str,
         default=None,
         help="Path to PaddelX model yml file")
     return parser
@@ -63,16 +53,16 @@ def main():
     parser = arg_parser()
     args = parser.parse_args()
     model_xml = args.model_dir
-    model_yaml = args.cfg_dir
+    model_yaml = args.cfg_file
     #model init
-    if("CPU" not in args.device):
+    if ("CPU" not in args.device):
         predictor = deploy.Predictor(model_xml, model_yaml, args.device)
     else:
         predictor = deploy.Predictor(model_xml, model_yaml)
     #predict
-    if(args.img_list != None):
+    if (args.img_list != None):
         f = open(args.img_list)
         lines = f.readlines()
         for im_path in lines:
@@ -83,5 +73,6 @@ def main():
         im_path = args.img
     predictor.predict(im_path)
+
 if __name__ == "__main__":
     main()
```
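After this change the Python demos take the config path as `--cfg_file` (shorthand `-c`, as in the source). A reduced, self-contained sketch of the renamed option; only two of the script's flags are reproduced here, everything else in demo.py is omitted:

```python
import argparse

def arg_parser():
    # Reduced sketch of the demo's parser: just the options the commit touches.
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--model_dir", type=str, default=None,
        help="path to openvino model .xml file")
    parser.add_argument(
        "--cfg_file", "-c", type=str, default=None,
        help="Path to PaddelX model yml file")
    return parser

args = arg_parser().parse_args(
    ["--model_dir", "model.xml", "--cfg_file", "model.yml"])
print(args.cfg_file)
# -> model.yml
```

The `-c` shorthand keeps old command lines working as long as they used the short form; anything passing `--cfg_dir` explicitly must be updated.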
deploy/openvino/scripts/install_third-party.sh

```diff
@@ -24,7 +24,7 @@ if [ ! -d "./deps/glog" ]; then
 fi

 if [ "$ARCH" = "x86" ]; then
-    OPENCV_URL=https://bj.bcebos.com/paddlex/deploy/x86opencv/opencv.tar.bz2
+    OPENCV_URL=https://bj.bcebos.com/paddlex/deploy/x86opencv/opencv.tar.bz2
 else
     OPENCV_URL=https://bj.bcebos.com/paddlex/deploy/armopencv/opencv.tar.bz2
 fi
```
deploy/openvino/src/paddlex.cpp

```diff
@@ -20,7 +20,7 @@
 namespace PaddleX {

 void Model::create_predictor(const std::string& model_dir,
-                             const std::string& cfg_dir,
+                             const std::string& cfg_file,
                              std::string device) {
   InferenceEngine::Core ie;
   network_ = ie.ReadNetwork(
@@ -49,11 +49,11 @@ void Model::create_predictor(const std::string& model_dir,
   } else {
     executable_network_ = ie.LoadNetwork(network_, device);
   }
-  load_config(cfg_dir);
+  load_config(cfg_file);
 }

-bool Model::load_config(const std::string& cfg_dir) {
-  YAML::Node config = YAML::LoadFile(cfg_dir);
+bool Model::load_config(const std::string& cfg_file) {
+  YAML::Node config = YAML::LoadFile(cfg_file);
   type = config["_Attributes"]["model_type"].as<std::string>();
   name = config["Model"].as<std::string>();
   bool to_rgb = true;
```
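`load_config()` reads two keys from the PaddleX model yml: `_Attributes.model_type` and `Model`. A Python sketch of the same lookups, over a plain dict standing in for what the YAML parser returns (the sample values are hypothetical):

```python
# Stand-in for the parsed model yml; the keys mirror the lookups in
# load_config(), the values are made up for illustration.
config = {
    "_Attributes": {"model_type": "classifier"},
    "Model": "ResNet50",
}

model_type = config["_Attributes"]["model_type"]
name = config["Model"]
print(model_type, name)
# -> classifier ResNet50
```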
deploy/raspberry/demo/classifier.cpp

```diff
@@ -22,7 +22,7 @@
 #include "include/paddlex/paddlex.h"

 DEFINE_string(model_dir, "", "Path of inference model");
-DEFINE_string(cfg_dir, "", "Path of PaddelX model yml file");
+DEFINE_string(cfg_file, "", "Path of PaddelX model yml file");
 DEFINE_string(image, "", "Path of test image file");
 DEFINE_string(image_list, "", "Path of test image list file");
 DEFINE_int32(thread_num, 1, "num of thread to infer");
@@ -35,8 +35,8 @@ int main(int argc, char** argv) {
     std::cerr << "--model_dir need to be defined" << std::endl;
     return -1;
   }
-  if (FLAGS_cfg_dir == "") {
-    std::cerr << "--cfg_dir need to be defined" << std::endl;
+  if (FLAGS_cfg_file == "") {
+    std::cerr << "--cfg_flie need to be defined" << std::endl;
     return -1;
   }
   if (FLAGS_image == "" & FLAGS_image_list == "") {
@@ -46,7 +46,7 @@ int main(int argc, char** argv) {
   // load the model
   PaddleX::Model model;
-  model.Init(FLAGS_model_dir, FLAGS_cfg_dir, FLAGS_thread_num);
+  model.Init(FLAGS_model_dir, FLAGS_cfg_file, FLAGS_thread_num);
   std::cout << "init is done" << std::endl;
   // run prediction
   if (FLAGS_image_list != "") {
```
deploy/raspberry/demo/detector.cpp

```diff
@@ -29,7 +29,7 @@
 using namespace std::chrono;  // NOLINT

 DEFINE_string(model_dir, "", "Path of openvino model xml file");
-DEFINE_string(cfg_dir, "", "Path of PaddleX model yaml file");
+DEFINE_string(cfg_file, "", "Path of PaddleX model yaml file");
 DEFINE_string(image, "", "Path of test image file");
 DEFINE_string(image_list, "", "Path of test image list file");
 DEFINE_int32(thread_num, 1, "num of thread to infer");
@@ -45,8 +45,8 @@ int main(int argc, char** argv) {
     std::cerr << "--model_dir need to be defined" << std::endl;
     return -1;
   }
-  if (FLAGS_cfg_dir == "") {
-    std::cerr << "--cfg_dir need to be defined" << std::endl;
+  if (FLAGS_cfg_file == "") {
+    std::cerr << "--cfg_file need to be defined" << std::endl;
     return -1;
   }
   if (FLAGS_image == "" & FLAGS_image_list == "") {
@@ -56,7 +56,7 @@ int main(int argc, char** argv) {
   //
   PaddleX::Model model;
-  model.Init(FLAGS_model_dir, FLAGS_cfg_dir, FLAGS_thread_num);
+  model.Init(FLAGS_model_dir, FLAGS_cfg_file, FLAGS_thread_num);
   int imgs = 1;
   auto colormap = PaddleX::GenerateColorMap(model.labels.size());
```
deploy/raspberry/demo/segmenter.cpp

```diff
@@ -25,7 +25,7 @@
 DEFINE_string(model_dir, "", "Path of openvino model xml file");
-DEFINE_string(cfg_dir, "", "Path of PaddleX model yaml file");
+DEFINE_string(cfg_file, "", "Path of PaddleX model yaml file");
 DEFINE_string(image, "", "Path of test image file");
 DEFINE_string(image_list, "", "Path of test image list file");
 DEFINE_string(save_dir, "", "Path to save visualized image");
@@ -38,8 +38,8 @@ int main(int argc, char** argv) {
     std::cerr << "--model_dir need to be defined" << std::endl;
     return -1;
   }
-  if (FLAGS_cfg_dir == "") {
-    std::cerr << "--cfg_dir need to be defined" << std::endl;
+  if (FLAGS_cfg_file == "") {
+    std::cerr << "--cfg_file need to be defined" << std::endl;
     return -1;
   }
   if (FLAGS_image == "" & FLAGS_image_list == "") {
@@ -50,7 +50,7 @@ int main(int argc, char** argv) {
   //
   std::cout << "init start" << std::endl;
   PaddleX::Model model;
-  model.Init(FLAGS_model_dir, FLAGS_cfg_dir, FLAGS_thread_num);
+  model.Init(FLAGS_model_dir, FLAGS_cfg_file, FLAGS_thread_num);
   std::cout << "init done" << std::endl;
   int imgs = 1;
   auto colormap = PaddleX::GenerateColorMap(model.labels.size());
```
deploy/raspberry/include/paddlex/paddlex.h

```diff
@@ -49,13 +49,13 @@ namespace PaddleX {
 class Model {
  public:
   void Init(const std::string& model_dir,
-            const std::string& cfg_dir,
+            const std::string& cfg_file,
             int thread_num) {
-    create_predictor(model_dir, cfg_dir, thread_num);
+    create_predictor(model_dir, cfg_file, thread_num);
   }

   void create_predictor(const std::string& model_dir,
-                        const std::string& cfg_dir,
+                        const std::string& cfg_file,
                         int thread_num);

   bool load_config(const std::string& model_dir);
```
deploy/raspberry/python/demo.py

```diff
@@ -27,28 +27,18 @@ def arg_parser():
         default=None,
         help="path to openvino model .xml file")
     parser.add_argument(
-        "--img",
-        "-i",
-        type=str,
-        default=None,
-        help="path to an image files")
+        "--img", "-i", type=str, default=None, help="path to an image files")
     parser.add_argument(
-        "--img_list",
-        "-l",
-        type=str,
-        default=None,
-        help="Path to a imglist")
+        "--img_list", "-l", type=str, default=None, help="Path to a imglist")
     parser.add_argument(
-        "--cfg_dir",
+        "--cfg_file",
         "-c",
         type=str,
         default=None,
         help="Path to PaddelX model yml file")
     parser.add_argument(
         "--thread_num",
         "-t",
@@ -63,8 +53,6 @@ def arg_parser():
         default=None,
         help=" image input shape of model [NCHW] like [1,3,224,244] ")
     return parser
@@ -72,16 +60,16 @@ def main():
     parser = arg_parser()
     args = parser.parse_args()
     model_nb = args.model_dir
-    model_yaml = args.cfg_dir
+    model_yaml = args.cfg_file
     thread_num = args.thread_num
     input_shape = args.input_shape
     input_shape = input_shape[1:-1].split(",", 3)
     shape = list(map(int, input_shape))
     #model init
     predictor = deploy.Predictor(model_nb, model_yaml, thread_num, shape)
     #predict
-    if(args.img_list != None):
+    if (args.img_list != None):
         f = open(args.img_list)
         lines = f.readlines()
         for im_path in lines:
@@ -92,5 +80,6 @@ def main():
         im_path = args.img
     predictor.predict(im_path)
+
 if __name__ == "__main__":
     main()
```
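The Raspberry demo's `--input_shape` handling, unchanged by this commit, strips the brackets from the NCHW string and splits it before converting to ints. The same steps in isolation:

```python
# Same parsing steps as in deploy/raspberry/python/demo.py.
input_shape = "[1,3,224,224]"                  # as passed on the command line
input_shape = input_shape[1:-1].split(",", 3)  # strip brackets, split N,C,H,W
shape = list(map(int, input_shape))
print(shape)
# -> [1, 3, 224, 224]
```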
deploy/raspberry/src/paddlex.cpp

```diff
@@ -20,19 +20,19 @@
 namespace PaddleX {

 void Model::create_predictor(const std::string& model_dir,
-                             const std::string& cfg_dir,
+                             const std::string& cfg_file,
                              int thread_num) {
   paddle::lite_api::MobileConfig config;
   config.set_model_from_file(model_dir);
   config.set_threads(thread_num);
-  load_config(cfg_dir);
+  load_config(cfg_file);
   predictor_ =
       paddle::lite_api::CreatePaddlePredictor<paddle::lite_api::MobileConfig>(
           config);
 }

-bool Model::load_config(const std::string& cfg_dir) {
-  YAML::Node config = YAML::LoadFile(cfg_dir);
+bool Model::load_config(const std::string& cfg_file) {
+  YAML::Node config = YAML::LoadFile(cfg_file);
   type = config["_Attributes"]["model_type"].as<std::string>();
   name = config["Model"].as<std::string>();
   bool to_rgb = true;
```
docs/deploy/openvino/linux.md

````diff
@@ -7,9 +7,9 @@
 * GCC* 5.4.0
 * CMake 3.0+
 * PaddleX 1.0+
 * OpenVINO 2020.4
 * Hardware platform: CPU, VPU

 **Note**: for PaddleX installation see [PaddleX](https://paddlex.readthedocs.io/zh_CN/develop/install.html); for OpenVINO, follow [OpenVINO-Linux](https://docs.openvinotoolkit.org/latest/_docs_install_guides_installing_openvino_linux.html) or [OpenVINO-Raspbian](https://docs.openvinotoolkit.org/latest/openvino_docs_install_guides_installing_openvino_raspbian.html) depending on your system.
 Make sure the software above is installed and the environment configured. **All examples below use the working directory `/root/projects/`.**
@@ -35,11 +35,11 @@ git clone https://github.com/PaddlePaddle/PaddleX.git
 - glog: see the [build docs](https://github.com/google/glog)
 - opencv: see the [build docs](https://docs.opencv.org/master/d7/d9f/tutorial_linux_install.html)

 ### Step3: Build
 The `cmake` build command lives in `scripts/build.sh`; when building on a Raspberry Pi (Raspbian OS) change ARCH from x86 to armv7, and if you built the third-party dependencies yourself adjust the main parameters per Step1:
 ```
@@ -71,7 +71,7 @@ ARCH=x86
 | --image | path of the image to predict |
 | --image_list | .txt file listing image paths line by line |
 | --device | target platform, one of {"CPU","MYRIAD"}; default "CPU", use "MYRIAD" on VPU |
-| --cfg_dir | the PaddleX model .yml config file |
+| --cfg_file | the PaddleX model .yml config file |
 | --save_dir | where to save visualized results (detection only); default " " saves nothing |
@@ -80,7 +80,7 @@
 Test image `/path/to/test_img.jpeg`
 ```shell
-./build/classifier --model_dir=/path/to/openvino_model --image=/path/to/test_img.jpeg --cfg_dir=/path/to/PadlleX_model.yml
+./build/classifier --model_dir=/path/to/openvino_model --image=/path/to/test_img.jpeg --cfg_file=/path/to/PadlleX_model.yml
 ```
@@ -95,7 +95,7 @@
 ```
 ```shell
-./build/detector --model_dir=/path/to/models/openvino_model --image_list=/root/projects/images_list.txt --cfg_dir=/path/to/PadlleX_model.yml --save_dir ./output
+./build/detector --model_dir=/path/to/models/openvino_model --image_list=/root/projects/images_list.txt --cfg_file=/path/to/PadlleX_model.yml --save_dir ./output
 ```
 `Example 3`:
@@ -103,7 +103,7 @@
 Test image `/path/to/test_img.jpeg`
 ```shell
-./build/classifier --model_dir=/path/to/openvino_model --image=/path/to/test_img.jpeg --cfg_dir=/path/to/PadlleX_model.yml --device=MYRIAD
+./build/classifier --model_dir=/path/to/openvino_model --image=/path/to/test_img.jpeg --cfg_file=/path/to/PadlleX_model.yml --device=MYRIAD
 ```

 ## Performance tests
@@ -118,7 +118,7 @@
 |---|---|---|---|
 |resnet-50 | 20.56 | 16.12 | 224*224 |
 |mobilenet-V2 | 5.16 | 2.31 |224*224|
 |yolov3-mobilnetv1 |76.63| 46.26|608*608 |

 `Test 2`: a VPU neural compute stick (NCS2) plugged into a PC, accelerated through OpenVINO.
@@ -130,7 +130,7 @@
 |Model|OpenVINO|Input|
 |---|---|---|
 |mobilenetV2|24.00|224*224|
 |resnet50_vd_ssld|58.53|224*224|

 `Test 3`: a VPU neural compute stick (NCS2) plugged into a Raspberry Pi 3B, accelerated through OpenVINO.
````
docs/deploy/openvino/python.md

````diff
@@ -19,8 +19,8 @@
 | --img | path of the image to predict |
 | --image_list | .txt file listing image paths line by line |
 | --device | target platform, default "CPU" |
-| --cfg_dir | the PaddleX model .yml config file |
+| --cfg_file | the PaddleX model .yml config file |

 ### Examples
 `Example 1`: test image `/path/to/test_img.jpeg`
@@ -28,7 +28,7 @@
 ```
 cd /root/projects/python
-python demo.py --model_dir /path/to/openvino_model --img /path/to/test_img.jpeg --cfg_dir /path/to/PadlleX_model.yml
+python demo.py --model_dir /path/to/openvino_model --img /path/to/test_img.jpeg --cfg_file /path/to/PadlleX_model.yml
 ```
 `Example 2`:
@@ -45,7 +45,5 @@
 ```
 cd /root/projects/python
-python demo.py --model_dir /path/to/models/openvino_model --image_list /root/projects/images_list.txt --cfg_dir=/path/to/PadlleX_model.yml
+python demo.py --model_dir /path/to/models/openvino_model --image_list /root/projects/images_list.txt --cfg_file=/path/to/PadlleX_model.yml
 ```
````
docs/deploy/openvino/windows.md

````diff
@@ -81,7 +81,7 @@ cd D:\projects\PaddleX\deploy\openvino\out\build\x64-Release
 | --image | path of the image to predict |
 | --image_list | .txt file listing image paths line by line |
 | --device | target platform, one of {"CPU","MYRIAD"}; default "CPU", use "MYRIAD" on VPU |
-| --cfg_dir | the PaddleX model .yml config file |
+| --cfg_file | the PaddleX model .yml config file |
 | --save_dir | where to save visualized results (detection only); default " " saves nothing |
@@ -90,7 +90,7 @@ cd D:\projects\PaddleX\deploy\openvino\out\build\x64-Release
 Test image `/path/to/test_img.jpeg`
 ```shell
-./classifier.exe --model_dir=/path/to/openvino_model --image=/path/to/test_img.jpeg --cfg_dir=/path/to/PadlleX_model.yml
+./classifier.exe --model_dir=/path/to/openvino_model --image=/path/to/test_img.jpeg --cfg_file=/path/to/PadlleX_model.yml
 ```
 `Example 2`:
@@ -104,7 +104,7 @@ cd D:\projects\PaddleX\deploy\openvino\out\build\x64-Release
 ```
 ```shell
-./detector.exe --model_dir=/path/to/models/openvino_model --image_list=/root/projects/images_list.txt --cfg_dir=/path/to/PadlleX_model.yml --save_dir ./output
+./detector.exe --model_dir=/path/to/models/openvino_model --image_list=/root/projects/images_list.txt --cfg_file=/path/to/PadlleX_model.yml --save_dir ./output
 ```
 `Example 3`:
@@ -112,5 +112,5 @@ cd D:\projects\PaddleX\deploy\openvino\out\build\x64-Release
 Test image `/path/to/test_img.jpeg`
 ```shell
-.classifier.exe --model_dir=/path/to/openvino_model --image=/path/to/test_img.jpeg --cfg_dir=/path/to/PadlleX_model.yml --device=MYRIAD
-```
-\ No newline at end of file
+.classifier.exe --model_dir=/path/to/openvino_model --image=/path/to/test_img.jpeg --cfg_file=/path/to/PadlleX_model.yml --device=MYRIAD
+```
````
docs/deploy/raspberry/Raspberry.md

````diff
@@ -4,7 +4,7 @@
 ## Hardware environment setup
 For a Raspberry Pi without an OS installed, first install the system and configure the environment; you will need:
 - Hardware: micro SD card, monitor, keyboard, mouse
 - Software: Raspbian OS
@@ -23,7 +23,7 @@ sudo apt-get upgrade
 ```
 ## Paddle-Lite deployment
 Paddle-Lite deployment currently supports PaddleX classification, segmentation and detection models (for detection, only YOLOv3).
 The workflow is: convert the PaddleX model, then deploy the converted model.
 **Note**: for PaddleX installation see [PaddleX](https://paddlex.readthedocs.io/zh_CN/develop/install.html); for Paddle-Lite details see [Paddle-Lite](https://paddle-lite.readthedocs.io/zh/latest/index.html)
@@ -62,11 +62,11 @@ sudo ./lite/tools/build.sh --arm_os=armlinux --arm_abi=armv7hf --arm_lang=gcc
 - glog: see the [build docs](https://github.com/google/glog)
 - opencv: see the [build docs](https://docs.opencv.org/master/d7/d9f/tutorial_linux_install.html)

 ### Step4: Build
 The `cmake` build command lives in `scripts/build.sh`; set LITE_DIR to the Paddle-Lite inference library directory, and if you built the third-party dependencies yourself adjust the main parameters per Step1:
 ```
 # path of the prebuilt Paddle-Lite library
 LITE_DIR=/path/to/Paddle-Lite/inference/lib
 # path of the prebuilt gflags library
@@ -91,7 +91,7 @@ OPENCV_DIR=$(pwd)/deps/opencv/
 | --image | path of the image to predict |
 | --image_list | .txt file listing image paths line by line |
 | --thread_num | number of inference threads, default 1 |
-| --cfg_dir | the PaddleX model .yml config file |
+| --cfg_file | the PaddleX model .yml config file |
 | --save_dir | where to save visualized results (detection and segmentation only); default " " saves nothing |
@@ -100,8 +100,8 @@ OPENCV_DIR=$(pwd)/deps/opencv/
 Test image `/path/to/test_img.jpeg`
 ```shell
-./build/classifier --model_dir=/path/to/nb_model --image=/path/to/test_img.jpeg --cfg_dir=/path/to/PadlleX_model.yml --thread_num=4
+./build/classifier --model_dir=/path/to/nb_model --image=/path/to/test_img.jpeg --cfg_file=/path/to/PadlleX_model.yml --thread_num=4
 ```
@@ -116,7 +116,7 @@ OPENCV_DIR=$(pwd)/deps/opencv/
 ```
 ```shell
-./build/segmenter --model_dir=/path/to/models/nb_model --image_list=/root/projects/images_list.txt --cfg_dir=/path/to/PadlleX_model.yml --save_dir ./output --thread_num=4
+./build/segmenter --model_dir=/path/to/models/nb_model --image_list=/root/projects/images_list.txt --cfg_file=/path/to/PadlleX_model.yml --save_dir ./output --thread_num=4
 ```
@@ -153,4 +153,4 @@ OPENCV_DIR=$(pwd)/deps/opencv/
 ## NCS2 deployment
 The Raspberry Pi can run PaddleX model predictions on an NCS2 through OpenVINO (currently classification networks only); this takes two steps: convert the Paddle model to an OpenVINO IR, then deploy the IR on the NCS2.
 - For model conversion see [PaddleX model to OpenVINO IR]('./openvino/export_openvino_model.md'); OpenVINO on Raspbian OS does not support model conversion, so convert to an FP16 IR on the host first.
-- For deployment see the VPU-on-Raspbian part of [OpenVINO deployment](./openvino/linux.md)
-\ No newline at end of file
+- For deployment see the VPU-on-Raspbian part of [OpenVINO deployment](./openvino/linux.md)
````
docs/deploy/raspberry/python.md

````diff
@@ -22,10 +22,10 @@ python -m pip install paddlelite
 | --model_dir | path to the .xml file produced by model conversion; keep all three generated files in the same directory |
 | --img | path of the image to predict |
 | --image_list | .txt file listing image paths line by line |
-| --cfg_dir | the PaddleX model .yml config file |
+| --cfg_file | the PaddleX model .yml config file |
 | --thread_num | number of inference threads, default 1 |
 | --input_shape | the model's image input size [N,C,H.W] |

 ### Examples
 `Example 1`: test image `/path/to/test_img.jpeg`
@@ -33,7 +33,7 @@ python -m pip install paddlelite
 ```
 cd /root/projects/python
-python demo.py --model_dir /path/to/openvino_model --img /path/to/test_img.jpeg --cfg_dir /path/to/PadlleX_model.yml --thread_num 4 --input_shape [1,3,224,224]
+python demo.py --model_dir /path/to/openvino_model --img /path/to/test_img.jpeg --cfg_file /path/to/PadlleX_model.yml --thread_num 4 --input_shape [1,3,224,224]
 ```
 `Example 2`:
@@ -50,5 +50,5 @@ python demo.py --model_dir /path/to/openvino_model --img /path/to/test_img.jpeg
 ```
 cd /root/projects/python
-python demo.py --model_dir /path/to/models/openvino_model --image_list /root/projects/images_list.txt --cfg_dir=/path/to/PadlleX_model.yml --thread_num 4 --input_shape [1,3,224,224]
+python demo.py --model_dir /path/to/models/openvino_model --image_list /root/projects/images_list.txt --cfg_file=/path/to/PadlleX_model.yml --thread_num 4 --input_shape [1,3,224,224]
 ```
-\ No newline at end of file
````