PaddlePaddle/PaddleClas
Commit 97f4f557 (unverified)
Authored on Apr 09, 2021 by Tingquan Gao; committed via GitHub on Apr 09, 2021

Fix hyperlink and support arm7 (#670)

* Update hyperlink, test=document_fix
* Fix to support arm7

Parent: a6e2114e
Showing 3 changed files with 37 additions and 35 deletions (+37, -35):

- deploy/lite/Makefile (+21, -14)
- deploy/lite/readme.md (+8, -11)
- deploy/lite/readme_en.md (+8, -10)
deploy/lite/Makefile

```diff
@@ -9,20 +9,27 @@ THIRD_PARTY_DIR=${LITE_ROOT}/third_party
 OPENCV_VERSION = opencv4.1.0
 
-OPENCV_LIBS = ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/libs/libopencv_imgcodecs.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/libs/libopencv_imgproc.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/libs/libopencv_core.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/3rdparty/libs/libtegra_hal.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/3rdparty/libs/liblibjpeg-turbo.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/3rdparty/libs/liblibwebp.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/3rdparty/libs/liblibpng.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/3rdparty/libs/liblibjasper.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/3rdparty/libs/liblibtiff.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/3rdparty/libs/libIlmImf.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/3rdparty/libs/libtbb.a \
-              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/arm64-v8a/3rdparty/libs/libcpufeatures.a
-OPENCV_INCLUDE = -I../../../third_party/${OPENCV_VERSION}/arm64-v8a/include
+ifeq (${ARM_ABI}, arm8)
+    ARM_PATH=arm64-v8a
+endif
+ifeq (${ARM_ABI}, arm7)
+    ARM_PATH=armeabi-v7a
+endif
+OPENCV_LIBS = ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/libs/libopencv_imgcodecs.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/libs/libopencv_imgproc.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/libs/libopencv_core.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/3rdparty/libs/libtegra_hal.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/3rdparty/libs/liblibjpeg-turbo.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/3rdparty/libs/liblibwebp.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/3rdparty/libs/liblibpng.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/3rdparty/libs/liblibjasper.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/3rdparty/libs/liblibtiff.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/3rdparty/libs/libIlmImf.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/3rdparty/libs/libtbb.a \
+              ${THIRD_PARTY_DIR}/${OPENCV_VERSION}/${ARM_PATH}/3rdparty/libs/libcpufeatures.a
+OPENCV_INCLUDE = -I../../../third_party/${OPENCV_VERSION}/${ARM_PATH}/include
 
 CXX_INCLUDES = $(INCLUDES) ${OPENCV_INCLUDE} -I$(LITE_ROOT)/cxx/include
```
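The new `ifeq` blocks select the OpenCV library directory from the `ARM_ABI` variable at `make` time, which is what enables arm7 builds. A minimal sketch of the same selection logic in a standalone throwaway Makefile (the `/tmp/abi_check.mk` path and the `all` target are illustrative, not part of the commit):

```shell
# Reproduce the commit's ABI-selection logic in a standalone Makefile
# and print the resulting ARM_PATH for each supported ARM_ABI value.
cat > /tmp/abi_check.mk <<'EOF'
ifeq (${ARM_ABI}, arm8)
    ARM_PATH=arm64-v8a
endif
ifeq (${ARM_ABI}, arm7)
    ARM_PATH=armeabi-v7a
endif
$(info ${ARM_PATH})
all: ;
EOF
make -s -f /tmp/abi_check.mk ARM_ABI=arm7   # prints armeabi-v7a
make -s -f /tmp/abi_check.mk ARM_ABI=arm8   # prints arm64-v8a
```

Passing any other `ARM_ABI` value leaves `ARM_PATH` empty, so an unsupported ABI fails at link time rather than silently picking a default.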
deploy/lite/readme.md

```diff
@@ -25,8 +25,8 @@ Paddle Lite is the lightweight inference engine of PaddlePaddle, providing efficient inference for mobile and IoT devices
 1. [Recommended] Download directly; the inference library download links are as follows:
     |Platform|Inference Library Download Link|
     |-|-|
-    |Android|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/Android/inference_lite_lib.android.armv7.gcc.c++_static.with_extra.CV_ON.tar.gz)/[arm8](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/Android/inference_lite_lib.android.armv8.gcc.c++_static.with_extra.CV_ON.tar.gz)|
-    |iOS|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/iOS/inference_lite_lib.ios.armv7.with_extra.CV_ON.tar.gz)/[arm8](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/iOS/inference_lite_lib.ios64.armv8.with_extra.CV_ON.tar.gz)|
+    |Android|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.8-rc/Android/gcc/inference_lite_lib.android.armv7.gcc.c++_static.with_extra.with_cv.tar.gz)/[arm8](https://paddlelite-data.bj.bcebos.com/Release/2.8-rc/Android/gcc/inference_lite_lib.android.armv8.gcc.c++_static.with_extra.with_cv.tar.gz)|
+    |iOS|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.8-rc/iOS/inference_lite_lib.ios.armv7.with_cv.with_extra.tiny_publish.tar.gz)/[arm8](https://paddlelite-data.bj.bcebos.com/Release/2.8-rc/iOS/inference_lite_lib.ios.armv8.with_cv.with_extra.tiny_publish.tar.gz)|
 
 **Note**:
 1. If the inference library is downloaded from the Paddle-Lite [official documentation](https://paddle-lite.readthedocs.io/zh/latest/quick_start/release_lib.html#android-toolchain-gcc),
```
````diff
@@ -83,9 +83,10 @@ Paddle-Lite provides multiple strategies to automatically optimize the original model, including
 #### 2.1.1 [Recommended] Use pip to install paddlelite and convert the model
 
 Install `paddlelite` with pip; currently `Python3.7` is the highest supported version.
+**Note**: the version of the `paddlelite` whl package must match the version of the inference library.
 ```shell
-pip install paddlelite
+pip install paddlelite==2.8
 ```
 After that, the `paddle_lite_opt` tool can be used to convert the inference model. Some parameters of `paddle_lite_opt` are as follows
````
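The added note, that the `paddlelite` wheel must match the inference library release, can be enforced mechanically in a deployment script. A hedged sketch (the `check_lite_version` helper is hypothetical, not part of PaddleClas or Paddle-Lite):

```shell
# Hypothetical guard: succeed only when the installed wheel version ($1)
# starts with the required inference-library release ($2).
check_lite_version() {
    case "$1" in
        "$2"*) return 0 ;;   # versions line up
        *)     return 1 ;;   # mismatch: reinstall, e.g. pip install paddlelite==2.8
    esac
}

# Typical use: compare the installed wheel against the downloaded lib release.
# installed=$(pip show paddlelite | awk '/^Version:/ {print $2}')
# check_lite_version "$installed" 2.8 || echo "wheel/lib version mismatch" >&2
```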
````diff
@@ -132,17 +133,13 @@ cd build.opt/lite/api/
 ```shell
 # enter the PaddleClas root directory
 cd PaddleClas_root_path
 export PYTHONPATH=$PWD
 
-# download and uncompress the pretrained model
-wget https://paddle-imagenet-models-name.bj.bcebos.com/MobileNetV3_large_x1_0_pretrained.tar
-tar -xf MobileNetV3_large_x1_0_pretrained.tar
-# export the pretrained model as an inference model
-python tools/export_model.py -m MobileNetV3_large_x1_0 -p ./MobileNetV3_large_x1_0_pretrained/ -o ./MobileNetV3_large_x1_0_inference/
+# download and uncompress the inference model
+wget https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/MobileNetV3_large_x1_0_infer.tar
+tar -xf MobileNetV3_large_x1_0_infer.tar
 # convert the inference model into a Paddle-Lite optimized model
-paddle_lite_opt --model_file=./MobileNetV3_large_x1_0_inference/model --param_file=./MobileNetV3_large_x1_0_inference/params --optimize_out=./MobileNetV3_large_x1_0
+paddle_lite_opt --model_file=./MobileNetV3_large_x1_0_infer/inference.pdmodel --param_file=./MobileNetV3_large_x1_0_infer/inference.pdiparams --optimize_out=./MobileNetV3_large_x1_0
 ```
 Finally, a file `MobileNetV3_large_x1_0.nb` is generated in the current directory.
````
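The corrected `paddle_lite_opt` invocation assumes the dygraph inference-model layout, where the model topology and weights live as `inference.pdmodel` and `inference.pdiparams` inside the downloaded `*_infer` directory. A minimal sketch of that layout (the files are created empty here purely to illustrate the paths; in practice they come from unpacking the tar above):

```shell
# Mock the directory layout the corrected command expects.
mkdir -p ./MobileNetV3_large_x1_0_infer
touch ./MobileNetV3_large_x1_0_infer/inference.pdmodel \
      ./MobileNetV3_large_x1_0_infer/inference.pdiparams

# paddle_lite_opt then consumes exactly these two paths:
# paddle_lite_opt --model_file=./MobileNetV3_large_x1_0_infer/inference.pdmodel \
#                 --param_file=./MobileNetV3_large_x1_0_infer/inference.pdiparams \
#                 --optimize_out=./MobileNetV3_large_x1_0
```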
deploy/lite/readme_en.md

```diff
@@ -26,8 +26,8 @@ For the detailed compilation directions of different development environments, p
 |Platform|Inference Library Download Link|
 |-|-|
-|Android|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/Android/inference_lite_lib.android.armv7.gcc.c++_static.with_extra.CV_ON.tar.gz)/[arm8](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/Android/inference_lite_lib.android.armv8.gcc.c++_static.with_extra.CV_ON.tar.gz)|
-|iOS|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/iOS/inference_lite_lib.ios.armv7.with_extra.CV_ON.tar.gz)/[arm8](https://paddlelite-data.bj.bcebos.com/Release/2.6.1/iOS/inference_lite_lib.ios64.armv8.with_extra.CV_ON.tar.gz)|
+|Android|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.8-rc/Android/gcc/inference_lite_lib.android.armv7.gcc.c++_static.with_extra.with_cv.tar.gz)/[arm8](https://paddlelite-data.bj.bcebos.com/Release/2.8-rc/Android/gcc/inference_lite_lib.android.armv8.gcc.c++_static.with_extra.with_cv.tar.gz)|
+|iOS|[arm7](https://paddlelite-data.bj.bcebos.com/Release/2.8-rc/iOS/inference_lite_lib.ios.armv7.with_cv.with_extra.tiny_publish.tar.gz)/[arm8](https://paddlelite-data.bj.bcebos.com/Release/2.8-rc/iOS/inference_lite_lib.ios.armv8.with_cv.with_extra.tiny_publish.tar.gz)|
 
 **NOTE**:
```
````diff
@@ -78,8 +78,9 @@ Paddle-Lite provides a variety of strategies to automatically optimize the origi
 * Use pip to install Paddle-Lite. The following command uses `pip3.7`.
 
 ```shell
-pip install paddlelite
+pip install paddlelite==2.8
 ```
+**Note**: The version of `paddlelite`'s wheel must match that of the inference lib.
 * Use `paddle_lite_opt` to optimize the inference model; the parameters of `paddle_lite_opt` are as follows:
````
````diff
@@ -121,17 +122,14 @@ Taking the `MobileNetV3_large_x1_0` model of PaddleClas as an example, we will i
 ```shell
 # enter PaddleClas root directory
 cd PaddleClas_root_path
 export PYTHONPATH=$PWD
 
-# download and uncompress the pre-trained model
-wget https://paddle-imagenet-models-name.bj.bcebos.com/MobileNetV3_large_x1_0_pretrained.tar
-tar -xf MobileNetV3_large_x1_0_pretrained.tar
+# download and uncompress the inference model
+wget https://paddle-imagenet-models-name.bj.bcebos.com/dygraph/inference/MobileNetV3_large_x1_0_infer.tar
+tar -xf MobileNetV3_large_x1_0_infer.tar
-# export the pre-trained model as an inference model
-python tools/export_model.py -m MobileNetV3_large_x1_0 -p ./MobileNetV3_large_x1_0_pretrained/ -o ./MobileNetV3_large_x1_0_inference/
 # convert the inference model to a Paddle-Lite optimized model
-paddle_lite_opt --model_file=./MobileNetV3_large_x1_0_inference/model --param_file=./MobileNetV3_large_x1_0_inference/params --optimize_out=./MobileNetV3_large_x1_0
+paddle_lite_opt --model_file=./MobileNetV3_large_x1_0_infer/inference.pdmodel --param_file=./MobileNetV3_large_x1_0_infer/inference.pdiparams --optimize_out=./MobileNetV3_large_x1_0
 ```
 When the above command completes, `MobileNetV3_large_x1_0.nb` will be present in the current directory; this is the converted model file.
````