MindSpore / docs · Commit 30db5505

Authored on Jul 30, 2020 by mindspore-ci-bot; committed via Gitee on Jul 30, 2020.

!526 fix on_device inference md

Merge pull request !526 from yankai10/merge

Parents: bd753463, 7caf3f53
2 changed files with 137 additions and 120 deletions (+137, -120):

- tutorials/source_en/advanced_use/on_device_inference.md (+68, -60)
- tutorials/source_zh_cn/advanced_use/on_device_inference.md (+69, -60)
tutorials/source_en/advanced_use/on_device_inference.md

````diff
@@ -98,22 +98,21 @@ To perform on-device model inference using MindSpore, perform the following step
         param_dict = load_checkpoint(ckpt_file_name=ckpt_file_path)
         load_param_into_net(net, param_dict)
     ```
-2. Call the `export` API to export the `.ms` model file on the device.
+2. Call the `export` API to export the `.pb` model file on the device.
 
     ```python
-    export(net, input_data, file_name="./lenet.ms", file_format='BINARY')
+    export(net, input_data, file_name="./lenet.pb", file_format='BINARY')
     ```
 
-    Take the LeNet network as an example. The generated on-device model file is `lenet.ms`. The complete sample code `lenet.py` is as follows:
+    Take the LeNet network as an example. The generated on-device model file is `lenet.pb`. The complete sample code `lenet.py` is as follows:
 
     ```python
     import os
     import numpy as np
     import mindspore.nn as nn
     import mindspore.ops.operations as P
     import mindspore.context as context
     from mindspore.common.tensor import Tensor
     from mindspore.train.serialization import export, load_checkpoint, load_param_into_net
 
     class LeNet(nn.Cell):
         def __init__(self):
             super(LeNet, self).__init__()
             self.relu = P.ReLU()
@@ -141,7 +140,7 @@ class LeNet(nn.Cell):
             output = self.fc3(output)
             return output
 
-    if __name__ == '__main__':
+    if __name__ == '__main__':
         context.set_context(mode=context.GRAPH_MODE, device_target="Ascend")
         seed = 0
         np.random.seed(seed)
@@ -155,11 +154,20 @@ if __name__ == '__main__':
         if is_ckpt_exist:
             param_dict = load_checkpoint(ckpt_file_name=ckpt_file_path)
             load_param_into_net(net, param_dict)
-            export(net, input_data, file_name="./lenet.ms", file_format='BINARY')
+            export(net, input_data, file_name="./lenet.pb", file_format='BINARY')
             print("export model success.")
         else:
             print("checkpoint file does not exist.")
     ```
+3. Calling MindSpore convert tool named `converter_lite`, convert model file (`.pb`) to on_device inference model file (`.ms`).
+    ```
+    ./converter_lite --fmk=MS --modelFile=./lenet.pb --outputFile=lenet
+    ```
+    Result:
+    ```
+    INFO [converter/converter.cc:146] Runconverter] CONVERTER RESULT: SUCCESS!
+    ```
+    This means that the model has been successfully converted to the mindspore on_device inference model.
 
 ### Implementing On-Device Inference
````
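The new step 3 pairs the Python `export` API, which writes `lenet.pb`, with the standalone `converter_lite` binary, which produces the on-device `lenet.ms` file. Below is a minimal sketch of driving both steps from a single script. It is not part of the commit: it assumes `converter_lite` sits at `./converter_lite`, that a 1x1x32x32 input matches the (elided) LeNet definition, and that the converter appends the `.ms` suffix to the `--outputFile` prefix; none of these details appear on this page.

```python
# Hypothetical helper mirroring steps 2 and 3 of the tutorial above:
# export a trained network to .pb, then hand it to converter_lite.
import os
import subprocess

import numpy as np
from mindspore.common.tensor import Tensor
from mindspore.train.serialization import export, load_checkpoint, load_param_into_net


def export_and_convert(net, ckpt_file_path, converter_lite="./converter_lite"):
    # Load the trained weights into the network (same calls as in the diff).
    param_dict = load_checkpoint(ckpt_file_name=ckpt_file_path)
    load_param_into_net(net, param_dict)

    # Export the .pb model file; the 1x1x32x32 input shape is an assumption
    # about the LeNet definition used in the elided sample code.
    input_data = Tensor(np.random.randn(1, 1, 32, 32).astype(np.float32))
    export(net, input_data, file_name="./lenet.pb", file_format='BINARY')

    # Run the converter with the flags shown in the tutorial's command line.
    # Assumption: it writes lenet.ms next to the --outputFile prefix.
    result = subprocess.run(
        [converter_lite, "--fmk=MS", "--modelFile=./lenet.pb", "--outputFile=lenet"],
        capture_output=True, text=True)
    if result.returncode != 0 or not os.path.exists("lenet.ms"):
        raise RuntimeError("converter_lite failed:\n" + result.stdout + result.stderr)
    return "lenet.ms"
```

Built with the LeNet cell and checkpoint path from `lenet.py`, one would call `export_and_convert(LeNet(), ckpt_file_path)` and then ship the returned `lenet.ms` to the device.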
tutorials/source_zh_cn/advanced_use/on_device_inference.md

````diff
@@ -97,22 +97,22 @@ The steps for on-device model inference with MindSpore are as follows.
         param_dict = load_checkpoint(ckpt_file_name=ckpt_file_path)
         load_param_into_net(net, param_dict)
     ```
-2. Call the `export` API to export the on-device model file (`.ms`).
+2. Call the `export` API to export the model file (`.pb`).
 
     ```python
-    export(net, input_data, file_name="./lenet.ms", file_format='BINARY')
+    export(net, input_data, file_name="./lenet.pb", file_format='BINARY')
     ```
 
-    Take the LeNet network as an example: the generated on-device model file is `lenet.ms`, and the complete sample code `lenet.py` is as follows.
+    Take the LeNet network as an example: the generated model file is `lenet.pb`, and the complete sample code `lenet.py` is as follows.
 
     ```python
     import os
     import numpy as np
     import mindspore.nn as nn
     import mindspore.ops.operations as P
     import mindspore.context as context
     from mindspore.common.tensor import Tensor
     from mindspore.train.serialization import export, load_checkpoint, load_param_into_net
 
     class LeNet(nn.Cell):
         def __init__(self):
             super(LeNet, self).__init__()
             self.relu = P.ReLU()
@@ -140,7 +140,7 @@ class LeNet(nn.Cell):
             output = self.fc3(output)
             return output
 
-    if __name__ == '__main__':
+    if __name__ == '__main__':
         context.set_context(mode=context.GRAPH_MODE, device_target="Ascend")
         seed = 0
         np.random.seed(seed)
@@ -154,11 +154,20 @@ if __name__ == '__main__':
         if is_ckpt_exist:
             param_dict = load_checkpoint(ckpt_file_name=ckpt_file_path)
             load_param_into_net(net, param_dict)
-            export(net, input_data, file_name="./lenet.ms", file_format='BINARY')
+            export(net, input_data, file_name="./lenet.pb", file_format='BINARY')
             print("export model success.")
         else:
             print("checkpoint file does not exist.")
     ```
+3. Call the MindSpore on-device conversion tool `converter_lite` to convert the model file (`.pb`) into an on-device model file (`.ms`).
+    ```
+    ./converter_lite --fmk=MS --modelFile=./lenet.pb --outputFile=lenet
+    ```
+    The result is:
+    ```
+    INFO [converter/converter.cc:146] Runconverter] CONVERTER RESULT: SUCCESS!
+    ```
+    This indicates that the model has been successfully converted into a MindSpore on-device model.
 
 ### Implementing On-Device Inference
````