Commit 518e11f9 authored by mindspore-ci-bot, committed by Gitee

!566 Update images

Merge pull request !566 from JunYuLiu/r0.6
@@ -17,7 +17,7 @@
MindSpore Lite is a lightweight deep neural network inference engine that provides the inference function for models trained by MindSpore on the device side. This tutorial describes how to use and compile MindSpore Lite.
-![](./images/on_device_inference_frame.png)
+![](./images/on_device_inference_frame.jpg)
Figure 1 On-device inference frame diagram
@@ -173,7 +173,7 @@ To perform on-device model inference using MindSpore, perform the following step
Use the `.ms` model file and image data as input to create a session and implement inference on the device.
-![](./images/side_infer_process.png)
+![](./images/side_infer_process.jpg)
1. Load the `.ms` model file to the memory buffer. The ReadFile function needs to be implemented by users, according to the [C++ tutorial](http://www.cplusplus.com/doc/tutorial/files/).
```cpp
......
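The tutorial leaves the `ReadFile` function for the user to implement. A minimal sketch follows; the function name comes from the tutorial text, but the exact `char *ReadFile(const char *file, size_t *size)` signature and the convention that the caller frees the returned buffer with `free` are assumptions, not the library's definitive API.

```cpp
// Hedged sketch of a user-provided ReadFile: loads a model file into a
// heap buffer and reports its size. Signature and ownership convention
// are assumptions; error handling is deliberately minimal.
#include <cstddef>
#include <cstdio>
#include <cstdlib>

char *ReadFile(const char *file, size_t *size) {
  FILE *fp = std::fopen(file, "rb");
  if (fp == nullptr) {
    return nullptr;
  }
  // Determine the file length by seeking to the end.
  std::fseek(fp, 0, SEEK_END);
  long len = std::ftell(fp);
  if (len < 0) {
    std::fclose(fp);
    return nullptr;
  }
  *size = static_cast<size_t>(len);
  std::rewind(fp);
  // Read the whole file into a malloc'd buffer; caller frees it.
  char *buf = static_cast<char *>(std::malloc(*size));
  if (buf != nullptr && std::fread(buf, 1, *size, fp) != *size) {
    std::free(buf);
    buf = nullptr;
  }
  std::fclose(fp);
  return buf;
}
```

The buffer returned here would then be passed to the session-creation step described above.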
@@ -17,7 +17,7 @@
MindSpore Lite is a lightweight deep neural network inference engine that provides the inference function for models trained by MindSpore on the device side. This tutorial describes how to compile and use MindSpore Lite.
-![](./images/on_device_inference_frame.png)
+![](./images/on_device_inference_frame.jpg)
Figure 1: On-device inference architecture diagram
@@ -173,7 +173,7 @@ The steps for on-device model inference with MindSpore are as follows.
Use the `.ms` model file and image data as input to create a `session` and implement inference on the device.
-![](./images/side_infer_process.png)
+![](./images/side_infer_process.jpg)
Figure 2: On-device inference sequence diagram
......