Commit cd8932d8 authored by mindspore-ci-bot, committed by Gitee

!345 update export lite model for predict

Merge pull request !345 from yangjie159/r0.5
@@ -78,6 +78,8 @@ The compilation procedure is as follows:
## Use of On-Device Inference
> This feature is being optimized and upgraded, and is temporarily unavailable.
When MindSpore is used to perform model inference in an app's APK project, the input must be preprocessed before inference. For example, an image needs to be resized before it is converted into the tensor format required by MindSpore inference. After MindSpore completes model inference, post-process the inference result and send the processed output to the app.
This section describes how to use MindSpore to perform model inference. The setup of an APK project and the pre- and post-processing for model inference are not described here.
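The preprocessing described above (resizing an image and converting it to the tensor layout an inference engine expects) can be sketched as follows. This is a minimal illustration using plain NumPy, not MindSpore's actual API; the 224x224 input size, the [0, 1] scaling, and the NCHW layout are assumptions chosen for the example.

```python
import numpy as np

def preprocess(image: np.ndarray, size=(224, 224)) -> np.ndarray:
    """Convert an HxWx3 uint8 image into a 1x3xHxW float32 tensor.

    Illustrative only: the target size and normalization are example
    choices, not values mandated by MindSpore.
    """
    h, w = image.shape[:2]
    # Nearest-neighbor resize by sampling row/column indices.
    rows = np.arange(size[0]) * h // size[0]
    cols = np.arange(size[1]) * w // size[1]
    resized = image[rows[:, None], cols[None, :]]
    # Scale pixel values to [0, 1] and reorder HWC -> NCHW.
    tensor = resized.astype(np.float32) / 255.0
    return tensor.transpose(2, 0, 1)[None, ...]

# Example: a dummy 480x640 RGB image.
img = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
out = preprocess(img)
print(out.shape, out.dtype)  # (1, 3, 224, 224) float32
```

In a real APK project this step would typically run on-device (e.g. in C++ or Java) before handing the buffer to the inference engine; the same resize-then-layout-conversion logic applies.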
@@ -77,6 +77,9 @@ MindSpore Predict is a lightweight deep neural network inference engine that provides
## Use of On-Device Inference
> This feature is being optimized and upgraded, and is temporarily unavailable.
Before using MindSpore for model inference in an app's APK project, the input must be preprocessed, for example, by converting an image into the `tensor` format required by MindSpore inference and applying `resize` to the image. After MindSpore completes model inference, post-process the inference result and send the processed output to the app.
This section describes how to use MindSpore to perform model inference. The setup of an APK project and the pre- and post-processing for model inference are not described here.