diff --git a/tutorials/source_en/advanced_use/on_device_inference.md b/tutorials/source_en/advanced_use/on_device_inference.md
index f8c130951834fec999103a264e067d81bc2c06dd..c69e6e3e7bd8506a6f790b6fbb1612fa81167117 100644
--- a/tutorials/source_en/advanced_use/on_device_inference.md
+++ b/tutorials/source_en/advanced_use/on_device_inference.md
@@ -78,6 +78,8 @@ The compilation procedure is as follows:
 
 ## Use of On-Device Inference
 
+> During optimization and upgrade, temporarily unavailable.
+
 When MindSpore is used to perform model inference in the APK project of an app, preprocessing input is required before model inference. For example, before an image is converted into the tensor format required by MindSpore inference, the image needs to be resized. After MindSpore completes model inference, postprocess the model inference result and sends the processed output to the app.
 
 This section describes how to use MindSpore to perform model inference. The setup of an APK project and pre- and post-processing of model inference are not described here.
diff --git a/tutorials/source_zh_cn/advanced_use/on_device_inference.md b/tutorials/source_zh_cn/advanced_use/on_device_inference.md
index 7de3f8f563cd4c2eee1e6048217590ae866d9de6..02155ee2c799a69c2d2cd01517ff8ad82940b042 100644
--- a/tutorials/source_zh_cn/advanced_use/on_device_inference.md
+++ b/tutorials/source_zh_cn/advanced_use/on_device_inference.md
@@ -77,6 +77,9 @@ MindSpore Predict是一个轻量级的深度神经网络推理引擎，提供了
 
 ## 端侧推理使用
 
+> 优化升级中，暂不可用。
+
+
 在APP的APK工程中使用MindSpore进行模型推理前，需要对输入进行必要的前处理，比如将图片转换成MindSpore推理要求的`tensor`格式、对图片进行`resize`等处理。在MindSpore完成模型推理后，对模型推理的结果进行后处理，并将处理的输出发送给APP应用。
 
 本章主要描述用户如何使用MindSpore进行模型推理，APK工程的搭建和模型推理的前后处理，不在此列举。
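
The tutorial text quoted in the hunks above says that, before on-device inference, an input image must be resized and converted into the tensor format the model expects, and that the inference output must be post-processed for the app. As a rough, self-contained sketch of that kind of preprocessing only: the helper names, nearest-neighbour resize, NCHW layout, and [0, 1] normalization below are assumptions for illustration and are not part of the MindSpore API.

```cpp
// Hypothetical preprocessing sketch: turn an 8-bit interleaved RGB image
// into a resized, normalized NCHW float buffer of the kind an on-device
// inference engine typically consumes. Not MindSpore API code.
#include <cstdint>
#include <vector>

// Nearest-neighbour resize of an interleaved RGB (HWC, 3-channel) image.
std::vector<uint8_t> ResizeRGB(const std::vector<uint8_t>& src,
                               int srcW, int srcH, int dstW, int dstH) {
  std::vector<uint8_t> dst(static_cast<size_t>(dstW) * dstH * 3);
  for (int y = 0; y < dstH; ++y) {
    int sy = y * srcH / dstH;
    for (int x = 0; x < dstW; ++x) {
      int sx = x * srcW / dstW;
      for (int c = 0; c < 3; ++c) {
        dst[(static_cast<size_t>(y) * dstW + x) * 3 + c] =
            src[(static_cast<size_t>(sy) * srcW + sx) * 3 + c];
      }
    }
  }
  return dst;
}

// Convert the resized HWC image to a planar NCHW float tensor in [0, 1].
std::vector<float> ToTensorNCHW(const std::vector<uint8_t>& rgb,
                                int width, int height) {
  std::vector<float> tensor(static_cast<size_t>(width) * height * 3);
  for (int c = 0; c < 3; ++c) {
    for (int y = 0; y < height; ++y) {
      for (int x = 0; x < width; ++x) {
        tensor[(static_cast<size_t>(c) * height + y) * width + x] =
            rgb[(static_cast<size_t>(y) * width + x) * 3 + c] / 255.0f;
      }
    }
  }
  return tensor;
}
```

In practice, the model's actual input shape and normalization constants determine the resize target and scaling, the resulting float buffer would be copied into the engine's input tensor, and a corresponding post-processing step maps the output tensor back into a result the app can use.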