- [Inference Using a Checkpoint File](#inference-using-a-checkpoint-file-2)
- [Inference Using an ONNX File](#inference-using-an-onnx-file-1)
- [On-Device Inference](#on-device-inference)
<!-- /TOC -->
...
...
As with inference on a GPU, the following steps are required:
2. Perform inference on a CPU by referring to the runtime or SDK document. For details about how to use the ONNX Runtime, see the [ONNX Runtime document](https://github.com/microsoft/onnxruntime).
## On-Device Inference
MindSpore Predict is a lightweight engine for running inference directly on devices. For details, see [On-Device Inference](https://www.mindspore.cn/tutorial/en/r0.6/advanced_use/on_device_inference.html).