diff --git a/lite/tutorials/source_en/quick_start/quick_start.md b/lite/tutorials/source_en/quick_start/quick_start.md
index 4eb93b75eddbf82cbb65cf9b280f840b0453c51a..1e1c470ffdb9453ddf717c81b0a0ce957f5b8caf 100644
--- a/lite/tutorials/source_en/quick_start/quick_start.md
+++ b/lite/tutorials/source_en/quick_start/quick_start.md
@@ -39,7 +39,7 @@ In addition, you can use the preset model to perform migration learning to imple
 
 ## Converting a Model
 
-After you retrain a model provided by MindSpore, export the model in the [.mindir format](https://www.mindspore.cn/tutorial/en/r0.7/use/saving_and_loading_model_parameters.html#mindir). Use the MindSpore Lite [model conversion tool](https://www.mindspore.cn/lite/tutorial/en/r0.7/use/converter_tool.html) to convert the .mindir model to a .ms model.
+After you retrain a model provided by MindSpore, export the model in the [.mindir format](https://www.mindspore.cn/tutorial/en/r0.7/use/saving_and_loading_model_parameters.html#export-mindir-model). Use the MindSpore Lite [model conversion tool](https://www.mindspore.cn/lite/tutorial/en/r0.7/use/converter_tool.html) to convert the .mindir model to a .ms model.
 
 Take the mobilenetv2 model as an example. Execute the following script to convert a model into a MindSpore Lite model for on-device inference.
 ```bash