diff --git a/tutorials/source_en/use/multi_platform_inference.md b/tutorials/source_en/use/multi_platform_inference.md
index edc6b07defbee85c638e5e6c189ff75ab1cb0f6b..16b0e99c4fcf9d7ff8c5d7671b16a84761e2acf8 100644
--- a/tutorials/source_en/use/multi_platform_inference.md
+++ b/tutorials/source_en/use/multi_platform_inference.md
@@ -100,6 +100,12 @@ MindSpore supports the following inference scenarios based on the hardware platf
 
    In the preceding information: `hub.load_weights` is an API for loading model parameters. Please check the details in .
+
+   To use `hub`, you first need to install the required `bs4` package. Install it as follows:
+
+   ```shell
+   pip install bs4
+   ```
 
 2. Use the `model.predict` API to perform inference.
 
    ```python
diff --git a/tutorials/source_zh_cn/use/multi_platform_inference.md b/tutorials/source_zh_cn/use/multi_platform_inference.md
index 0645e3517e7f0174062ab06e830bc1ef518c69fb..6c54bcb0c9450a194b0365424fe50a7ab29ef383 100644
--- a/tutorials/source_zh_cn/use/multi_platform_inference.md
+++ b/tutorials/source_zh_cn/use/multi_platform_inference.md
@@ -98,6 +98,10 @@ CPU | ONNX format | A runtime/SDK that supports ONNX inference, such as TensorRT.
    ```
 
    In the preceding information, `hub.load_weights` is the API for loading model parameters; see the corresponding API description: .
+   Before using the `hub` APIs, you need to install the `bs4` dependency package. Install it as follows:
+   ```shell
+   pip install bs4
+   ```
 
 2. Use the `model.predict` API to perform inference.
 
    ```python