Commit b1e4b4f1 authored by gaotingquan, committed by Tingquan Gao

docs: update demo result

Parent ac986474
@@ -148,11 +148,18 @@ Predict complete!
```python
from paddleclas import PaddleClas
clas = PaddleClas(model_name='PPHGNet_small')
-infer_imgs = 'docs/images/deployment/whl_demo.jpg'
+infer_imgs = 'docs/images/inference_deployment/whl_demo.jpg'
result = clas.predict(infer_imgs)
print(next(result))
```
The result of the demo above:
```
>>> result
[{'class_ids': [8, 7, 86, 82, 81], 'scores': [0.77132, 0.05122, 0.00755, 0.00199, 0.00115], 'label_names': ['hen', 'cock', 'partridge', 'ruffed grouse, partridge, Bonasa umbellus', 'ptarmigan'], 'filename': 'docs/images/inference_deployment/whl_demo.jpg'}]
```
**Note**: The result returned by `model.predict()` is a `generator`, so it has to be consumed with the `next()` function or iterated with a `for` loop. Each call performs prediction on a batch of `batch_size` images and returns the results. The default `batch_size` is 1, and you can also specify the `batch_size` when instantiating, such as `model = paddleclas.PaddleClas(model_name="PPHGNet_small", batch_size=2)`.
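For illustration only (not part of this commit), a minimal sketch of consuming the generator with a `for` loop and a larger `batch_size`; the per-batch return structure is assumed to match the demo output shown above:
```python
from paddleclas import PaddleClas

# Minimal sketch: instantiate with batch_size=2 and consume every batch
# yielded by the generator with a for loop instead of a single next() call.
clas = PaddleClas(model_name='PPHGNet_small', batch_size=2)
result = clas.predict('docs/images/inference_deployment/whl_demo.jpg')
for batch in result:
    # Each yielded item is assumed to be a list of per-image dicts,
    # matching the demo output above.
    for pred in batch:
        print(pred['filename'], pred['label_names'][0], pred['scores'][0])
```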
<a name="3"></a>
@@ -198,11 +198,18 @@ Predict complete!
```python
from paddleclas import PaddleClas
clas = PaddleClas(model_name='PPLCNet_x1_0')
-infer_imgs = 'docs/images/deployment/whl_demo.jpg'
+infer_imgs = 'docs/images/inference_deployment/whl_demo.jpg'
result = clas.predict(infer_imgs)
print(next(result))
```
The result of the demo above:
```
>>> result
[{'class_ids': [8, 7, 86, 81, 85], 'scores': [0.91347, 0.03779, 0.0036, 0.00117, 0.00112], 'label_names': ['hen', 'cock', 'partridge', 'ptarmigan', 'quail'], 'filename': 'docs/images/inference_deployment/whl_demo.jpg'}]
```
**Note**: The result returned by `model.predict()` is a `generator`, so it has to be consumed with the `next()` function or iterated with a `for` loop. Each call performs prediction on a batch of `batch_size` images and returns the results. The default `batch_size` is 1, and you can also specify the `batch_size` when instantiating, such as `model = paddleclas.PaddleClas(model_name="PPLCNet_x1_0", batch_size=2)`.
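As another hedged sketch (not part of this commit), the generator can also be drained explicitly with `next()` until it raises `StopIteration`:
```python
from paddleclas import PaddleClas

# Sketch: drain the generator explicitly with next(); StopIteration marks
# the point at which all batches have been returned.
clas = PaddleClas(model_name='PPLCNet_x1_0', batch_size=2)
result = clas.predict('docs/images/inference_deployment/whl_demo.jpg')
while True:
    try:
        batch = next(result)
    except StopIteration:
        break
    for pred in batch:
        print(pred['label_names'][0], pred['scores'][0])
```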
<a name="3"></a>