diff --git a/deploy/third_engine/demo_openvino/python/README.md b/deploy/third_engine/demo_openvino/python/README.md
index 74a0f1d3293295b1f201ebcbf9a71a5166622cdc..faefbe0d0a0efe27795b61a8d3183710c069a52a 100644
--- a/deploy/third_engine/demo_openvino/python/README.md
+++ b/deploy/third_engine/demo_openvino/python/README.md
@@ -11,35 +11,30 @@
 pip install openvino==2022.1.0
 ```
 
-For detailed installation steps, see the official site: https://docs.openvinotoolkit.org/latest/get_started_guides.html
+For detailed installation steps, see the [OpenVINO documentation](https://docs.openvinotoolkit.org/latest/get_started_guides.html).
 
 ## Testing
-Prepare the test model: following the model export and conversion steps in [PicoDet](https://github.com/PaddlePaddle/PaddleDetection/tree/release/2.4/configs/picodet), export the model without post-processing (`-o export.benchmark=True`) and generate the simplified ONNX model to be tested (it can also be downloaded directly from the links below).
-Create an ```out_onnxsim``` folder in this directory:
-```shell
-mkdir out_onnxsim
-```
-Put the exported ONNX model in that folder.
+- Prepare the test model: following the "Export and Convert Model" steps in [PicoDet](https://github.com/PaddlePaddle/PaddleDetection/tree/develop/configs/picodet), export the model without post-processing (`-o export.benchmark=True`) and generate the simplified ONNX model to be tested (it can also be downloaded directly from the links below). Then create an ```out_onnxsim``` folder in this directory and put the exported ONNX model in it.
 
-Prepare the test image. By default this demo uses PaddleDetection/demo/[000000570688.jpg](https://github.com/PaddlePaddle/PaddleDetection/blob/release/2.4/demo/000000570688.jpg)
+- Prepare the test image: by default this demo uses PaddleDetection/demo/[000000570688.jpg](https://github.com/PaddlePaddle/PaddleDetection/blob/develop/demo/000000570688.jpg).
 
-Run directly in this directory:
+- Run directly in this directory:
 ```shell
-#Windows
-python '.\openvino_ppdet2 copy.py' --img_path ..\..\..\..\demo\000000570688.jpg --onnx_path out_onnxsim\picodet_xs_320_coco_lcnet.onnx --in_shape 320
-#Linux
-python './openvino_ppdet2 copy.py' --img_path ../../../../demo/000000570688.jpg --onnx_path out_onnxsim/picodet_xs_320_coco_lcnet.onnx --in_shape 320
+# Linux
+python openvino_benchmark.py --img_path ../../../../demo/000000570688.jpg --onnx_path out_onnxsim/picodet_xs_320_coco_lcnet.onnx --in_shape 320
+# Windows
+python openvino_benchmark.py --img_path ..\..\..\..\demo\000000570688.jpg --onnx_path out_onnxsim\picodet_xs_320_coco_lcnet.onnx --in_shape 320
 ```
-Note: ```--in_shape``` is the input size of the corresponding model; the default is 320.
+- Note: ```--in_shape``` is the input size of the corresponding model; the default is 320.
 
 ## Results
-Results measured on an Intel Core i7-10750H CPU (MKLDNN, 12 threads) are as follows:
+The test results are as follows:
 
-| Model | Input size | ONNX | Latency [ms](#latency) |
+| Model | Input size | ONNX | Latency [CPU](#latency) |
 | :-------- | :--------: | :---------------------: | :----------------: |
 | PicoDet-XS | 320*320 | [model](https://paddledet.bj.bcebos.com/deploy/third_engine/picodet_xs_320_coco_lcnet.onnx) | 3.9ms |
 | PicoDet-XS | 416*416 | [model](https://paddledet.bj.bcebos.com/deploy/third_engine/picodet_xs_416_coco_lcnet.onnx) | 6.1ms |
@@ -50,3 +45,5 @@ python './openvino_ppdet2 copy.py' --img_path ../../../../demo/000000570688.jpg
 | PicoDet-L | 320*320 | [model](https://paddledet.bj.bcebos.com/deploy/third_engine/picodet_l_320_coco_lcnet.onnx) | 11.5ms |
 | PicoDet-L | 416*416 | [model](https://paddledet.bj.bcebos.com/deploy/third_engine/picodet_l_416_coco_lcnet.onnx) | 20.7ms |
 | PicoDet-L | 640*640 | [model](https://paddledet.bj.bcebos.com/deploy/third_engine/picodet_l_640_coco.onnx) | 62.5ms |
+
+- Test environment: Intel Core i7-10750H CPU.
diff --git a/deploy/third_engine/demo_openvino/python/openvino_benchmark.py b/deploy/third_engine/demo_openvino/python/openvino_benchmark.py
index b852b75817e641eb5e248880dbcc89acbb2381b3..cd987feadb999fc8667ee086f7d0ccecb2cc72b3 100644
--- a/deploy/third_engine/demo_openvino/python/openvino_benchmark.py
+++ b/deploy/third_engine/demo_openvino/python/openvino_benchmark.py
@@ -19,7 +19,7 @@ import argparse
 from openvino.runtime import Core
 
 
-def image_preprocess_mobilenetv3(img_path, re_shape):
+def image_preprocess(img_path, re_shape):
     img = cv2.imread(img_path)
     img = cv2.resize(
         img, (re_shape, re_shape), interpolation=cv2.INTER_LANCZOS4)
@@ -38,7 +38,7 @@ def benchmark(img_file, onnx_file, re_shape):
     ie = Core()
     net = ie.read_model(onnx_file)
 
-    test_image = image_preprocess_mobilenetv3(img_file, re_shape)
+    test_image = image_preprocess(img_file, re_shape)
 
     compiled_model = ie.compile_model(net, 'CPU')
@@ -64,9 +64,9 @@ def benchmark(img_file, onnx_file, re_shape):
 
     time_avg = timeall / loop_num
 
-    print(
-        f'inference_time(ms): min={round(time_min*1000, 2)}, max = {round(time_max*1000, 1)}, avg = {round(time_avg*1000, 1)}'
-    )
+    print('inference_time(ms): min={}, max={}, avg={}'.format(
+        round(time_min * 1000, 2),
+        round(time_max * 1000, 1), round(time_avg * 1000, 1)))
 
 
 if __name__ == '__main__':
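
For reference, the min/max/avg bookkeeping around the timing loop whose print statement the last hunk rewrites can be sketched as below. This is a standalone illustration, not the script itself: `time_inference`, `loop_num`, and `warmup` are hypothetical names, and the real `openvino_benchmark.py` times calls into the OpenVINO `compiled_model` rather than an arbitrary callable.

```python
import time


def time_inference(infer_fn, loop_num=100, warmup=10):
    """Sketch of a latency benchmark loop (hypothetical helper):
    run a few untimed warmup calls, then report min/max/avg in ms."""
    for _ in range(warmup):
        infer_fn()  # warmup iterations are excluded from the statistics
    time_min, time_max, timeall = float('inf'), 0.0, 0.0
    for _ in range(loop_num):
        start = time.time()
        infer_fn()
        elapsed = time.time() - start
        timeall += elapsed
        time_min = min(time_min, elapsed)
        time_max = max(time_max, elapsed)
    time_avg = timeall / loop_num
    # Same output format as the rewritten print in the diff above
    print('inference_time(ms): min={}, max={}, avg={}'.format(
        round(time_min * 1000, 2),
        round(time_max * 1000, 1), round(time_avg * 1000, 1)))
    return time_min, time_max, time_avg
```

Keeping the warmup iterations out of the measured loop avoids counting one-time costs (cache warming, lazy initialization) in the reported latency.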