deploy/python/infer.py does not run inference and reports no error
Created by: SangerY
Environment: Windows 7 + CUDA 10 + cuDNN 7.6; the model is faster_rcnn_r101_vd_fpn_1x. When running prediction with the exported model, execution is interrupted without reporting any error.
The model was exported with the following command:
python tools/export_model.py -c configs/faster_rcnn/faster_rcnn_r101_vd_fpn_1x.yml
--output_dir=./inference_model
-o weights=output/faster_rcnn_r101_vd_fpn_1x/best_model
I then ran python deploy/python/infer.py on a video. The output is shown below; no error is reported, the script simply exits on its own (a sketch of the likely invocation follows the log):
Running Arguments
image_file:
model_dir: models/
output_dir: I:/results/
run_benchmark: False
run_mode: fluid
threshold: 0.3
use_gpu: True
video_file: I:/0001.mp4
Model Configuration
Model Arch: RCNN
Use Padddle Executor: False
Transform Order:
--transform op: Normalize
--transform op: Resize
--transform op: Permute
--transform op: PadStride
detect frame:1
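For reference, the exact infer.py command was not pasted above. A minimal sketch of what it was probably like, assuming the argparse flag names match the keys printed under Running Arguments (model_dir, video_file, output_dir, use_gpu, threshold), is:

python deploy/python/infer.py --model_dir=models/ --video_file=I:/0001.mp4 --output_dir=I:/results/ --use_gpu=True --threshold=0.3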
After changing use_python_inference to true in infer_cfg.yml (a hypothetical excerpt of that file is sketched at the end of this report), the script again exits straight away without reporting an error:

Running Arguments
image_file:
model_dir: models/
output_dir: I:/results/
run_benchmark: False
run_mode: fluid
threshold: 0.3
use_gpu: True
video_file: I:/0001.mp4
Model Configuration
Model Arch: RCNN
Use Padddle Executor: True
Transform Order:
--transform op: Normalize
--transform op: Resize
--transform op: Permute
--transform op: PadStride
W0721 14:08:58.003999 19100 device_context.cc:252] Please NOTE: device: 0, CUDA Capability: 61, Driver API Version: 10.0, Runtime API Version: 10.0
W0721 14:08:58.016000 19100 device_context.cc:260] device: 0, cuDNN Version: 7.6.
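For completeness, the change to infer_cfg.yml (the config generated next to the exported model in models/) was only flipping one key. A hypothetical excerpt, with field names assumed from what the log prints and all other keys omitted:

use_python_inference: true   # was false; true selects the plain Paddle Executor path shown in the second log
mode: fluid                  # assumed key; printed as run_mode: fluid
arch: RCNN                   # assumed key; printed as Model Arch: RCNN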