Problem encountered when converting ONNX to Paddle
Created by: CoderAnn
I ran into a problem when converting ONNX to Paddle and would appreciate some advice. I started with torch 1.1.0, which raised an error when exporting to ONNX; after downgrading torch to 1.0.1, the PyTorch-to-ONNX export worked, but the ONNX-to-Paddle conversion then failed with the error below. How can I fix this? Thanks in advance! (PS: my network uses an upsampling layer.)
```
Now translating model from onnx to paddle. model ir_version: 3, op version: 9
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 2147483647 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 1585340438
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 2147483647 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 1585340438
(op_type:Tanh, name:): Inferred elem type differs from existing elem type: (INT64) vs (FLOAT)
Traceback (most recent call last):
  File "/home/vis/hongzhibin/env/anaconda3/bin/x2paddle", line 11, in <module>
    load_entry_point('x2paddle==0.7.1', 'console_scripts', 'x2paddle')()
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/x2paddle-0.7.1-py3.6.egg/x2paddle/convert.py", line 248, in main
    onnx2paddle(args.model, args.save_dir, params_merge)
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/x2paddle-0.7.1-py3.6.egg/x2paddle/convert.py", line 172, in onnx2paddle
    model = ONNXDecoder(model_path)
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/x2paddle-0.7.1-py3.6.egg/x2paddle/decoder/onnx_decoder.py", line 325, in __init__
    self.check_model_running_state(onnx_model)
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/x2paddle-0.7.1-py3.6.egg/x2paddle/decoder/onnx_decoder.py", line 481, in check_model_running_state
    model = onnx.shape_inference.infer_shapes(model)
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/onnx/shape_inference.py", line 35, in infer_shapes
    inferred_model_str = C.infer_shapes(model_str)
RuntimeError: Inferred elem type differs from existing elem type: (INT64) vs (FLOAT)
```
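For reference, the same shape-inference failure can be reproduced outside of x2paddle with a small check script; as the traceback shows, the converter itself calls `onnx.shape_inference.infer_shapes`. The file name below is just a placeholder for my exported model.

```python
import onnx

# Load the exported model (placeholder path) and run the same checks
# that x2paddle performs before conversion.
model = onnx.load("model.onnx")

# Structural validation of the graph.
onnx.checker.check_model(model)

# This is the call that fails inside x2paddle's check_model_running_state;
# for my model it raises:
# "Inferred elem type differs from existing elem type: (INT64) vs (FLOAT)".
inferred = onnx.shape_inference.infer_shapes(model)
print("shape inference finished")
```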
I also tried exporting to ONNX with torch 1.2.0, and ran into a different problem when converting that ONNX model to Paddle. Any torch version is fine with me, as long as I can get from ONNX to Paddle:
```
Traceback (most recent call last):
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/x2paddle-0.7.1-py3.6.egg/x2paddle/decoder/onnx_decoder.py", line 494, in check_model_running_state
    sess = rt.InferenceSession(model_path)
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/onnxruntime/capi/session.py", line 23, in __init__
    self._load_model()
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/onnxruntime/capi/session.py", line 35, in _load_model
    self._sess.load_model(self._path_or_bytes, providers)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: /onnxruntime_src/onnxruntime/core/providers/cpu/tensor/upsample.h:209 void onnxruntime::UpsampleBase::ScalesValidation(const std::vector<float>&, onnxruntime::UpsampleMode) const scale >= 1 was false. Scale value should be greater than or equal to 1.
Stacktrace:

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/vis/hongzhibin/env/anaconda3/bin/x2paddle", line 11, in <module>
    load_entry_point('x2paddle==0.7.1', 'console_scripts', 'x2paddle')()
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/x2paddle-0.7.1-py3.6.egg/x2paddle/convert.py", line 248, in main
    onnx2paddle(args.model, args.save_dir, params_merge)
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/x2paddle-0.7.1-py3.6.egg/x2paddle/convert.py", line 172, in onnx2paddle
    model = ONNXDecoder(model_path)
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/x2paddle-0.7.1-py3.6.egg/x2paddle/decoder/onnx_decoder.py", line 325, in __init__
    self.check_model_running_state(onnx_model)
  File "/home/vis/hongzhibin/env/anaconda3/lib/python3.6/site-packages/x2paddle-0.7.1-py3.6.egg/x2paddle/decoder/onnx_decoder.py", line 503, in check_model_running_state
    "onnxruntime inference onnx model failed, Please confirm the correctness of onnx model by onnxruntime, if onnx model is correct, please submit issue in github."
Exception: onnxruntime inference onnx model failed, Please confirm the correctness of onnx model by onnxruntime, if onnx model is correct, please submit issue in github.
```
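Since the last message asks to confirm the model with onnxruntime, a minimal way to check it outside of x2paddle is sketched below; as the traceback shows, the same `rt.InferenceSession(model_path)` call is what fails during initialization. The file path and input shape here are placeholders for my actual model.

```python
import numpy as np
import onnxruntime as rt

# Creating the session directly in onnxruntime (placeholder path) hits the
# same Upsample ScalesValidation error during initialization, before any
# inference is run.
sess = rt.InferenceSession("model.onnx")

# If the session did initialize, run one dummy forward pass to confirm the
# model executes end to end (input name and shape are assumptions).
input_name = sess.get_inputs()[0].name
dummy = np.random.randn(1, 3, 224, 224).astype(np.float32)
outputs = sess.run(None, {input_name: dummy})
print([o.shape for o in outputs])
```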