TensorRT acceleration error
Created by: 1145520074
The error is shown below. What is the cause?

--- Running analysis [ir_graph_build_pass]
--- Running analysis [ir_analysis_pass]
--- Running IR pass [infer_clean_graph_pass]
--- Running IR pass [conv_affine_channel_fuse_pass]
--- Running IR pass [conv_eltwiseadd_affine_channel_fuse_pass]
--- Running IR pass [tensorrt_subgraph_pass]
--- detect a sub-graph with 7 nodes
I0705 18:12:22.623015 19561 tensorrt_subgraph_pass.cc:252] Prepare TRT engine (Optimize model structure, Select OP kernel etc). This process may cost a lot of time.
terminate called after throwing an instance of 'paddle::platform::EnforceNotMet'
  what():  Enforce failed. Expected shape.size() > 1UL, but received shape.size():1 <= 1UL:1. TensorRT' tensor input requires at least 2 dimensions at [/shixiaowei02/Paddle_1.4.1/Paddle/paddle/fluid/inference/tensorrt/convert/op_converter.h:52]
PaddlePaddle Call Stacks:
0   0x7f4e11ae9fa7p  void paddle::platform::EnforceNotMet::Init<std::string>(std::string, char const*, int) + 1479
1   0x7f4e11aeb6b5p  paddle::platform::EnforceNotMet::EnforceNotMet(std::string const&, char const*, int) + 85
2   0x7f4e1274729bp  paddle::inference::tensorrt::OpConverter::ConvertBlockToTRTEngine(paddle::framework::BlockDesc*, paddle::framework::Scope const&, std::vector<std::string, std::allocator<std::string> > const&, std::unordered_set<std::string, std::hash<std::string>, std::equal_to<std::string>, std::allocator<std::string> > const&, std::vector<std::string, std::allocator<std::string> > const&, paddle::inference::tensorrt::TensorRTEngine*) + 1563
3   0x7f4e131b5900p  paddle::inference::analysis::TensorRtSubgraphPass::CreateTensorRTOp(paddle::framework::ir::Node*, paddle::framework::ir::Graph*, std::vector<std::string, std::allocator<std::string> > const&, std::vector<std::string, std::allocator<std::string> >) const + 9984
4   0x7f4e131b6fe5p  paddle::inference::analysis::TensorRtSubgraphPass::ApplyImpl(paddle::framework::ir::Graph*) const + 549
5   0x7f4e13320210p  paddle::framework::ir::Pass::Apply(paddle::framework::ir::Graph*) const + 192
6   0x7f4e132c71e0p  paddle::inference::analysis::IRPassManager::Apply(std::unique_ptr<paddle::framework::ir::Graph, std::default_delete<paddle::framework::ir::Graph> >) + 240
7   0x7f4e132b6d53p  paddle::inference::analysis::IrAnalysisPass::RunImpl(paddle::inference::analysis::Argument*) + 755
8   0x7f4e131c21fdp  paddle::inference::analysis::Analyzer::RunAnalysis(paddle::inference::analysis::Argument*) + 909
9   0x7f4e11afcba8p  paddle::AnalysisPredictor::OptimizeInferenceProgram() + 88
10  0x7f4e11afdd5fp  paddle::AnalysisPredictor::PrepareProgram(std::shared_ptr<paddle::framework::ProgramDesc> const&) + 351
11  0x7f4e11afdec7p  paddle::AnalysisPredictor::Init(std::shared_ptr<paddle::framework::Scope> const&, std::shared_ptr<paddle::framework::ProgramDesc> const&) + 311
12  0x7f4e11afe32cp  std::unique_ptr<paddle::PaddlePredictor, std::default_delete<paddle::PaddlePredictor> > paddle::CreatePaddlePredictor<paddle::AnalysisConfig, (paddle::PaddleEngineKind)2>(paddle::AnalysisConfig const&) + 1068
13  0x7f4e11afed81p  std::unique_ptr<paddle::PaddlePredictor, std::default_delete<paddle::PaddlePredictor> > paddle::CreatePaddlePredictor<paddle::AnalysisConfig>(paddle::AnalysisConfig const&) + 17
14  0x7f4e5f131207p
15  0x7f4e5f1320d1p  create_cnnpredict + 129
16  0x7f4e5c5461e1p
17  0x7f4e5c549504p
18  0x472cdep
19  0x46dd00p
20  0x467e59p
21  0x4740e3p
22  0x46cee2p
23  0x7f4e652c5bd5p  __libc_start_main + 245
24  0x4610f5p