Error: You are not allowed to load partial data via load_combine_op, use load_op instead
Created by: wueching
λ 8aa2aa3cf875 /DeepSpeech/examples/tiny ./run_infer_golden.sh
Download language model ...
./zh_giga.no_cna_cmn.prune01244.klm already exists, download skipped.
Download Aishell model ...
./aishell_model_fluid.tar.gz already exists, download skipped.
mean_std.npz  README.md  vocab.txt  params.pdparams
grep: warning: GREP_OPTIONS is deprecated; please use an alias or script
----------- Configuration Arguments -----------
alpha: 2.5
beam_size: 500
beta: 0.3
cutoff_prob: 1.0
cutoff_top_n: 40
decoding_method: ctc_beam_search
error_rate_type: cer
infer_manifest: data/aishell/manifest.test
lang_model_path: models/lm/zh_giga.no_cna_cmn.prune01244.klm
mean_std_path: models/aishell/mean_std.npz
model_path: models/aishell
num_conv_layers: 2
num_proc_bsearch: 8
num_rnn_layers: 3
num_samples: 10
rnn_layer_size: 2048
share_rnn_weights: 1
specgram_type: linear
use_gpu: 0
use_gru: 0
vocab_path: models/aishell/vocab.txt
2020-01-03 01:31:14,458-INFO: begin to initialize the external scorer for decoding
2020-01-03 01:31:14,612-INFO: language model: is_character_based = 1, max_order = 5, dict_size = 0
2020-01-03 01:31:14,612-INFO: end initializing scorer
2020-01-03 01:31:14,613-INFO: start inference ...
/usr/local/lib/python2.7/dist-packages/paddle/fluid/executor.py:779: UserWarning: The following exception is not an EOF exception.
  "The following exception is not an EOF exception.")
Traceback (most recent call last):
  File "infer.py", line 152, in <module>
    main()
  File "infer.py", line 148, in main
    infer()
  File "infer.py", line 124, in infer
    feeding_dict=data_generator.feeding)
  File "/DeepSpeech/model_utils/model.py", line 412, in infer_batch_probs
    self.init_from_pretrained_model(exe, infer_program)
  File "/DeepSpeech/model_utils/model.py", line 161, in init_from_pretrained_model
    filename="params.pdparams")
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/io.py", line 798, in load_params
    filename=filename)
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/io.py", line 682, in load_vars
    filename=filename)
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/io.py", line 726, in load_vars
    executor.run(load_prog)
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/executor.py", line 780, in run
    six.reraise(*sys.exc_info())
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/executor.py", line 775, in run
    use_program_cache=use_program_cache)
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/executor.py", line 822, in _run_impl
    use_program_cache=use_program_cache)
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/executor.py", line 899, in _run_program
    fetch_var_name)
paddle.fluid.core_avx.EnforceNotMet:
C++ Call Stacks (More useful to developers):
0   std::string paddle::platform::GetTraceBackString<char const*>(char const*&&, char const*, int)
1   paddle::platform::EnforceNotMet::EnforceNotMet(std::__exception_ptr::exception_ptr, char const*, int)
2   paddle::operators::LoadCombineOpKernel<paddle::platform::CPUDeviceContext, float>::LoadParamsFromBuffer(paddle::framework::ExecutionContext const&, paddle::platform::Place const&, std::istream*, bool, std::vector<std::string, std::allocator<std::string> > const&) const
3   paddle::operators::LoadCombineOpKernel<paddle::platform::CPUDeviceContext, float>::Compute(paddle::framework::ExecutionContext const&) const
4   std::_Function_handler<void (paddle::framework::ExecutionContext const&), paddle::framework::OpKernelRegistrarFunctor<paddle::platform::CPUPlace, false, 0ul, paddle::operators::LoadCombineOpKernel<paddle::platform::CPUDeviceContext, float>, paddle::operators::LoadCombineOpKernel<paddle::platform::CPUDeviceContext, double>, paddle::operators::LoadCombineOpKernel<paddle::platform::CPUDeviceContext, int>, paddle::operators::LoadCombineOpKernel<paddle::platform::CPUDeviceContext, signed char>, paddle::operators::LoadCombineOpKernel<paddle::platform::CPUDeviceContext, long> >::operator()(char const*, char const*, int) const::{lambda(paddle::framework::ExecutionContext const&)#1}>::_M_invoke(std::_Any_data const&, paddle::framework::ExecutionContext const&)
5   paddle::framework::OperatorWithKernel::RunImpl(paddle::framework::Scope const&, paddle::platform::Place const&, paddle::framework::RuntimeContext*) const
6   paddle::framework::OperatorWithKernel::RunImpl(paddle::framework::Scope const&, paddle::platform::Place const&) const
7   paddle::framework::OperatorBase::Run(paddle::framework::Scope const&, paddle::platform::Place const&)
8   paddle::framework::Executor::RunPreparedContext(paddle::framework::ExecutorPrepareContext*, paddle::framework::Scope*, bool, bool, bool)
9   paddle::framework::Executor::Run(paddle::framework::ProgramDesc const&, paddle::framework::Scope*, int, bool, bool, std::vector<std::string, std::allocator<std::string> > const&, bool)
Python Call Stacks (More useful to users):
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/framework.py", line 2488, in append_op
    attrs=kwargs.get("attrs", None))
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/io.py", line 725, in load_vars
    attrs={'file_path': os.path.join(load_dirname, filename)})
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/io.py", line 682, in load_vars
    filename=filename)
  File "/usr/local/lib/python2.7/dist-packages/paddle/fluid/io.py", line 798, in load_params
    filename=filename)
  File "/DeepSpeech/model_utils/model.py", line 161, in init_from_pretrained_model
    filename="params.pdparams")
  File "/DeepSpeech/model_utils/model.py", line 412, in infer_batch_probs
    self.init_from_pretrained_model(exe, infer_program)
  File "infer.py", line 124, in infer
    feeding_dict=data_generator.feeding)
  File "infer.py", line 148, in main
    infer()
  File "infer.py", line 152, in <module>
    main()
Error Message Summary:
Error: You are not allowed to load partial data via load_combine_op, use load_op instead. at (/paddle/paddle/fluid/operators/load_combine_op.h:105)
[operator < load_combine > error]

Failed in inference!
How can I solve this problem? Thanks.
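For context on what the error message is asserting: a load_combine-style loader reads one combined file (here `params.pdparams`) that must contain exactly the set of parameters the program asks for; if the program requests variables the file does not hold (for example, a corrupted/partial download of `aishell_model_fluid.tar.gz`, or `infer.py` hyperparameters such as `num_rnn_layers`, `rnn_layer_size`, `use_gru`, or `share_rnn_weights` that differ from those the released model was trained with), it refuses to load "partial data". The toy sketch below mimics that contract only; it is not Paddle's on-disk format, and the variable names are invented:

```python
# Toy illustration of the "no partial data" contract enforced by
# load_combine-style loaders. NOT Paddle's real file format -- it only
# mimics the failure mode: the reader insists that the combined file
# contain exactly the requested variables, nothing more, nothing less.
import io
import pickle


def save_combined(params, stream):
    # Serialize all parameters into one combined stream,
    # analogous to saving a single params.pdparams file.
    pickle.dump(params, stream)


def load_combined(names, stream):
    stored = pickle.load(stream)
    if set(names) != set(stored):
        # Mirrors: "You are not allowed to load partial data via
        # load_combine_op, use load_op instead."
        raise ValueError("combined file does not match requested variables")
    return {n: stored[n] for n in names}


buf = io.BytesIO()
save_combined({"fc_0.w_0": [1.0], "fc_0.b_0": [0.0]}, buf)

buf.seek(0)
ok = load_combined(["fc_0.w_0", "fc_0.b_0"], buf)  # exact match: loads fine

buf.seek(0)
try:
    # Program expects an extra variable the file never contained.
    load_combined(["fc_0.w_0", "fc_0.b_0", "gru_0.w_0"], buf)
except ValueError as e:
    print(e)
```

Given that contract, a reasonable first check is to delete and re-download/re-extract the model archive, and to confirm the configuration arguments above match the released Aishell model's training settings, so that the parameter set in `params.pdparams` and the one the inference program builds line up again.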