Error occurred when running inference with my own generated language model?
Created by: wujsy
Hi, since I am working on a special domain, I trained and generated a binary language model using the KenLM tool, but errors occurred when running inference in DeepSpeech:

I0720 14:15:50.371575  9838 Util.cpp:166] commandline: --use_gpu=1 --rnn_use_batch=True --trainer_count=8
[INFO 2018-07-20 14:15:57,485 layers.py:2689] output for conv_0: c = 32, h = 81, w = 54, size = 139968
[INFO 2018-07-20 14:15:57,486 layers.py:3251] output for batch_norm_0: c = 32, h = 81, w = 54, size = 139968
[INFO 2018-07-20 14:15:57,487 layers.py:7409] output for scale_sub_region_0: c = 32, h = 81, w = 54, size = 139968
[INFO 2018-07-20 14:15:57,488 layers.py:2689] output for conv_1: c = 32, h = 41, w = 54, size = 70848
[INFO 2018-07-20 14:15:57,488 layers.py:3251] output for batch_norm_1: c = 32, h = 41, w = 54, size = 70848
[INFO 2018-07-20 14:15:57,489 layers.py:7409] output for scale_sub_region_1: c = 32, h = 41, w = 54, size = 70848
[INFO 2018-07-20 14:16:02,241 model.py:243] begin to initialize the external scorer for decoding
Loading the LM will be faster if you build a binary file.
Reading /data/chensong/lm_arpa/words_forum_o5.arpa
----5---10---15---20---25---30---35---40---45---50---55---60---65---70---75---80---85---90---95--100
[INFO 2018-07-20 14:25:22,499 model.py:253] language model: is_character_based = 0, max_order = 3, dict_size = 6950728
[INFO 2018-07-20 14:25:22,500 model.py:254] end initializing scorer
[INFO 2018-07-20 14:25:22,501 test.py:98] start evaluation ...
I0720 14:25:27.228785  9838 MultiGradientMachine.cpp:99] numLogicalDevices=1 numThreads=8 numDevices=8
F0720 14:25:27.982380  9991 Vector.cpp:266] Check failed: src.getSize() == this->getSize() (4596 vs. 4712)
*** Check failure stack trace: ***
F0720 14:25:27.984078  9987 Vector.cpp:266] Check failed: src.getSize() == this->getSize() (9412608 vs. 9650176)
*** Check failure stack trace: ***
    @     0x7fc11dc5927d  google::LogMessage::Fail()
    @     0x7fc11dc5cd2c  google::LogMessage::SendToLog()
    @     0x7fc11dc58da3  google::LogMessage::Flush()
    @     0x7fc11dc5e23e  google::LogMessageFatal::~LogMessageFatal()
    @     0x7fc11da5e17a  paddle::GpuVectorT<>::copyFrom()
    @     0x7fc11d9801f9  paddle::TrainerThread::valueDispatchThread()
    @     0x7fc1861f7c80  (unknown)
    @     0x7fc18eeda6ba  start_thread
    @     0x7fc18ec103dd  clone
    @              (nil)  (unknown)
Aborted (core dumped)

(The trace above is printed twice, interleaved, because two trainer threads hit the same check; it is shown once here.)

As @kuke said in issue #161, "The decoder supports language model both in binary and arpa format." The same error happened when I used the ARPA format.
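For context, the binary model was produced with the standard KenLM workflow, roughly as sketched below (the corpus path and output names here are placeholders, not the exact commands I ran; the n-gram order matches the max_order = 3 reported in the log):

```shell
# 1. Estimate an ARPA n-gram model from a tokenized text corpus
#    (-o sets the n-gram order; 3 matches the max_order in the log above):
kenlm/build/bin/lmplz -o 3 <corpus.txt >words_forum.arpa

# 2. Convert the ARPA file to KenLM's binary format for faster loading:
kenlm/build/bin/build_binary words_forum.arpa words_forum.klm
```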
Does anyone have any ideas? I'd appreciate the help.