The decoders for deployment, implemented in C++, are a better alternative to the prototype decoders in Python, offering superior performance in both speed and accuracy.
### Installation
The build of the decoders for deployment depends on several open-source projects. First, clone or download them to the current directory (i.e., `deep_speech_2/deploy`):
- [**KenLM**](https://github.com/kpu/kenlm/): Faster and Smaller Language Model Queries
The decoders for deployment share almost the same interface as the prototype decoders in Python. Once the installation succeeds, they can be conveniently called from Python; refer to `deploy.py` for a complete example.
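
Below is a minimal sketch of what such a call from Python might look like. The module and names used here (`swig_decoders`, `Scorer`, `ctc_beam_search_decoder`) are assumptions for illustration only; consult `deploy.py` for the actual interface shipped with the project.

```python
# Hypothetical sketch of calling the deployed C++ decoders from Python.
# The module name `swig_decoders` and the functions/classes used below are
# assumptions -- see deploy.py for the real interface and parameter names.
import swig_decoders  # assumed name of the compiled C++ extension module

# Toy per-timestep posterior probabilities over a 3-symbol vocabulary.
probs_seq = [
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
]
vocabulary = ['a', 'b', 'c']

# Optional external scorer backed by a KenLM language model (hypothetical API).
scorer = swig_decoders.Scorer(alpha=2.0, beta=0.3, model_path='lm.arpa')

# Beam search decoding; assumed to return (score, transcript) pairs, best first.
results = swig_decoders.ctc_beam_search_decoder(
    probs_seq=probs_seq,
    beam_size=20,
    vocabulary=vocabulary,
    ext_scoring_func=scorer,
)
print(results[0])
```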