I know this topic is widely discussed, but as of the current release (DeepSpeech 0.8.2) I can't manage to generate the trie file for transfer learning.
1. Is the transfer-learning branch still the recommended way? (The original docs don't mention that we should clone that branch rather than master for transfer learning.)
2. Which is the last version that contains the utility needed to build the trie?
I understand that the trie is no longer needed when training from scratch, but after transfer learning I get this error:
```
I STARTING Optimization
I FINISHED optimization in 0:00:00.000007
terminate called after throwing an instance of 'lm::FormatLoadException'
  what():  ../kenlm/lm/model.cc:70 in lm::ngram::detail::GenericModel<Search, VocabularyT>::GenericModel(const char*, const lm::ngram::Config&) [with Search = lm::ngram::trie::TrieSearch<lm::ngram::SeparatelyQuantize, lm::ngram::trie::DontBhiksha>; VocabularyT = lm::ngram::SortedVocabulary] threw FormatLoadException because `new_config.enumerate_vocab && !parameters.fixed.has_vocabulary'.
The decoder requested all the vocabulary strings, but this binary file does not have them. You may need to rebuild the binary file with an updated version of build_binary.
```
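For context, the last line of the exception points at the fix KenLM itself suggests: regenerating the binary LM so it retains the vocabulary strings the decoder asks for. A minimal sketch of that step, assuming a compiled KenLM checkout and an ARPA file named `lm.arpa` (both names and paths are illustrative, not from my setup):

```shell
# Rebuild the binary LM from the original ARPA file using KenLM's
# build_binary tool; "trie" selects the trie data structure that the
# DeepSpeech decoder expects. Paths are assumptions for illustration.
kenlm/build/bin/build_binary trie lm.arpa lm.binary
```

I'm not sure whether this alone is enough on 0.8.2, since that release seems to package the LM differently than the older trie-based flow, which is exactly what my questions above are about.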