Could not create model for v0.6.0

I have built an Arabic language model before with v0.3.0. I could train, infer, etc.

Now I have moved to v0.6.0 with a fresh environment, restarted the training, and exported a model. When I try to use the new native client with the command:

./native_client/deepspeech --model model_0.6.pbmm --lm lm.binary --trie lm.trie --audio wav/002255_test.wav

I get this error that I cannot escape:

TensorFlow: v1.14.0-21-ge77504a
DeepSpeech: v0.6.0-0-g6d43e21
2019-12-19 16:28:21.312176: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
Unable to fetch graph version: Invalid argument: Session was not created with a graph before Run()!
Could not create model.

When I googled the error, it seemed like older incompatible files exist somewhere. But the log above shows the correct versions for both TF and DeepSpeech. I have also used the new ./native_client/generate_trie to update the lm.trie file.
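
For reference, assuming I am remembering the 0.6.0 usage correctly, generate_trie takes the alphabet, the binary LM, and the output trie path (the file names here are just placeholders for my own files):

./native_client/generate_trie alphabet.txt lm.binary lm.trie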

It seems like only lm.binary is left over from the older format. Does it need an update? How can I update it?

Regenerate it; look at data/lm/ for examples.
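
Roughly, assuming a KenLM build and a plain-text corpus (corpus.txt and myLM.arpa are placeholder names, and the order/quantization values are just what data/lm/ uses):

lmplz --order 5 --text corpus.txt --arpa myLM.arpa
build_binary -a 255 -q 8 trie myLM.arpa lm.binary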

@tarekeldeeb Take a look at our repo https://github.com/Tarteel-io/deepspeech for a generated lm.binary and associated files 🙂

I’ve been running into issues myself (see post here). Let me know if you’re able to get up and running using the generated binary and trie from our repo.

Thanks @lissyx for your reply.
I regenerated the lm.binary using

/kenlm_orig/build/bin/build_binary -a 255 -q 8 trie myLM.arpa lm.binary

But I ended up with the same error. The problem is now solved anyway.

When I use the model.pb (not the converted .pbmm), everything works as expected. The .pbmm seems to carry an incompatibility left over from my 0.3 setup.
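
In case it helps someone else: I assume the .pbmm has to be re-exported with the convert_graphdef_memmapped_format tool built against the same TensorFlow as the 0.6.0 native client, something like this (file names are placeholders):

./convert_graphdef_memmapped_format --in_graph=output_graph.pb --out_graph=output_graph.pbmm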

Regards

Thanks Anas, a binary LM cannot simply be shared. I tried it anyway (with my graph), but it did not work as expected.

As my issue is now fixed, I have shared my .pb. You can try running inference with it.

That’s excellent to hear!
We’re able to train as well. Will keep you posted.

Were you able to run on 0.6.0, though, or 0.5.1?