Great! That gets back to the 28s mark on the test file.
There are some minor differences in the output vs. the 0.6.1 armv7 package. I assume these don’t matter (artifacts of a one-off test build from the current tree rather than the release version), but in case it’s something you want to address: the test build apparently has an older TensorFlow (1.12 vs. 1.14) and is missing the “INFO: Initialized TensorFlow Lite runtime.” line:
Raspberry Pi armv7 output:
Loading model from file deepspeech-0.6.1-models/output_graph.tflite
TensorFlow: v1.14.0-21-ge77504a
DeepSpeech: v0.6.1-0-g3df20fe
INFO: Initialized TensorFlow Lite runtime.
Loaded model in 0.00548s.
Loading language model from files deepspeech-0.6.1-models/lm.binary deepspeech-0.6.1-models/trie
Loaded language model in 0.00236s.
Running inference.
aarch64 test build output:
Loading model from file deepspeech-0.6.1-models/output_graph.tflite
TensorFlow: v1.12.0-22283-g917d341
DeepSpeech: v0.6.1-alpha.0-80-g5a509f5
Loaded model in 0.00142s.
Loading language model from files deepspeech-0.6.1-models/lm.binary deepspeech-0.6.1-models/trie
Loaded language model in 0.00042s.
Running inference.
Thanks again!