I read about ongoing efforts to port the model to TensorFlow Lite flatbuffers.
Could you let me know whether TensorFlow Lite support will be a first-class citizen in the DeepSpeech project, and whether you plan to use only tflite ops going forward?
P.S. Have you already tried streaming ASR with tflite?
We are using TensorFlow Lite for other on-device inference tasks and can help with testing performance on various devices.