TensorFlow Lite inference


(Bartek) #1

Hi!

I read about the ongoing efforts to port the model to the TensorFlow Lite FlatBuffers format.

Please let me know: will TensorFlow Lite support be a first-class citizen in the DeepSpeech project, and will you use only TFLite ops in the future?

P.S. Have you already tried streaming ASR using TFLite?

We are using TensorFlow Lite for other inference tasks on device and can help with testing performance on various devices.


(kdavis) #2

TensorFlow Lite support will be a first-class citizen in DeepSpeech, and we will use only TFLite ops in the future. This is scheduled for v0.5.0, which should be out before year's end.


(Rene Peinl) #3

Hi there. We are also interested in the TFLite support. Any updates on that?


(Lissyx) #4

It’s already available, but not officially endorsed yet.