Load the frozen graph and do inference

Hi, this is my question, and it actually comes from an earlier issue on DeepSpeech’s GitHub.

I believe @reuben commented the following on that issue:

Oh, so you want to use TensorFlow to directly load the exported graph? Then you’ll have to use the checkpoint. Our pre-trained models are meant to be used with the native client or their bindings for Python/NodeJS/Rust/etc. Any direct manipulation should be done via the checkpoint rather than the exported model to avoid any weirdness (like the metadata node for example).

How exactly do I convert a checkpoint to a frozen model? I’m using the first DeepSpeech release, v0.1.0, if that helps. I have the default pre-trained model (output_graph.pb), but I just can’t seem to convert new checkpoints to frozen models. I have code that exports the checkpoint files to a frozen model, but I’m not sure which output nodes to export. I’ve tried “output_node” and also “save/restore_all”. A frozen model is created, sure, but the deepspeech client then fails with “Input 0 of node save/Assign was passed float from b1:0 incompatible with expected float_ref”.
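For reference, this is roughly the freezing code I’m working with, just a minimal sketch: the checkpoint path is a placeholder for my own directory, and the output node name “logits” is my guess at the final inference op in the v0.1.0 graph, so it may need to be changed after inspecting the graph.

```python
import tensorflow as tf
from tensorflow.python.framework import graph_util

# Placeholder paths -- adjust to the actual checkpoint and output locations.
checkpoint_path = 'checkpoints/model.ckpt-12345'
output_graph_path = 'output_graph.pb'

# Assumed output node name; inspect the graph to confirm the real one.
output_node_names = ['logits']

with tf.Session() as sess:
    # Rebuild the graph from the .meta file and restore the weights.
    saver = tf.train.import_meta_graph(checkpoint_path + '.meta')
    saver.restore(sess, checkpoint_path)

    # Convert every variable reachable from the output nodes into constants.
    # Saver ops such as "save/restore_all" are deliberately not listed here.
    frozen_graph_def = graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_node_names)

    with tf.gfile.GFile(output_graph_path, 'wb') as f:
        f.write(frozen_graph_def.SerializeToString())
```

My understanding is that listing saver ops like “save/restore_all” as output nodes keeps the variable-restore machinery in the frozen graph, which might be where the float vs. float_ref mismatch comes from, but I’d appreciate confirmation of the correct output node names for v0.1.0.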

Thank you very much for any help.