Cannot find DeepSpeech binaries compatible with TensorFlow 1.6 to run inference on a trained model

I have trained a model with DeepSpeech on TensorFlow 1.6 and am running inference with DeepSpeech binaries built for TensorFlow 1.4, because I have not found newer binaries. I am using macOS El Capitan. I get the following error, which is expected given the version mismatch:

ghassen$ deepspeech data/results/model_export/output_graph.pb data/alphabet.txt data/results/zweiplaneten5.wav
Loading model from file data/results/model_export/output_graph.pb
Warning: reading entire model file into memory. Transform model file into an mmapped graph to reduce heap usage.
2018-04-20 14:30:02.626837: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
Loaded model in 0.064s.
Running inference.
2018-04-20 14:30:02.936198: E tensorflow/core/common_runtime/executor.cc:651] Executor failed to create kernel. Invalid argument: NodeDef mentions attr 'index_type' not in Op<name=Fill; signature=dims:int32, value:T -> output:T; attr=T:type>; NodeDef: bidirectional_rnn/fw/fw/zeros = Fill[T=DT_FLOAT, index_type=DT_INT32, _device="/job:localhost/replica:0/task:0/device:CPU:0"](bidirectional_rnn/fw/fw/concat, bidirectional_rnn/fw/fw/zeros/Const). (Check whether your GraphDef-interpreting binary is up to date with your GraphDef-generating binary.).
[[Node: bidirectional_rnn/fw/fw/zeros = Fill[T=DT_FLOAT, index_type=DT_INT32, _device="/job:localhost/replica:0/task:0/device:CPU:0"](bidirectional_rnn/fw/fw/concat, bidirectional_rnn/fw/fw/zeros/Const)]]
Error running session: Invalid argument: NodeDef mentions attr 'index_type' not in Op<name=Fill; signature=dims:int32, value:T -> output:T; attr=T:type>; NodeDef: bidirectional_rnn/fw/fw/zeros = Fill[T=DT_FLOAT, index_type=DT_INT32, _device="/job:localhost/replica:0/task:0/device:CPU:0"](bidirectional_rnn/fw/fw/concat, bidirectional_rnn/fw/fw/zeros/Const). (Check whether your GraphDef-interpreting binary is up to date with your GraphDef-generating binary.).
[[Node: bidirectional_rnn/fw/fw/zeros = Fill[T=DT_FLOAT, index_type=DT_INT32, _device="/job:localhost/replica:0/task:0/device:CPU:0"](bidirectional_rnn/fw/fw/concat, bidirectional_rnn/fw/fw/zeros/Const)]]
None
Inference took 0.252s for 7.220s audio file
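The error lines above both point at the same mismatch: the Fill op in the exported graph carries an index_type attribute that TensorFlow added after 1.4, so the older binary's op registry rejects the NodeDef. As a quick sanity check before hunting for binaries, a crude byte-level scan of the exported output_graph.pb can flag whether the graph mentions that attribute at all. This is a throwaway helper of my own, not part of DeepSpeech, and not a real protobuf parse; attr names happen to be stored as plain strings in a serialized GraphDef, so a simple search is enough for a yes/no answer:

```python
# Throwaway diagnostic (not part of DeepSpeech): byte-level search of a
# serialized GraphDef for attribute names that older TensorFlow runtimes
# do not register. Not a real protobuf parse, but attr names appear as
# plain strings in the file, so a scan is enough to flag the mismatch.

SUSPECT_ATTRS = [b"index_type"]  # Fill's index_type attr, unknown to TF 1.4

def find_unknown_attrs(graph_bytes):
    """Return the suspect attribute names that occur in the serialized graph."""
    return [name.decode() for name in SUSPECT_ATTRS if name in graph_bytes]

if __name__ == "__main__":
    with open("data/results/model_export/output_graph.pb", "rb") as f:
        hits = find_unknown_attrs(f.read())
    if hits:
        print("Graph uses attrs unknown to older binaries:", hits)
    else:
        print("No known-problematic attrs found.")
```

If the scan finds index_type, the graph definitely needs binaries built against a TensorFlow that knows that attribute, i.e. newer than the 1.4 build used above.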

I do not want to run training again because it takes so long. Are there binaries for macOS that I can use to run inference on my trained model (TensorFlow 1.6)?

As I already replied, it is documented here: https://github.com/mozilla/DeepSpeech/blob/master/native_client/README.md#installation
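For anyone landing here later: the linked README describes fetching prebuilt native_client packages with the repository's util/taskcluster.py helper. Something along these lines fetches the macOS build; the helper and its flags are taken from that README, and exact flag names may differ between DeepSpeech releases, so treat this as a sketch rather than a verified command:

```shell
# Sketch, assuming the util/taskcluster.py helper documented in the
# native_client README (flags may vary between DeepSpeech releases).
# Run from a checkout of the DeepSpeech repository.
python util/taskcluster.py --arch osx --target native_client

# The extracted package contains the deepspeech binary used for inference:
./native_client/deepspeech output_graph.pb alphabet.txt audio.wav
```

The key point is to download binaries built against the same TensorFlow generation that exported the graph, rather than reusing an older native_client package.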

Thank you. I had not noticed the downloaded deepspeech binary file.