DeepSpeech Latest Results with English

Hello team, could you please point me to a link with the latest WER results for the pre-trained English DeepSpeech model?

It’s not yet public, but it will be in the next couple of days.


You are doing great work, guys. Keep it up…

Now it’s public.


@kdavis: Could you also share the approximate training data size, in hours, used to train the model?

The data sets used are indicated in the release notes linked to in my previous comment.

@kdavis, is 0.5.0 trained only on LibriSpeech data? What about the 22 GB Mozilla open corpus?

No. See the release notes.

Hi @kdavis, as per the release notes:

deepspeech model is trained on American English which achieves an 8.22% word error rate on the LibriSpeech clean test corpus
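For context, the word error rate quoted there is the word-level Levenshtein (edit) distance between the model's transcript and the reference transcript, divided by the number of reference words. A minimal sketch (the function and example strings below are my own illustration, not from the DeepSpeech release):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = minimum edits to turn ref[:i] into hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution (or match)
    return d[len(ref)][len(hyp)] / len(ref)

# One insertion against a two-word reference gives WER 0.5.
print(wer("hello world", "hello there world"))
```

So the 8.22% figure means roughly 8 word-level errors per 100 reference words on that test set.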

It is mentioned that the LibriSpeech clean test set is used for evaluation, but it is not clearly stated which training set was used to produce these checkpoints.

Please help me clarify this.

The release notes state:

train_files Fisher, LibriSpeech, and Switchboard training corpora.
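For reference, those corpora are passed to `DeepSpeech.py` as comma-separated CSV lists via the `--train_files` flag. A sketch of what such an invocation looks like (the CSV file names and checkpoint directory below are placeholders, not the actual paths used for the release):

```shell
# Sketch of a DeepSpeech training run; CSV paths are placeholders.
python3 DeepSpeech.py \
  --train_files fisher-train.csv,librispeech-train.csv,switchboard-train.csv \
  --dev_files librispeech-dev-clean.csv \
  --test_files librispeech-test-clean.csv \
  --checkpoint_dir ./checkpoints
```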

Oh, I thought that was for fine-tuning. Anyway, thank you so much…