Loss is infinity on the test dataset

Hey @lissyx @reuben @kdavis

Hope you all are doing well.

I trained my model on a dataset of 6 lakh (600,000) audio files with DeepSpeech 0.4.1.
It trained for 12 epochs, from epoch 30 to epoch 42.
Training loss: 2.8
Validation loss: 4.19

But the result I am getting on the test dataset is weird, i.e.

Test - WER: 0.296655, CER: 0.085394, loss: inf

How is an infinite loss possible when the WER and CER are good? And why am I receiving an infinite loss only on the test dataset? Could it be due to a corrupt file or something? Kindly help. Thanks in advance.
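For what it's worth, a single bad sample is enough to produce this: the reported test loss is an average, and one sample with an infinite CTC loss drives the whole average to `inf`, while WER and CER are computed from the decoded transcripts and stay finite. A minimal illustration (the numbers here are made up, not from this run):

```python
import math

# Per-sample test losses: three normal samples plus one where the
# CTC loss blew up to infinity (e.g. a corrupt or mislabeled file).
losses = [2.1, 3.4, float("inf"), 2.8]

# Averaging propagates the single inf to the whole test loss ...
mean_loss = sum(losses) / len(losses)
print(mean_loss)  # → inf

# ... while a per-sample error rate over the same files stays finite,
# because decoding still produces some hypothesis for every sample.
wers = [0.25, 0.30, 0.35, 0.28]
print(sum(wers) / len(wers))  # → 0.295
```

So a finite, reasonable WER/CER alongside an infinite loss is consistent with one or a few problematic test samples.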

Usually infinite loss is a signal of a learning rate that is too high. What was your learning rate?

Learning rate: 0.0001

But I haven't received an infinite loss during training or validation.
I am only receiving it on the test dataset, and the WER and CER still make sense.
Also, can I ask why only 10 results appear after the test run? Are they the top 10, or something else?

I think a corrupt file is causing the infinite loss on the test data. Is there any way to find the corrupt file that is causing it?