Decoder stopped with 'max_decoder_steps'

Hi,
I trained the model with the tacotron1 config file. The output results are good, but at inference time some sentences are not decoded correctly and I get the following warning:
“Decoder stopped with 'max_decoder_steps'”

How can we get these sentences to decode correctly? Or how else can we deal with this problem?

AFAIK this means that the model can’t render that sentence because it is too long/complex. So you can either train with more material or split the sentence up into smaller chunks.
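
If you go the chunking route, a minimal sketch is below: split the input on sentence boundaries, keep each chunk short, and synthesize the chunks one by one. Here `synthesize(text)` is just a placeholder for whatever synthesis call you already use, and `max_chars=150` is an arbitrary limit you would tune for your model.

```python
import re

def split_into_sentences(text, max_chars=150):
    """Split on sentence-ending punctuation, then merge pieces so that
    each chunk stays under max_chars characters."""
    pieces = re.split(r'(?<=[.!?])\s+', text.strip())
    chunks, current = [], ""
    for piece in pieces:
        if current and len(current) + len(piece) + 1 > max_chars:
            chunks.append(current)
            current = piece
        else:
            current = f"{current} {piece}".strip()
    if current:
        chunks.append(current)
    return chunks

def synthesize_long_text(text, synthesize):
    """Run each chunk through your own synthesis function so no single
    input exceeds the decoder's step budget."""
    wavs = []
    for chunk in split_into_sentences(text):
        wavs.append(synthesize(chunk))  # placeholder call; swap in your setup
    return wavs
```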

This warning can also occur for some simple, short sentences.

For example, in one sentence that triggers it, changing just a single character makes the warning go away.
I can’t understand the reason for this.

Please read up on how attention-based sequence-to-sequence models (such as Tacotron) work. In a way, the encoder translates the text into an intermediate representation and the decoder translates it back out as audio frames; if the decoder exceeds the maximum number of decoder steps before it produces a stop signal, generation is cut off and you get this warning.
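
If splitting the text is not an option, you can also raise the limit itself. Depending on your TTS version, `max_decoder_steps` is either a key in the model's `config.json` or a constant on the decoder class, so adjust the path and key to match your checkout; the sketch below assumes the `config.json` case. Note that this only gives the decoder more frames to work with, it will not fix an attention alignment that never reaches the end of the text.

```python
import json

CONFIG_PATH = "config.json"          # assumption: path to your model config

with open(CONFIG_PATH) as f:
    config = json.load(f)

# Many Tacotron configs default to a few hundred or a few thousand steps;
# raise it to give long sentences more room before the decoder gives up.
config["max_decoder_steps"] = 2000

with open(CONFIG_PATH, "w") as f:
    json.dump(config, f, indent=4)
```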