Problem with GitHub commit and model config / help finding a working model and commit to test TTS

Hi! When I try to run TTS I keep getting errors like the one below.
I have tried different models and configs against different points in the git history, but the errors won't go away :frowning:

Loading TTS model …
| > model config: C:\Users\SDE-02\PycharmProjects\TTS1\config.json
| > model file: checkpoint_272976.pth.tar
Setting up Audio Processor…
| > sample_rate:22050
| > num_mels:80
| > min_level_db:-100
| > frame_shift_ms:12.5
| > frame_length_ms:50
| > ref_level_db:20
| > num_freq:1025
| > power:1.5
| > preemphasis:0.98
| > griffin_lim_iters:60
| > signal_norm:True
| > symmetric_norm:False
| > mel_fmin:0
| > mel_fmax:8000.0
| > max_norm:1.0
| > clip_norm:True
| > do_trim_silence:True
| > n_fft:2048
| > hop_length:275
| > win_length:1102
Using model: Tacotron2
Traceback (most recent call last):
File "C:/Users/SDE-02/PycharmProjects/TTS1/server/test1.py", line 5, in <module>
synthesizer = Synthesizer(config)
File "C:\Users\SDE-02\PycharmProjects\TTS1\server\synthesizer.py", line 29, in __init__
self.load_tts(self.config.tts_path, self.config.tts_file, self.config.tts_config, config.use_cuda)
File "C:\Users\SDE-02\PycharmProjects\TTS1\server\synthesizer.py", line 61, in load_tts
self.tts_model.load_state_dict(cp['model'])
File "C:\Users\SDE-02\AppData\Local\Programs\Python\Python36\lib\site-packages\torch\nn\modules\module.py", line 845, in load_state_dict
self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for Tacotron2:
Missing key(s) in state_dict: “encoder.convolutions.0.net.0.weight”, “encoder.convolutions.0.net.0.bias”, “encoder.convolutions.0.net.1.weight”, “encoder.convolutions.0.net.1.bias”, “encoder.convolutions.0.net.1.running_mean”, “encoder.convolutions.0.net.1.running_var”, “encoder.convolutions.1.net.0.weight”, “encoder.convolutions.1.net.0.bias”, “encoder.convolutions.1.net.1.weight”, “encoder.convolutions.1.net.1.bias”, “encoder.convolutions.1.net.1.running_mean”, “encoder.convolutions.1.net.1.running_var”, “encoder.convolutions.2.net.0.weight”, “encoder.convolutions.2.net.0.bias”, “encoder.convolutions.2.net.1.weight”, “encoder.convolutions.2.net.1.bias”, “encoder.convolutions.2.net.1.running_mean”, “encoder.convolutions.2.net.1.running_var”, “encoder.lstm.weight_ih_l0”, “encoder.lstm.weight_hh_l0”, “encoder.lstm.bias_ih_l0”, “encoder.lstm.bias_hh_l0”, “encoder.lstm.weight_ih_l0_reverse”, “encoder.lstm.weight_hh_l0_reverse”, “encoder.lstm.bias_ih_l0_reverse”, “encoder.lstm.bias_hh_l0_reverse”, “decoder.prenet.layers.0.linear_layer.weight”, “decoder.prenet.layers.1.linear_layer.weight”, “decoder.attention_rnn.weight_ih”, “decoder.attention_rnn.weight_hh”, “decoder.attention_rnn.bias_ih”, “decoder.attention_rnn.bias_hh”, “decoder.attention_layer.query_layer.linear_layer.weight”, “decoder.attention_layer.inputs_layer.linear_layer.weight”, “decoder.attention_layer.v.linear_layer.weight”, “decoder.attention_layer.v.linear_layer.bias”, “decoder.decoder_rnn.weight_ih”, “decoder.decoder_rnn.weight_hh”, “decoder.decoder_rnn.bias_ih”, “decoder.decoder_rnn.bias_hh”, “decoder.linear_projection.linear_layer.weight”, “decoder.linear_projection.linear_layer.bias”, “decoder.stopnet.1.linear_layer.weight”, “decoder.stopnet.1.linear_layer.bias”, “decoder.attention_rnn_init.weight”, “decoder.go_frame_init.weight”, “decoder.decoder_rnn_inits.weight”, “postnet.convolutions.0.net.0.weight”, “postnet.convolutions.0.net.0.bias”, “postnet.convolutions.0.net.1.weight”, 
“postnet.convolutions.0.net.1.bias”, “postnet.convolutions.0.net.1.running_mean”, “postnet.convolutions.0.net.1.running_var”, “postnet.convolutions.1.net.0.weight”, “postnet.convolutions.1.net.0.bias”, “postnet.convolutions.1.net.1.weight”, “postnet.convolutions.1.net.1.bias”, “postnet.convolutions.1.net.1.running_mean”, “postnet.convolutions.1.net.1.running_var”, “postnet.convolutions.2.net.0.weight”, “postnet.convolutions.2.net.0.bias”, “postnet.convolutions.2.net.1.weight”, “postnet.convolutions.2.net.1.bias”, “postnet.convolutions.2.net.1.running_mean”, “postnet.convolutions.2.net.1.running_var”, “postnet.convolutions.3.net.0.weight”, “postnet.convolutions.3.net.0.bias”, “postnet.convolutions.3.net.1.weight”, “postnet.convolutions.3.net.1.bias”, “postnet.convolutions.3.net.1.running_mean”, “postnet.convolutions.3.net.1.running_var”, “postnet.convolutions.4.net.0.weight”, “postnet.convolutions.4.net.0.bias”, “postnet.convolutions.4.net.1.weight”, “postnet.convolutions.4.net.1.bias”, “postnet.convolutions.4.net.1.running_mean”, “postnet.convolutions.4.net.1.running_var”.
Unexpected key(s) in state_dict: “last_linear.weight”, “last_linear.bias”, “encoder.prenet.layers.0.weight”, “encoder.prenet.layers.0.bias”, “encoder.prenet.layers.1.weight”, “encoder.prenet.layers.1.bias”, “encoder.cbhg.conv1d_banks.0.conv1d.weight”, “encoder.cbhg.conv1d_banks.0.bn.weight”, “encoder.cbhg.conv1d_banks.0.bn.bias”, “encoder.cbhg.conv1d_banks.0.bn.running_mean”, “encoder.cbhg.conv1d_banks.0.bn.running_var”, “encoder.cbhg.conv1d_banks.1.conv1d.weight”, “encoder.cbhg.conv1d_banks.1.bn.weight”, “encoder.cbhg.conv1d_banks.1.bn.bias”, “encoder.cbhg.conv1d_banks.1.bn.running_mean”, “encoder.cbhg.conv1d_banks.1.bn.running_var”, “encoder.cbhg.conv1d_banks.2.conv1d.weight”, “encoder.cbhg.conv1d_banks.2.bn.weight”, “encoder.cbhg.conv1d_banks.2.bn.bias”, “encoder.cbhg.conv1d_banks.2.bn.running_mean”, “encoder.cbhg.conv1d_banks.2.bn.running_var”, “encoder.cbhg.conv1d_banks.3.conv1d.weight”, “encoder.cbhg.conv1d_banks.3.bn.weight”, “encoder.cbhg.conv1d_banks.3.bn.bias”, “encoder.cbhg.conv1d_banks.3.bn.running_mean”, “encoder.cbhg.conv1d_banks.3.bn.running_var”, “encoder.cbhg.conv1d_banks.4.conv1d.weight”, “encoder.cbhg.conv1d_banks.4.bn.weight”, “encoder.cbhg.conv1d_banks.4.bn.bias”, “encoder.cbhg.conv1d_banks.4.bn.running_mean”, “encoder.cbhg.conv1d_banks.4.bn.running_var”, “encoder.cbhg.conv1d_banks.5.conv1d.weight”, “encoder.cbhg.conv1d_banks.5.bn.weight”, “encoder.cbhg.conv1d_banks.5.bn.bias”, “encoder.cbhg.conv1d_banks.5.bn.running_mean”, “encoder.cbhg.conv1d_banks.5.bn.running_var”, “encoder.cbhg.conv1d_banks.6.conv1d.weight”, “encoder.cbhg.conv1d_banks.6.bn.weight”, “encoder.cbhg.conv1d_banks.6.bn.bias”, “encoder.cbhg.conv1d_banks.6.bn.running_mean”, “encoder.cbhg.conv1d_banks.6.bn.running_var”, “encoder.cbhg.conv1d_banks.7.conv1d.weight”, “encoder.cbhg.conv1d_banks.7.bn.weight”, “encoder.cbhg.conv1d_banks.7.bn.bias”, “encoder.cbhg.conv1d_banks.7.bn.running_mean”, “encoder.cbhg.conv1d_banks.7.bn.running_var”, “encoder.cbhg.conv1d_banks.8.conv1d.weight”, 
“encoder.cbhg.conv1d_banks.8.bn.weight”, “encoder.cbhg.conv1d_banks.8.bn.bias”, “encoder.cbhg.conv1d_banks.8.bn.running_mean”, “encoder.cbhg.conv1d_banks.8.bn.running_var”, “encoder.cbhg.conv1d_banks.9.conv1d.weight”, “encoder.cbhg.conv1d_banks.9.bn.weight”, “encoder.cbhg.conv1d_banks.9.bn.bias”, “encoder.cbhg.conv1d_banks.9.bn.running_mean”, “encoder.cbhg.conv1d_banks.9.bn.running_var”, “encoder.cbhg.conv1d_banks.10.conv1d.weight”, “encoder.cbhg.conv1d_banks.10.bn.weight”, “encoder.cbhg.conv1d_banks.10.bn.bias”, “encoder.cbhg.conv1d_banks.10.bn.running_mean”, “encoder.cbhg.conv1d_banks.10.bn.running_var”, “encoder.cbhg.conv1d_banks.11.conv1d.weight”, “encoder.cbhg.conv1d_banks.11.bn.weight”, “encoder.cbhg.conv1d_banks.11.bn.bias”, “encoder.cbhg.conv1d_banks.11.bn.running_mean”, “encoder.cbhg.conv1d_banks.11.bn.running_var”, “encoder.cbhg.conv1d_banks.12.conv1d.weight”, “encoder.cbhg.conv1d_banks.12.bn.weight”, “encoder.cbhg.conv1d_banks.12.bn.bias”, “encoder.cbhg.conv1d_banks.12.bn.running_mean”, “encoder.cbhg.conv1d_banks.12.bn.running_var”, “encoder.cbhg.conv1d_banks.13.conv1d.weight”, “encoder.cbhg.conv1d_banks.13.bn.weight”, “encoder.cbhg.conv1d_banks.13.bn.bias”, “encoder.cbhg.conv1d_banks.13.bn.running_mean”, “encoder.cbhg.conv1d_banks.13.bn.running_var”, “encoder.cbhg.conv1d_banks.14.conv1d.weight”, “encoder.cbhg.conv1d_banks.14.bn.weight”, “encoder.cbhg.conv1d_banks.14.bn.bias”, “encoder.cbhg.conv1d_banks.14.bn.running_mean”, “encoder.cbhg.conv1d_banks.14.bn.running_var”, “encoder.cbhg.conv1d_banks.15.conv1d.weight”, “encoder.cbhg.conv1d_banks.15.bn.weight”, “encoder.cbhg.conv1d_banks.15.bn.bias”, “encoder.cbhg.conv1d_banks.15.bn.running_mean”, “encoder.cbhg.conv1d_banks.15.bn.running_var”, “encoder.cbhg.conv1d_projections.0.conv1d.weight”, “encoder.cbhg.conv1d_projections.0.bn.weight”, “encoder.cbhg.conv1d_projections.0.bn.bias”, “encoder.cbhg.conv1d_projections.0.bn.running_mean”, “encoder.cbhg.conv1d_projections.0.bn.running_var”, 
“encoder.cbhg.conv1d_projections.1.conv1d.weight”, “encoder.cbhg.conv1d_projections.1.bn.weight”, “encoder.cbhg.conv1d_projections.1.bn.bias”, “encoder.cbhg.conv1d_projections.1.bn.running_mean”, “encoder.cbhg.conv1d_projections.1.bn.running_var”, “encoder.cbhg.pre_highway.weight”, “encoder.cbhg.highways.0.H.weight”, “encoder.cbhg.highways.0.H.bias”, “encoder.cbhg.highways.0.T.weight”, “encoder.cbhg.highways.0.T.bias”, “encoder.cbhg.highways.1.H.weight”, “encoder.cbhg.highways.1.H.bias”, “encoder.cbhg.highways.1.T.weight”, “encoder.cbhg.highways.1.T.bias”, “encoder.cbhg.highways.2.H.weight”, “encoder.cbhg.highways.2.H.bias”, “encoder.cbhg.highways.2.T.weight”, “encoder.cbhg.highways.2.T.bias”, “encoder.cbhg.highways.3.H.weight”, “encoder.cbhg.highways.3.H.bias”, “encoder.cbhg.highways.3.T.weight”, “encoder.cbhg.highways.3.T.bias”, “encoder.cbhg.gru.weight_ih_l0”, “encoder.cbhg.gru.weight_hh_l0”, “encoder.cbhg.gru.bias_ih_l0”, “encoder.cbhg.gru.bias_hh_l0”, “encoder.cbhg.gru.weight_ih_l0_reverse”, “encoder.cbhg.gru.weight_hh_l0_reverse”, “encoder.cbhg.gru.bias_ih_l0_reverse”, “encoder.cbhg.gru.bias_hh_l0_reverse”, “decoder.project_to_decoder_in.weight”, “decoder.project_to_decoder_in.bias”, “decoder.decoder_rnns.0.weight_ih”, “decoder.decoder_rnns.0.weight_hh”, “decoder.decoder_rnns.0.bias_ih”, “decoder.decoder_rnns.0.bias_hh”, “decoder.decoder_rnns.1.weight_ih”, “decoder.decoder_rnns.1.weight_hh”, “decoder.decoder_rnns.1.bias_ih”, “decoder.decoder_rnns.1.bias_hh”, “decoder.proj_to_mel.weight”, “decoder.proj_to_mel.bias”, “decoder.prenet.layers.0.weight”, “decoder.prenet.layers.0.bias”, “decoder.prenet.layers.1.weight”, “decoder.prenet.layers.1.bias”, “decoder.attention_rnn.rnn_cell.weight_ih”, “decoder.attention_rnn.rnn_cell.weight_hh”, “decoder.attention_rnn.rnn_cell.bias_ih”, “decoder.attention_rnn.rnn_cell.bias_hh”, “decoder.attention_rnn.alignment_model.query_layer.weight”, “decoder.attention_rnn.alignment_model.query_layer.bias”, 
“decoder.attention_rnn.alignment_model.annot_layer.weight”, “decoder.attention_rnn.alignment_model.annot_layer.bias”, “decoder.attention_rnn.alignment_model.v.weight”, “decoder.stopnet.rnn.weight_ih”, “decoder.stopnet.rnn.weight_hh”, “decoder.stopnet.rnn.bias_ih”, “decoder.stopnet.rnn.bias_hh”, “decoder.stopnet.linear.weight”, “decoder.stopnet.linear.bias”, “postnet.conv1d_banks.0.conv1d.weight”, “postnet.conv1d_banks.0.bn.weight”, “postnet.conv1d_banks.0.bn.bias”, “postnet.conv1d_banks.0.bn.running_mean”, “postnet.conv1d_banks.0.bn.running_var”, “postnet.conv1d_banks.1.conv1d.weight”, “postnet.conv1d_banks.1.bn.weight”, “postnet.conv1d_banks.1.bn.bias”, “postnet.conv1d_banks.1.bn.running_mean”, “postnet.conv1d_banks.1.bn.running_var”, “postnet.conv1d_banks.2.conv1d.weight”, “postnet.conv1d_banks.2.bn.weight”, “postnet.conv1d_banks.2.bn.bias”, “postnet.conv1d_banks.2.bn.running_mean”, “postnet.conv1d_banks.2.bn.running_var”, “postnet.conv1d_banks.3.conv1d.weight”, “postnet.conv1d_banks.3.bn.weight”, “postnet.conv1d_banks.3.bn.bias”, “postnet.conv1d_banks.3.bn.running_mean”, “postnet.conv1d_banks.3.bn.running_var”, “postnet.conv1d_banks.4.conv1d.weight”, “postnet.conv1d_banks.4.bn.weight”, “postnet.conv1d_banks.4.bn.bias”, “postnet.conv1d_banks.4.bn.running_mean”, “postnet.conv1d_banks.4.bn.running_var”, “postnet.conv1d_banks.5.conv1d.weight”, “postnet.conv1d_banks.5.bn.weight”, “postnet.conv1d_banks.5.bn.bias”, “postnet.conv1d_banks.5.bn.running_mean”, “postnet.conv1d_banks.5.bn.running_var”, “postnet.conv1d_banks.6.conv1d.weight”, “postnet.conv1d_banks.6.bn.weight”, “postnet.conv1d_banks.6.bn.bias”, “postnet.conv1d_banks.6.bn.running_mean”, “postnet.conv1d_banks.6.bn.running_var”, “postnet.conv1d_banks.7.conv1d.weight”, “postnet.conv1d_banks.7.bn.weight”, “postnet.conv1d_banks.7.bn.bias”, “postnet.conv1d_banks.7.bn.running_mean”, “postnet.conv1d_banks.7.bn.running_var”, “postnet.conv1d_projections.0.conv1d.weight”, “postnet.conv1d_projections.0.bn.weight”, 
“postnet.conv1d_projections.0.bn.bias”, “postnet.conv1d_projections.0.bn.running_mean”, “postnet.conv1d_projections.0.bn.running_var”, “postnet.conv1d_projections.1.conv1d.weight”, “postnet.conv1d_projections.1.bn.weight”, “postnet.conv1d_projections.1.bn.bias”, “postnet.conv1d_projections.1.bn.running_mean”, “postnet.conv1d_projections.1.bn.running_var”, “postnet.pre_highway.weight”, “postnet.highways.0.H.weight”, “postnet.highways.0.H.bias”, “postnet.highways.0.T.weight”, “postnet.highways.0.T.bias”, “postnet.highways.1.H.weight”, “postnet.highways.1.H.bias”, “postnet.highways.1.T.weight”, “postnet.highways.1.T.bias”, “postnet.highways.2.H.weight”, “postnet.highways.2.H.bias”, “postnet.highways.2.T.weight”, “postnet.highways.2.T.bias”, “postnet.highways.3.H.weight”, “postnet.highways.3.H.bias”, “postnet.highways.3.T.weight”, “postnet.highways.3.T.bias”, “postnet.gru.weight_ih_l0”, “postnet.gru.weight_hh_l0”, “postnet.gru.bias_ih_l0”, “postnet.gru.bias_hh_l0”, “postnet.gru.weight_ih_l0_reverse”, “postnet.gru.weight_hh_l0_reverse”, “postnet.gru.bias_ih_l0_reverse”, “postnet.gru.bias_hh_l0_reverse”.
size mismatch for embedding.weight: copying a param with shape torch.Size([149, 256]) from checkpoint, the shape in current model is torch.Size([130, 512]).
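For reference, one way to see which architecture a checkpoint actually contains is to inspect its state_dict keys before building the model. This is a sketch, not part of the TTS codebase; the key prefixes come from the error message above (CBHG layers belong to the original Tacotron, the conv-stack encoder to Tacotron2), and the commented loading lines assume PyTorch:

```python
# Guess which architecture a checkpoint's weights belong to by looking at
# its parameter names. Prefixes are taken from the traceback above.

def detect_architecture(state_dict_keys):
    """Return 'Tacotron', 'Tacotron2', or 'unknown' from parameter names."""
    keys = set(state_dict_keys)
    if any(k.startswith("encoder.cbhg.") for k in keys):
        return "Tacotron"      # CBHG encoder -> original Tacotron
    if any(k.startswith("encoder.convolutions.") for k in keys):
        return "Tacotron2"     # convolution-stack encoder -> Tacotron2
    return "unknown"

# With a real checkpoint (uncomment; assumes PyTorch is installed):
# import torch
# cp = torch.load("checkpoint_272976.pth.tar", map_location="cpu")
# print(detect_architecture(cp["model"].keys()))

# Quick check against keys quoted in the error message:
print(detect_architecture(["encoder.cbhg.gru.weight_ih_l0",
                           "last_linear.weight"]))  # -> Tacotron
```

Running this on the checkpoint above would report the original Tacotron, while the code is instantiating Tacotron2 — hence the missing/unexpected key lists.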

Your character set and the model's are different, and the checkpoint itself belongs to a different architecture than the one your code builds (the unexpected CBHG keys are from the original Tacotron, while the code instantiates Tacotron2). You need to check out the TTS commit listed next to that model in the released-models table, and use the matching config.
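The final `size mismatch for embedding.weight` line is the character-set symptom specifically: the embedding is a `(num_chars, embedding_dim)` table, one row per character, so a different character set or embedding size changes the tensor shape and `load_state_dict` refuses the copy. A minimal illustration, using the shapes reported in the error:

```python
# Why a changed character set breaks loading: state_dict loading requires
# identical tensor shapes. Shapes below are the ones from the error message.

checkpoint_shape = (149, 256)  # embedding.weight stored in the checkpoint
model_shape = (130, 512)       # embedding.weight built by the current code

def can_load(saved, current):
    """PyTorch copies a saved parameter only if the shapes match exactly."""
    return saved == current

print(can_load(checkpoint_shape, model_shape))  # False: 149 chars vs 130
```

Once you check out the matching commit and config, both the character count and the embedding size line up and the copy succeeds.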