Weekly TTS meetings

We’ve been running weekly TTS meetings for a while. I think it is better to publicize them here as well, to get better community involvement.

If you’d like to join, please add your topic to the bottom of this post in advance, together with an estimated time and your name. We’ll then go over the topics at the meeting.

To join the meeting, follow the Zoom link here. It takes place every Monday, 4:30 – 5:30 pm CEST.


Wow, the meeting notes list so many interesting topics that were discussed. I wish longer notes were available.

I see someone is working on ONNX conversion. How well does it work? I thought it wasn’t possible to convert dynamic graphs to ONNX - what’s the workaround for TTS?

What’s the conclusion for “LibTorch based inference on TTS”? Does it work? What are the difficulties?

I see a discussion about GST concatenation vs attention. Any conclusions from that?

Unfortunately the meeting time doesn’t work for me; otherwise it sounds interesting.

All of these are in progress. We’ve had initial success converting the model to both ONNX and LibTorch so far, but more work is still to come.
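
To give a rough idea of the LibTorch side, the general flow is to trace or script the model into a TorchScript archive that LibTorch can then load from C++. A minimal, untested sketch with a stand-in module (not our actual code; the real TTS models are much more involved):

    import torch
    import torch.nn as nn

    class TinyTTSStub(nn.Module):
        """Hypothetical stand-in for a TTS acoustic model, just to show the export flow."""
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(100, 64)
            self.mel_proj = nn.Linear(64, 80)  # pretend 80 mel channels

        def forward(self, text):
            return self.mel_proj(self.embed(text))

    model = TinyTTSStub().eval()
    example_text = torch.randint(0, 100, (1, 50), dtype=torch.long)

    # Trace the model and save a TorchScript archive; from C++ it can then be
    # loaded with torch::jit::load("tts_traced.pt") and run without Python.
    traced = torch.jit.trace(model, example_text)
    traced.save("tts_traced.pt")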

How did you do the ONNX conversion from the torch model, please? I didn’t find anything related to this in the Mozilla TTS repo.
What should be changed to make that happen?
Thank you

I don’t have experience with it, but maybe this helps: https://github.com/ToriML/onnx2pytorch
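
From its README, the basic usage looks something like this (untested here, and note it converts in the ONNX → PyTorch direction, not PyTorch → ONNX):

    import onnx
    from onnx2pytorch import ConvertModel

    # Load an existing ONNX graph and convert it into a PyTorch nn.Module.
    onnx_model = onnx.load("model.onnx")
    pytorch_model = ConvertModel(onnx_model)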


Big thanks for your reply,
but I am trying to convert torch to ONNX, and Mozilla TTS has its own implementation for its PyTorch models, so the conversion is not going to be straightforward. I am testing the torch.onnx.export module with Mozilla TTS Tacotron2, and it raises this error related to the forward() argument 'text_lengths':

   File "output/convert_to_onnx.py", line 75, in <module>
        torch.onnx.export(
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/onnx/__init__.py", line 225, in export
        return utils.export(model, args, f, export_params, verbose, training,
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/onnx/utils.py", line 85, in export
        _export(model, args, f, export_params, verbose, training, input_names, output_names,
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/onnx/utils.py", line 632, in _export
        _model_to_graph(model, args, verbose, input_names,
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/onnx/utils.py", line 409, in _model_to_graph
        graph, params, torch_out = _create_jit_graph(model, args,
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/onnx/utils.py", line 379, in _create_jit_graph
        graph, torch_out = _trace_and_get_graph_from_model(model, args)
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/onnx/utils.py", line 342, in _trace_and_get_graph_from_model
        torch.jit._get_trace_graph(model, args, strict=False, _force_outplace=False, _return_inputs_states=True)
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/jit/_trace.py", line 1148, in _get_trace_graph
        outs = ONNXTracedModule(f, strict, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
        result = self.forward(*input, **kwargs)
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/jit/_trace.py", line 125, in forward
        graph, out = torch._C._create_graph_by_tracing(
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/jit/_trace.py", line 116, in wrapper
        outs.append(self.inner(*trace_inputs))
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 725, in _call_impl
        result = self._slow_forward(*input, **kwargs)
      File "/home/khalil/anaconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 709, in _slow_forward
        result = self.forward(*input, **kwargs)
    TypeError: forward() missing 1 required positional argument: 'text_lengths'
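
For what it’s worth, that error means torch.onnx.export is tracing model.forward() but only one example input was supplied; every positional argument of forward() has to be passed as a tuple in the second argument. A minimal, untested sketch with a stand-in module (the real Tacotron2 signature, shapes, and dtypes may differ, so check the repo):

    import torch
    import torch.nn as nn

    class StubModel(nn.Module):
        """Hypothetical stand-in whose forward() needs text_lengths, like the error says."""
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(100, 64)
            self.proj = nn.Linear(64, 80)

        def forward(self, text, text_lengths):
            mel = self.proj(self.embed(text))
            # Trivial use of text_lengths so it stays an input of the exported
            # graph; the real model would use it for masking/attention.
            keep = (text_lengths > 0).to(mel.dtype).view(-1, 1, 1)
            return mel * keep

    model = StubModel().eval()
    dummy_text = torch.randint(0, 100, (1, 50), dtype=torch.long)
    dummy_text_lengths = torch.tensor([50], dtype=torch.long)

    # The second argument to torch.onnx.export is unpacked into model.forward(),
    # so text AND text_lengths both have to be provided here.
    torch.onnx.export(
        model,
        (dummy_text, dummy_text_lengths),
        "tacotron2.onnx",
        opset_version=12,
        input_names=["text", "text_lengths"],
        dynamic_axes={"text": {1: "seq_len"}},
    )

If the real forward() also expects mel inputs for teacher forcing, it may be easier to wrap the model’s inference path in a small nn.Module that takes only text and lengths, and export that wrapper instead.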