tts-server size & multithreading

Two questions. I plan to use tts-server professionally (many users, many servers).

Q1: Is it multithreaded?
Q2: Is it supposed to be this big? 1.3 to 1.6 gigabytes resident set size on Linux. And that’s for one voice… Quite hefty, even for a server.

Greetings

The server provided in the Mozilla-TTS repository is not intended for production use and, to my knowledge, is not multithreaded.

Regarding Q2 - which version are you referring to?

But Mozilla TTS itself is multithreaded?

I am using the version from pip install TTS.

Thanks

It seems to me it shouldn’t take much to make tts-server production-ready. It’s a really short Python script that pretty much just connects the TTS model to Flask…
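For reference, the overall shape of such a script is roughly the following. This is a minimal sketch, not the actual tts-server code: the `synthesize` function here is a hypothetical stand-in for the model call, and the route name is illustrative.

```python
# Minimal sketch of a Flask front-end for a TTS engine.
# `synthesize` is a placeholder; the real server would load a model
# once at startup and run inference here, returning WAV audio bytes.
from flask import Flask, Response, request

app = Flask(__name__)

def synthesize(text: str) -> bytes:
    # Placeholder for model inference (would return WAV audio).
    return text.encode("utf-8")

@app.route("/api/tts")
def tts():
    text = request.args.get("text", "")
    audio = synthesize(text)
    return Response(audio, mimetype="audio/wav")

if __name__ == "__main__":
    # Flask's built-in development server is not production-grade;
    # a real deployment would sit behind a WSGI server.
    app.run(host="0.0.0.0", port=5002)
```

The heavy part is not the web glue but the model held in memory, which is where the 1+ GB resident size comes from.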

Depends on your definition of “multithreaded”. There was some support for training on multiple GPUs, but not for parallel threads during inference.

The TTS package from pypi.org is provided by Coqui.ai - a fork/successor of Mozilla-TTS. You may have a look into their discussions forum: https://github.com/coqui-ai/TTS/discussions

Hmm, no parallel inference? That’s too bad and might make it hard to use in practice… When I create a 1+ GB process for TTS, I would sure hope to use it from multiple threads.
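Since a Flask app can be run behind any WSGI server, one way to get parallelism without thread-safe inference is multiple worker processes. A sketch using gunicorn; the `TTS.server.server:app` module path is an assumption about where the Flask app object lives, so check the package you actually installed:

```shell
# Hypothetical: run the TTS Flask app behind gunicorn with 2 worker
# processes. Each worker loads its own model copy, so at ~1.5 GB per
# model the total resident memory roughly doubles.
gunicorn -w 2 -b 0.0.0.0:5002 "TTS.server.server:app"
```

This trades memory for concurrency: n workers give n parallel inferences but n model copies.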

OK, I’ll check Coqui.ai’s forum. Thanks