Using RAM and GPU RAM

Hello, everyone. I'm trying to test the DeepSpeech pretrained models.

When I install the deepspeech-gpu library and load the model (deepspeech-0.7.1-models.pbmm), it takes all of my GPU RAM.
And when I install the deepspeech library (without GPU usage) and load the same model, it takes only about 100 MB of RAM on my computer.

Can anyone explain why it takes less RAM without the GPU? Is it super optimized? I don't understand why it takes only 100 MB of RAM when the model itself is about 180 MB.

TensorFlow by default allocates all of the available GPU memory as a single block and manages internal allocations itself, which is why the GPU build appears to consume everything. On the CPU side, the .pbmm format is memory-mapped, so the operating system pages the model in on demand instead of loading the whole file into RAM at once.
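
For illustration, here is how that default allocation behavior can be switched to incremental growth in a plain TensorFlow 2.x program (a minimal sketch; note that deepspeech-gpu bundles its own TensorFlow runtime, so this snippet only demonstrates the allocator behavior, not a deepspeech setting):

```python
import tensorflow as tf

# By default TensorFlow grabs (nearly) all GPU memory up front.
# Enabling memory growth makes it allocate incrementally instead.
for gpu in tf.config.list_physical_devices('GPU'):
    tf.config.experimental.set_memory_growth(gpu, True)
```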

If all you want to do is inference and not train any models, you’ll be fine with the non-GPU version.
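
A minimal CPU-only inference sketch with the deepspeech Python package (the WAV path is hypothetical; the file is assumed to be 16 kHz, 16-bit mono, as the 0.7.x models expect):

```python
import wave
import numpy as np
from deepspeech import Model

# Load the memory-mapped acoustic model.
ds = Model('deepspeech-0.7.1-models.pbmm')

# Read a 16 kHz, 16-bit mono WAV file (hypothetical path).
with wave.open('audio/sample.wav', 'rb') as wav:
    frames = wav.readframes(wav.getnframes())
audio = np.frombuffer(frames, dtype=np.int16)

# Run speech-to-text on the CPU build.
print(ds.stt(audio))
```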

Thank you for answering)

Thanks, I understand that.
I just want to understand why DeepSpeech takes up such a small part of memory) It's just my curiosity :innocent: