Inference hangs when running code in a Process

(Kirill) #1

I wanted to try out how inference works when I run it in a process. My goal is to split a big file into chunks and then run inference for the chunks in different processes. When I started testing this I noticed that model inference hangs when the code runs in a process. I modified the code to reproduce the issue. First I check that the model can run inference in the main process, and then I run the same call in a child process:

# Excerpt from the main function of the native_client/python/ file;
# ds, audio, fs and audio_length are set up earlier in the function.
import sys
from multiprocessing import Process
from timeit import default_timer as timer

# Sanity check: inference works fine in the main process.
print(ds.stt(audio, fs))

print('Running inference.', file=sys.stderr)
inference_start = timer()

def func(model, audio, fs, res):
    # Note: appending to a plain list in a child process does not
    # propagate the result back to the parent.
    res.append(model.stt(audio, fs))

res = []
p = Process(target=func, args=(ds, audio, fs, res))
p.start()   # model.stt() inside the child never returns
p.join()

inference_end = timer() - inference_start
print('Inference took %0.3fs for %0.3fs audio file.' % (inference_end, audio_length), file=sys.stderr)

I’m using version 0.3.0. It would be great if you could explain this behavior.

(Lissyx) #2

This is code that we do run without any issue, so it would be great if you could give more detail about your context: the audio files / amount of data, what you mean by “hangs”, etc.

(Kirill) #3

Thanks for the reply. Here is a full gist of the main function of the native_client/python/ file and the console output:

The model.stt(…) method never returns; it’s stuck.

(Lissyx) #4

Sharing the model across processes, I’m not so sure what could happen. Also, you need to document how much data you have to process …

(Kirill) #5

I might have misunderstood you.
I wanted to run on the LibriSpeech test-clean set and see how it behaves when the model is shared between processes. The idea is to run inference in many processes in the backend.

(Lissyx) #6

Yeah, but sharing memory between processes will not work the same way as sharing it between threads.
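
For what it’s worth, a pattern that avoids sharing the model across processes is to load a separate Model instance inside each worker. Below is a minimal sketch, not the DeepSpeech-recommended code: load_model() and read_wav() are hypothetical helpers standing in for however you build the deepspeech Model and decode your audio.

from multiprocessing import Pool

_model = None

def init_worker():
    # Each worker process loads its own model; nothing is shared with the parent.
    global _model
    _model = load_model()  # hypothetical helper wrapping deepspeech.Model(...)

def transcribe(path):
    audio, fs = read_wav(path)  # hypothetical helper returning (samples, sample_rate)
    return _model.stt(audio, fs)

if __name__ == '__main__':
    chunks = ['chunk-000.wav', 'chunk-001.wav', 'chunk-002.wav']
    with Pool(processes=2, initializer=init_worker) as pool:
        results = pool.map(transcribe, chunks)
    print(results)

Loading the model once per worker costs extra memory and startup time, but it sidesteps the question of what happens to the native model state across a fork.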

(Kirill) #7

Agreed. Thanks for the help!
Do you think there will be any problems if I share the model between threads?

(Lissyx) #8

Nope, this is something we do successfully, so as long as your locking scheme is okay, it should work well. Still, threads in Python can be tricky sometimes :).

But we’ve had success with them in at least the Rust and Java codebases.
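
For reference, here is a minimal Python sketch of the threaded approach under the locking scheme mentioned above: a single shared model guarded by a lock so only one thread runs inference at a time. load_model() and read_wav() are hypothetical helpers, not part of the DeepSpeech API.

import threading
from queue import Queue

model = load_model()           # hypothetical helper wrapping deepspeech.Model(...)
model_lock = threading.Lock()  # serialize access to the shared model
results = Queue()

def worker(path):
    audio, fs = read_wav(path)  # hypothetical helper returning (samples, sample_rate)
    with model_lock:
        results.put((path, model.stt(audio, fs)))

threads = [threading.Thread(target=worker, args=(p,))
           for p in ['chunk-000.wav', 'chunk-001.wav']]
for t in threads:
    t.start()
for t in threads:
    t.join()

while not results.empty():
    print(results.get())

With the lock held around every stt() call the inferences are serialized, so this mainly helps for overlapping I/O and request handling rather than for raw inference throughput.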