Inject noise into training data

--audio_aug_mix_noise_walk_dirs <directory1-contains-wav-files>,<directory2-contains-wav-files>

This flag is no longer available.

I don’t think we ever had that. Can you be more specific on “no longer available”?

It used to be part of PR 2622, which is not merged. So yes, it was never available in the first place.

Yes, I learned about it from the source above.
Now I want to inject background noise into the training dataset. How can I do that?
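Since the flag is not available, one way to get the same effect is to pre-mix noise into the WAV files offline before training. Below is a minimal sketch, not part of the project; the function name, file paths, and the assumption of 16-bit mono PCM files at a matching sample rate are all hypothetical:

```python
import wave
import numpy as np

def mix_noise(speech_path, noise_path, out_path, snr_db=10.0):
    """Mix a background-noise WAV into a speech WAV at a target SNR (dB).

    Assumes both files are 16-bit mono PCM at the same sample rate.
    """
    def read_wav(path):
        with wave.open(path, "rb") as w:
            frames = w.readframes(w.getnframes())
            samples = np.frombuffer(frames, dtype=np.int16).astype(np.float64)
            return samples, w.getparams()

    speech, params = read_wav(speech_path)
    noise, _ = read_wav(noise_path)

    # Tile or trim the noise so it covers the full speech clip.
    if len(noise) < len(speech):
        noise = np.tile(noise, len(speech) // len(noise) + 1)
    noise = noise[:len(speech)]

    # Scale the noise so the speech-to-noise power ratio equals snr_db.
    speech_power = np.mean(speech ** 2)
    noise_power = np.mean(noise ** 2)
    scale = np.sqrt(speech_power / (noise_power * 10 ** (snr_db / 10)))
    mixed = speech + scale * noise

    # Clip back into the int16 range and write the augmented clip out.
    mixed = np.clip(mixed, -32768, 32767).astype(np.int16)
    with wave.open(out_path, "wb") as w:
        w.setparams(params)
        w.writeframes(mixed.tobytes())
```

Running this over the training set (once per clip, possibly with several random SNRs and noise files) produces augmented copies you can add to the training CSV alongside the clean ones.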

I have one more issue.

I get this output while running:

And it is still going on…
Why do the steps not stop at 800, as I specified above?

Moreover, the model was already trained on the libri-train-clean dataset, so why am I getting a high loss?

Please check on GitHub: this flag is broken. I had a PR, but it was not satisfactory. Feel free to work on that.

Please be precise: the loss value depends on your data; there’s no such thing as “high loss”.

So you are basing your judgment on this value (thank you, I can read), during the first epoch. Unless you clarify your expectations, my previous reply holds: this value is relative to your data, so I don’t see how you can judge it to be “high”.

I meant to say that the checkpoint is training on a dataset on which it has already been trained for 100 epochs by Mozilla, and on that same dataset it is still producing this much loss.

It would be a great help: I want to apply augmentation to the data. Can you suggest the best augmentation flags, with their values, to improve my model’s performance?

Why do you think 67.269392 is high? What are you comparing it with?

It depends.

There are far too many factors involved to give anything other than the answer “It depends”.

I get this output for every step in test.
What does “memory leak” mean?

https://github.com/underworldcode/underworld2/issues/309

https://github.com/underworldcode/underworld2/commit/e7743828a4ae438f19b9e9a4a9655825944718e6

Please check whether these are relevant.

Is this issue irrelevant and nothing to worry about?

Can we stop losing time on useless findings? Can you point to the relevant lines? What is your question about? The reported memory leak? Something else?

This is not something we can reproduce, so I don’t know…