Merge multiple hdf5 caches

I ran into problems creating one large hdf5 cache from all the training data, so I created multiple small hdf5 files instead. The problem now is that the DeepSpeech.py script accepts only a single path for the hdf5 cache. Is there an easy way to merge multiple hdf5 cache files so they can be used for training?
Thanks

You can use pandas.concat to append the DataFrames, but if the problems you ran into were due to running out of memory, that will probably just lead to the same problems again. The best/easiest way to handle this is probably to wrap the Dataset.data attribute in a generator that loads the different HDF5 files lazily, but it’s not a trivial change. If you get it working, please send us a PR!
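A minimal sketch of the pandas.concat approach, assuming each cache file stores a single pandas DataFrame (the HDF5 key name `"data"` here is a hypothetical placeholder; check the key your caches were actually written under). The second helper sketches the lazy-generator idea, yielding rows one file at a time so only one cache is in memory at once:

```python
import pandas as pd

def merge_hdf5_caches(paths, out_path, key="data"):
    # Read each cache and concatenate into one DataFrame.
    # Note: this still materializes everything in memory at once,
    # so it only helps if the original failure wasn't memory-related.
    frames = [pd.read_hdf(p, key=key) for p in paths]
    merged = pd.concat(frames, ignore_index=True)
    merged.to_hdf(out_path, key=key, mode="w")
    return merged

def iter_cached_rows(paths, key="data"):
    # Lazy alternative: yield rows from one HDF5 file at a time,
    # so peak memory is bounded by the largest single cache file.
    for p in paths:
        df = pd.read_hdf(p, key=key)
        for row in df.itertuples(index=False):
            yield row

# Demonstration with in-memory DataFrames standing in for loaded caches:
a = pd.DataFrame({"transcript": ["hello"], "features_len": [10]})
b = pd.DataFrame({"transcript": ["world"], "features_len": [12]})
merged = pd.concat([a, b], ignore_index=True)
print(len(merged))  # 2
```

Wrapping the generator around Dataset.data, as suggested above, would avoid ever building the merged frame, but requires changes to how the training loop consumes the data.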