h5py: Memory Leak when Slicing Dataset
To whom it may concern,
I recently ran into an issue where slicing an HDF5 dataset (the way I would slice a NumPy array) kept filling my RAM until I had to kill the program.
Here are the commands I ran:

```python
import h5py

dataset = h5py.File(dataset_directory + recording_name, 'r')  # open read-only
print(dataset['3BData/Raw'][0:1000:2])  # every other element of the first 1000
```
I was basically trying to slice out every other element (here, every other element of the first 1,000), but the read never completed and it filled my entire RAM. Here are the dataset details:

```
<HDF5 dataset "Raw": shape (3224391680,), type "<u2">
```
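(For anyone hitting the same thing: a minimal workaround sketch, assuming the strided HDF5 selection itself is what blows up rather than the amount of data. It does a contiguous read and applies the step in NumPy afterwards; `dataset_directory` and `recording_name` are the same variables as in the report.)

```python
import h5py

# Workaround sketch: contiguous HDF5 read, then stride in memory.
with h5py.File(dataset_directory + recording_name, 'r') as f:
    every_other = f['3BData/Raw'][0:1000][::2]  # 500 uint16 values
print(every_other)
```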
Here are the specifications I am using:
```
$ python -c 'import h5py; print(h5py.version.info)'
h5py            2.8.0
HDF5            1.10.2
Python          3.7.1 (default, Oct 23 2018, 19:19:42)
                [GCC 7.3.0]
sys.platform    linux
sys.maxsize     9223372036854775807
numpy           1.15.3
```
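For scale: 3,224,391,680 uint16 values is roughly 6.4 GB on disk, so even a correct strided read over the whole dataset would materialize gigabytes at once. Below is a hedged sketch of a block-wise strided reader that bounds peak memory; the helper name and block size are illustrative, not part of h5py:

```python
import numpy as np

def strided_read(ds, start, stop, step, block=10_000_000):
    """Read the equivalent of ds[start:stop:step] in contiguous blocks."""
    parts = []
    for lo in range(start, stop, block):
        hi = min(lo + block, stop)
        first = lo + (start - lo) % step  # first on-stride index in this block
        parts.append(ds[first:hi][::step])  # contiguous read, stride in memory
    return np.concatenate(parts)

# e.g. every other value of the first billion samples, one ~20 MB read at a time:
# sub = strided_read(f['3BData/Raw'], 0, 1_000_000_000, 2)
```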
@epourmal Good morning! 😃 Here you go: