h5py: OSError: Unable to open file (unable to lock file, errno = 37, error message = 'No locks available')
I got an error like the one in the title. I tried setting HDF5_USE_FILE_LOCKING=FALSE in my .bashrc file, but it doesn't work. What should I do? My code is the following:
import h5py
h5py.File('X.h5','r')
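A minimal sketch of the environment-variable workaround discussed below, done from inside Python rather than .bashrc. The key assumption is that the variable must be set before h5py first touches the HDF5 C library, so it goes at the very top of the script; the h5py lines are commented out because the file path is only a placeholder from the question.

```python
import os

# Set this before importing h5py, so the HDF5 library sees it when it
# initializes / opens files:
os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"

# import h5py                    # import only after the variable is set
# f = h5py.File('X.h5', 'r')     # placeholder path from the question
```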
About this issue
- State: open
- Created 6 years ago
- Comments: 40 (7 by maintainers)
This worked for me. I set the environment variable HDF5_USE_FILE_LOCKING to 'FALSE'.
Added the following line to the .bashrc file and saved it: export HDF5_USE_FILE_LOCKING='FALSE'
$ cd ~
$ nano .bashrc   (write in the .bashrc file => export HDF5_USE_FILE_LOCKING='FALSE')
$ source .bashrc

Has anyone found a solution to this? Initially HDF5_USE_FILE_LOCKING='FALSE' worked, but for some reason the error has returned.
This worked for me. Thanks.
Hi, in case anyone else faces this same issue in the future, it can also be solved by adding locking=False as a parameter to the h5py.File function, i.e. h5py.File('X.h5','r',locking=False)

Repeating the message from @anjandash, and it worked for me for old (2.10.0) and new (3.6) versions of h5py: I set the environment variable HDF5_USE_FILE_LOCKING to 'FALSE'.
Added the following line to the .bashrc file and saved it: export HDF5_USE_FILE_LOCKING='FALSE'
$ cd ~
$ nano .bashrc   (write in the .bashrc file => export HDF5_USE_FILE_LOCKING='FALSE')
$ source .bashrc
If the user is using bash, then add export HDF5_USE_FILE_LOCKING='FALSE' to the .bashrc. If the user is using the zsh shell instead, they should add export HDF5_USE_FILE_LOCKING='FALSE' to their .zshrc file.

I recently updated numpy, matplotlib, astropy and h5py to the latest releases as of 1.03.24. Before this update, I had the locking issue and was using os.environ["HDF5_USE_FILE_LOCKING"]="FALSE" or export HDF5_USE_FILE_LOCKING='FALSE' to solve it, but after updating it returned, and it was NOT resolved by the export HDF5_USE_FILE_LOCKING='FALSE' flag. Just adding the flag locking=False to the h5py.File() function as per the comment above solved it. I use macOS 14.3.1, VS Code 1.86.2, h5py.version.hdf5_version: 1.14.2, h5py.version.version: 3.10.0.
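The two fixes in the comments above can be combined into one helper: try the locking=False keyword first, and fall back to the environment variable for older h5py builds whose File() does not accept it. This is a hedged sketch, not h5py's own API; the helper name and the injectable h5py_file parameter (used here so the logic is testable without h5py installed) are illustrative.

```python
import os


def open_h5_readonly(path, h5py_file=None):
    """Open an HDF5 file read-only with file locking disabled.

    Tries the `locking=False` keyword first; if this h5py/HDF5 build
    does not accept it (raising TypeError), falls back to setting
    HDF5_USE_FILE_LOCKING. Note the fallback may arrive too late if
    HDF5 has already initialized its locking behavior in this process.
    `h5py_file` is injectable for testing; it defaults to h5py.File.
    """
    if h5py_file is None:
        import h5py  # deferred so the helper is importable without h5py
        h5py_file = h5py.File
    try:
        return h5py_file(path, 'r', locking=False)
    except TypeError:
        # Older h5py: File() has no `locking` kwarg; use the env var.
        os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"
        return h5py_file(path, 'r')
```

Usage would then be open_h5_readonly('X.h5') in place of the plain h5py.File call from the question.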
TL;DR: if setting export HDF5_USE_FILE_LOCKING='FALSE' does not work, use h5py.File('X.h5','r',locking=False).

The various Linux file systems (ext4, etc.) have been tested intensively several times a year since distro inceptions more than 20 years ago. My own experience has been error-free until I started using h5py. I routinely use HDF5_USE_FILE_LOCKING='FALSE' without an issue. But the Python packages lack the focus of QA teams, so once in a while, after I've mucked with the packages (tempting fate, perhaps), I've seen the symptom reported by this issue. Then I always follow this one-time procedure before resuming testing Python with h5 files:
That works every time to restore normalcy for me.
My testing environment:
I have the same issue in a Keras program where one Keras callback writes the h5 file and a following one reads it. Keras callbacks run serially. The issue is more prevalent on slow filesystems (e.g. Docker overlay filesystems), which leads me to believe it is due to a race condition in Linux filesystem lock release (i.e. the file is flushed and written, but the lock is not released before the process exits). I have also noted a similar race condition between writing a file and Linux setting its permissions to executable (again, an executable file is written, but on overlay filesystems its permissions are not made executable quickly enough for the next process to run it). In both cases, the solution was to manually poll the status of the file before attempting to use it.
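The polling workaround described above can be sketched like this: wait until the file exists and its size has stopped changing before the reader opens it. The function name, defaults, and the size-stability heuristic are all illustrative assumptions, not anything from h5py or Keras.

```python
import os
import time


def wait_for_stable_file(path, timeout=10.0, interval=0.1, settle_checks=2):
    """Poll until `path` exists and its size stops changing.

    Returns True once the size has been unchanged for `settle_checks`
    consecutive polls, or False if `timeout` seconds elapse first. A
    crude guard against reading a file the writer has not fully
    flushed; it cannot prove the writer's lock has been released.
    """
    deadline = time.monotonic() + timeout
    last_size = -1
    stable = 0
    while time.monotonic() < deadline:
        if os.path.exists(path):
            size = os.path.getsize(path)
            if size == last_size:
                stable += 1
                if stable >= settle_checks:
                    return True
            else:
                stable = 0
                last_size = size
        time.sleep(interval)
    return False
```

The reading callback would call wait_for_stable_file('model.h5') (path hypothetical) before handing the file to h5py.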
If the Linux FS is so broken, I think library developers need to manage the locks manually by releasing them explicitly rather than relying on the OS to release them during process cleanup.