h5py: OSError: Unable to open file (unable to lock file, errno = 37, error message = 'No locks available')

I got the error in the title. I tried setting HDF5_USE_FILE_LOCKING=FALSE in my .bashrc file, but it doesn't work. What should I do? My code is the following:

import h5py
h5py.File('X.h5','r')

About this issue

  • State: open
  • Created 6 years ago
  • Comments: 40 (7 by maintainers)

Most upvoted comments

Has anyone found a solution to this? Initially `HDF5_USE_FILE_LOCKING='FALSE'` worked, but for some reason the error has returned.

This worked for me. I set the environment variable HDF5_USE_FILE_LOCKING to 'FALSE'.

Added the following line to the .bashrc file and saved it: export HDF5_USE_FILE_LOCKING='FALSE'

$ cd ~
$ nano .bashrc
(add to the .bashrc file: export HDF5_USE_FILE_LOCKING='FALSE')
$ source .bashrc
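
For anyone who cannot edit shell startup files (e.g. inside a notebook or container), the same fix can likely be applied from within Python. A minimal sketch, assuming the variable is set before h5py/HDF5 first touches the file:

import os

# Disable HDF5 file locking for this process only. Set it before importing
# h5py to be safe, since the HDF5 library reads this environment variable.
os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"

import h5py

with h5py.File("X.h5", "r") as f:  # 'X.h5' as in the original question
    print(list(f.keys()))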

This worked for me. Thanks.

Hi, in case anyone else faces this same issue in the future, it can also be solved by adding locking=False as a parameter to the h5py.File function, i.e. h5py.File('X.h5', 'r', locking=False).
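
For completeness, a minimal sketch of that approach; note that, as far as I know, the locking keyword only exists in h5py 3.5 and newer:

import h5py

# Open read-only without acquiring an HDF5 file lock (h5py >= 3.5, roughly).
with h5py.File('X.h5', 'r', locking=False) as f:
    print(list(f.keys()))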

Repeating the message from @anjandash (set the environment variable HDF5_USE_FILE_LOCKING to 'FALSE' by adding export HDF5_USE_FILE_LOCKING='FALSE' to your .bashrc and sourcing it): it worked for me on both an old (2.10.0) and a new (3.6) version of h5py.


If you are using bash, add export HDF5_USE_FILE_LOCKING='FALSE' to your .bashrc. If you are using zsh instead, add the same line to your .zshrc file.


I recently updated numpy, matplotlib, astropy, and h5py to the latest releases as of 1.03.24. Before this update, I had the locking issue and was using os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE" or export HDF5_USE_FILE_LOCKING='FALSE' to solve it, but after updating, the error returned and was NOT resolved by setting the HDF5_USE_FILE_LOCKING='FALSE' flag again.

Just adding the locking=False flag to the h5py.File() call, as per the comment above, solved it.

I use:

macOS 14.3.1
VS Code 1.86.2
h5py.version.hdf5_version: 1.14.2
h5py.version.version: 3.10.0

TL;DR: if setting export HDF5_USE_FILE_LOCKING='FALSE' does not work, use h5py.File('X.h5', 'r', locking=False).
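
Combining the two workarounds, a sketch that tries a normal open first and falls back to locking=False on the locking error (open_h5_readonly is a hypothetical helper, not part of h5py):

import h5py

def open_h5_readonly(path):
    # Hypothetical helper: try a normal open first, and fall back to
    # locking=False if the filesystem refuses the lock (e.g. errno 37,
    # 'No locks available', as in this issue's title).
    try:
        return h5py.File(path, 'r')
    except OSError:
        return h5py.File(path, 'r', locking=False)

f = open_h5_readonly('X.h5')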

The various Linux file systems (ext4, etc.) have been tested intensively, several times a year, since their distros' inception more than 20 years ago. My own experience with them had been error-free until I started using h5py.

I routinely use HDF5_USE_FILE_LOCKING='FALSE' without an issue. But the Python packages lack the focus of QA teams, so once in a while, after I've mucked with the packages (tempting fate, perhaps), I've seen the symptom reported in this issue. Then I always follow this one-time procedure before resuming testing Python with h5 files:

python3 -m pip uninstall -y h5py hdf5plugin
python3 -m pip cache purge
python3 -m pip install --user h5py hdf5plugin  # I keep my Python packages under $HOME/.local

That works every time to restore normalcy for me.

My testing environment:

----- Ubuntu release -----
Description:	Ubuntu 22.04.2 LTS
Release:	22.04
Codename:	jammy
----- OS kernel name/release/version -----
5.19.0-38-generic #39~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Fri Mar 17 21:16:15 UTC 2
----- Machine hardware platform -----
x86_64

I have the same issue in a Keras program where one Keras callback writes the h5 file and a following one reads it. Keras callbacks run serially. The issue is more prevalent on slow filesystems (e.g. Docker overlay filesystems), which leads me to believe it is due to a race condition in Linux filesystem lock release (i.e. the file is flushed and written, but the lock is not released before the process exits). I have also noted a similar race condition between writing a file and Linux setting its permissions to executable (again, an executable file is written, but its permissions are not made executable quickly enough on overlay filesystems for the next process to run it). In both cases, the solution was to manually poll the status of the file before attempting to use it, as in the sketch below.
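
As an illustration, a minimal sketch of that polling approach; wait_for_h5, the timeout values, and the 'model.h5' path are hypothetical, not from the original comment:

import time
import h5py

def wait_for_h5(path, timeout=30.0, interval=0.5):
    # Hypothetical helper: poll until the file can actually be opened,
    # rather than assuming the writing process has already released its lock.
    deadline = time.monotonic() + timeout
    while True:
        try:
            return h5py.File(path, 'r')
        except OSError:
            if time.monotonic() >= deadline:
                raise
            time.sleep(interval)

f = wait_for_h5('model.h5')  # hypothetical file written by the earlier callback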

If the Linux FS is really so broken, I think library developers need to manage the locks manually, releasing them explicitly rather than relying on the OS to release them during process cleanup.