keras: memory leak when using tensorflow
Hello.
When using the TensorFlow backend, all ops are entered into the global TF graph. This results in memory leaks and very long compilation times when building several models, one after the other, in the same Python process (think IPython, cross-validation, etc.).
For now, I solve this on my end by doing the following:
```python
import keras.backend.tensorflow_backend
if keras.backend.tensorflow_backend._SESSION:
    import tensorflow as tf
    tf.reset_default_graph()                            # drop all ops from the default graph
    keras.backend.tensorflow_backend._SESSION.close()   # close the cached Keras session
    keras.backend.tensorflow_backend._SESSION = None    # force Keras to create a fresh one
```
Maybe we should incorporate this into a keras.reset() function?
About this issue
- State: closed
- Created 8 years ago
- Reactions: 18
- Comments: 52 (11 by maintainers)
You can now use K.clear_session() when using TensorFlow, which will clean up everything. This is recommended if you ever create models inside a loop. Without clearing, compilation time and memory usage keep going up with each model built; after clearing the default graph between iterations, that growth goes away.
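A minimal sketch of that loop pattern; the data and model below are placeholders:

```python
import numpy as np
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense

x = np.random.rand(1000, 100)
y = np.random.randint(2, size=(1000, 1))

for i in range(10):
    K.clear_session()   # drop the old default graph and session before building a new model
    model = Sequential()
    model.add(Dense(64, activation='relu', input_dim=100))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(optimizer='adam', loss='binary_crossentropy')
    model.fit(x, y, epochs=1, verbose=0)
```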
It may work.
We are now using Keras 2.1.5 and the problem still exists; it is not resolved by K.clear_session().
I’m still seeing this issue with:
- TensorFlow version: 1.13.1
- tf.keras version: 2.2.4-tf
- OS: Windows 10
- TensorFlow-GPU running on an NVIDIA GTX 1080 Ti

I’ve tried tf.keras.backend.clear_session() with no luck; I still hit RAM OOM errors eventually. I’ve also tried manually invoking garbage collection, with no luck. I should note that tf.keras.backend.clear_session() does result in a visible drop in RAM, but the next call to Model.fit(...) during looping consumes more memory than was freed by the preceding call to tf.keras.backend.clear_session(). I should also note that I am using TensorFlow datasets with one-shot iterators during training.

I haven’t been able to pinpoint why this happens, but I know the problem occurs when I call Model.fit(...) on my Keras model with the two one-shot iterators in a repeated loop. If I just initialize the one-shot iterators and don’t fit the Keras model (only compile it), then memory usage is uniform. As soon as Model.fit(...) is called with train_ds.make_one_shot_iterator() and val_ds.make_one_shot_iterator(), I slowly leak RAM despite calling tf.keras.backend.clear_session() at the beginning of the loop.

Has anyone encountered this issue while directly fitting a Keras model to TensorFlow data generators? I’m trying not to downgrade too far because of the TensorFlow generator support in the more recent releases.
I’m working on a minimal reproducible example, but my code is still a bit lengthy to post.
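For reference, here is a minimal sketch of the loop structure described above; the dataset construction, model, and step counts are placeholders, not the actual training code:

```python
import numpy as np
import tensorflow as tf

def make_dataset(n=1000):
    # placeholder data; the real pipeline is more involved
    x = np.random.rand(n, 32).astype('float32')
    y = np.random.randint(2, size=(n, 1)).astype('float32')
    return tf.data.Dataset.from_tensor_slices((x, y)).batch(32).repeat()

for run in range(20):
    tf.keras.backend.clear_session()   # frees some RAM, but not all of it
    train_ds = make_dataset()
    val_ds = make_dataset()
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation='relu', input_shape=(32,)),
        tf.keras.layers.Dense(1, activation='sigmoid'),
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy')
    # In TF 1.13, fit() accepted one-shot iterators together with step counts;
    # each pass through this loop leaks a little RAM.
    model.fit(train_ds.make_one_shot_iterator(),
              steps_per_epoch=10,
              validation_data=val_ds.make_one_shot_iterator(),
              validation_steps=5,
              epochs=1, verbose=0)
```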
I can confirm this problem with Keras 2.2.2 and TensorFlow 1.8.
I downgraded Keras to version 2.1.6, and the problem is gone.
Here is a pattern I adopted when fighting OOM that in retrospect may have caused OOM on its own:
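Roughly, the pattern looked like this; build_model() is a placeholder and this is only a sketch of the ordering, not the exact code:

```python
import gc
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense

def build_model():
    # placeholder model
    m = Sequential([Dense(1, input_dim=10, activation='sigmoid')])
    m.compile(optimizer='adam', loss='binary_crossentropy')
    return m

model = build_model()
# ... train / evaluate ...
del model            # the model object is gone before Keras/TF cleans up
K.clear_session()    # may now be missing information it needs to clear things fully
gc.collect()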
I suspect that is why I was hitting OOM after my first del/clear_session(): deleting the model may deprive TF of info it needs to clear the session properly.
Now I am not reloading the model anyway, and the original OOM seems to be gone, maybe due to newer versions of everything. I'm not testing whether 'del model' before clear_session() caused the latest memory leak, because it takes a while, but I recommend anyone using that sort of pattern try deleting things after the clear_session():
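With the same placeholder build_model() as in the sketch above, the reordered cleanup looks like this:

```python
import gc
from keras import backend as K

model = build_model()
# ... train / evaluate ...
K.clear_session()    # tear down the graph and session while the model object still exists
del model            # only then drop the Python reference
gc.collect()
```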
Beware of adoption becoming maladaptation. 😃
I think Keras uses the default session (probably), so I have set the session manually and then called K.clear_session(), which is working fine, as below.
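A minimal sketch of that idea; the model and the loop body are placeholders:

```python
import tensorflow as tf
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense

for i in range(10):
    sess = tf.Session()
    K.set_session(sess)      # make this the session Keras uses instead of the implicit default
    model = Sequential([Dense(1, input_dim=10, activation='sigmoid')])
    model.compile(optimizer='adam', loss='binary_crossentropy')
    # ... fit / predict ...
    K.clear_session()        # destroys the current graph and closes the session
```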
Hi,
Try:

```python
from keras import backend as be
# (…)
be.clear_session()
```
I run into OOM exceptions while using KerasClassifier to sweep large hyperparameter grids with TF backend. No problems with Theano.
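For context, a hypothetical sketch of the kind of sweep that hits this; the model, data, and grid are placeholders:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

def build_model(units=32):
    m = Sequential()
    m.add(Dense(units, activation='relu', input_dim=20))
    m.add(Dense(1, activation='sigmoid'))
    m.compile(optimizer='adam', loss='binary_crossentropy')
    return m

x = np.random.rand(200, 20)
y = np.random.randint(2, size=200)

clf = KerasClassifier(build_fn=build_model, epochs=2, verbose=0)
grid = GridSearchCV(clf, param_grid={'units': [16, 32, 64]}, cv=3)
grid.fit(x, y)   # every fold/candidate adds ops to the same TF graph unless it is cleared
```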
Not exactly sure why this issue has been closed.
What can be done to mitigate the growing loading time when calling load_model sequentially? E.g., having ten different models that need to be loaded in memory at the same time means that using clear_session() is not an option here.

We got the same problem in a loop for a sklearn k-fold experiment. No problem switching to Theano.
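On the sequential load_model question above, a hypothetical sketch of the scenario; the file names are placeholders:

```python
from keras.models import load_model

paths = ['model_%d.h5' % i for i in range(10)]

models = []
for p in paths:
    # Every load adds ops to the same default graph, so successive loads get slower.
    # Calling clear_session() between loads would invalidate the models already in memory.
    models.append(load_model(p))
```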