keras: Keras model cannot be loaded if it contains a Lambda layer calling tf.image.resize_images
My Keras model cannot be loaded if it contains a Lambda layer that calls tf.image.resize_images. The exact same model without said Lambda layer loads just fine (see code below).
The model was saved using model.save() and according to the error log, when the model is trying to load, a call to func_load() in keras/utils/generic_utils.py isn’t passed the right arguments: “TypeError: arg 4 (defaults) must be None or tuple”.
Here is the entire log:
Using TensorFlow backend.
Traceback (most recent call last):
File "drive.py", line 83, in <module>
model = load_model(args.model)
File "/Users/alex/anaconda/envs/Keras/lib/python3.5/site-packages/keras/models.py", line 142, in load_model
model = model_from_config(model_config, custom_objects=custom_objects)
File "/Users/alex/anaconda/envs/Keras/lib/python3.5/site-packages/keras/models.py", line 193, in model_from_config
return layer_from_config(config, custom_objects=custom_objects)
File "/Users/alex/anaconda/envs/Keras/lib/python3.5/site-packages/keras/utils/layer_utils.py", line 42, in layer_from_config
return layer_class.from_config(config['config'])
File "/Users/alex/anaconda/envs/Keras/lib/python3.5/site-packages/keras/models.py", line 1090, in from_config
layer = get_or_create_layer(conf)
File "/Users/alex/anaconda/envs/Keras/lib/python3.5/site-packages/keras/models.py", line 1069, in get_or_create_layer
layer = layer_from_config(layer_data)
File "/Users/alex/anaconda/envs/Keras/lib/python3.5/site-packages/keras/utils/layer_utils.py", line 40, in layer_from_config
custom_objects=custom_objects)
File "/Users/alex/anaconda/envs/Keras/lib/python3.5/site-packages/keras/layers/core.py", line 682, in from_config
function = func_load(config['function'], globs=globs)
File "/Users/alex/anaconda/envs/Keras/lib/python3.5/site-packages/keras/utils/generic_utils.py", line 189, in func_load
closure=closure)
TypeError: arg 4 (defaults) must be None or tuple
and here is the part of my model that causes the issue. The first Lambda layer is the problem (without it the model loads fine):
model = Sequential()
model.add(Cropping2D(cropping=((cr_top, cr_bot), (cr_lef, cr_rig)),
                     input_shape=(in_row, in_col, ch)))
model.add(Lambda(tf.image.resize_images,
                 output_shape=(res_row, res_col, ch),
                 arguments={'size': (res_row, res_col)}))
model.add(Lambda(lambda x: x/127.5 - 1.,
                 output_shape=(res_row, res_col, ch)))
model.add(Convolution2D(24, 5, 5, subsample=(2, 2), border_mode="valid"))
model.add(BatchNormalization(axis=3, momentum=0.99))
model.add(ELU())
About this issue
- Original URL
- State: closed
- Created 7 years ago
- Comments: 29 (3 by maintainers)
Try using the custom_objects argument of load_model (or model_from_json). For your case,
load_model("model_path", custom_objects={"tf": tf})
after importing tensorflow as tf would do the job.

@lauphedo do you define tf inside the Lambda function? I found that if I import tensorflow as tf inside the function used by the Lambda layer, it works fine.
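Both suggestions above come down to the same mechanism: Keras serializes a Lambda's function as raw bytecode (via marshal), which records global names like tf by name only, and custom_objects re-supplies those names at load time. Here is a stdlib-only sketch of that mechanism; it mimics, but is not, Keras's actual func_dump/func_load helpers, and the name resize is an illustrative stand-in:

```python
import marshal
import types

# Stand-in for a global the Lambda's function relies on (like `tf` above);
# this whole setup is illustrative, not Keras's real code path.
resize = abs
fn = lambda x: resize(x)

# marshal stores only the bytecode; the reference to the global `resize`
# is kept by name only, not as an object.
payload = marshal.dumps(fn.__code__)

# Rebuilding the function with empty globals gives a NameError on call...
broken = types.FunctionType(marshal.loads(payload), {})
try:
    broken(-1)
except NameError:
    print("name 'resize' is not defined")

# ...while re-supplying the name at load time (which is what
# custom_objects={"tf": tf} does for `tf`) makes it work again.
fixed = types.FunctionType(marshal.loads(payload), {"resize": abs})
print(fixed(-1))  # -> 1
```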
I got the same error when loading a model that contains a Lambda layer.
load_model("model_path", custom_objects={"tf": tf}) did not work for me with Keras 2.0.6 and Python 3.6.
The problem is probably the extra Python lambda that you’re using. Just do this instead:
model.add(Lambda(resize_normalize, input_shape=(80,318,3), output_shape=(66, 200, 3)))
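The reason the extra wrapper lambda hurts: only the wrapper's bytecode gets serialized, and it refers to the named helper purely by name, while passing the named function directly serializes that function's own logic. A stdlib-only sketch, where resize_normalize is a hypothetical stand-in for the real resize-and-normalize helper and marshal/types.FunctionType mimic Keras's serialization rather than reproduce its exact code path:

```python
import marshal
import types

def resize_normalize(x):
    # Hypothetical stand-in for the real resize-and-normalize helper.
    return x / 127.5 - 1.0

wrapper = lambda x: resize_normalize(x)

# The wrapper's bytecode refers to resize_normalize by name only, so that
# name must exist again in the loading process or calls will fail.
print(wrapper.__code__.co_names)  # ('resize_normalize',)

# The named function's own bytecode is self-contained (just constants),
# so it survives a marshal round trip even with no globals at all.
restored = types.FunctionType(
    marshal.loads(marshal.dumps(resize_normalize.__code__)), {})
print(restored(255.0))  # -> 1.0
```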
@lauphedo That’s another known issue, see #5088. One workaround is to import inside the lambda function.
(maybe) similar problem: trying to load a model that contains:
model_nspect2mask.add(TimeDistributed(Lambda(lambda x: tf.log(tf.abs(x))), input_shape = (None,feat_num)))
and get this error:
I am working with keras-2.2.0 and python 3.6.4. @adwin5, if I import tensorflow as tf inside the Repeat_layer, I get the error "can't pickle module objects" when I save the model via model.save or model.to_json().

@1524045patrick I import tensorflow as tf outside Repeat_layer, i.e., in global scope, and load my model via load_model("model_path", custom_objects={"tf": tf}); it works for me.

Is this issue fixed? Same issue in TF 2.x.
Yes, as @antonimmo says, using custom_objects works, and it works for anything else that load_model complains about not finding, e.g. custom loss functions.

Using tf version 2.4 I still have this problem.
@adwin5, @allanzelener, @lauphedo,
for me that workaround did not work when I use the Lambda layer like this:
and my function looks like this. I tried importing both tensorflow and the keras backend explicitly inside the function:
When I load my model I get the following traceback:
@adwin5, @allanzelener Wow, this was fast and really helpful! Thank you very much! Before, I couldn’t understand #5088
Interestingly, load_weights() works fine without the mentioned fix…
I have a very similar error with a Lambda layer. I am working with complex numbers and at some point need to take the real part of my tensors. Here is a minimal failing example:
Which fails with:
The error is very cryptic because it points only to Keras lines, whereas the code areas referred to are actually in TensorFlow… That’s a first problem, but also, as you can see, using the custom_objects kwarg, like @antonimmo suggested, doesn’t help fix the problem. Any ideas on how to solve this? (As a side note, it’s not actually too much of a problem since I can just recreate the model and load the weights, but it would be more convenient if I could just load the model.)
Thanks a lot for commenting on this question. Meanwhile, I fixed it.
def architecture():
… …
I’ve run into the same issue; here’s a small gist that reproduces the error.
A different error occurs when deserializing from yaml instead of json. I couldn’t get a Lambda wrapper for K.resize_images to work either.
Edit: The JSON error seems to be because defaults is serialized as [None] but is expected to be either None or a tuple, not a list. I’ve added PR #5350 to resolve this.

The NameError issue is referenced in #5088 and has a workaround: simply import what’s needed in a wrapped function. This gist fixes my example above.
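The defaults diagnosis is easy to reproduce with the stdlib alone: types.FunctionType, which func_load ultimately calls, accepts None or a tuple as its defaults argument and rejects the list that a JSON round trip produces, raising the TypeError seen in the original traceback. A minimal sketch (f is a toy function, not Keras code; the exact error wording varies by Python version):

```python
import types

def f(x, size=None):
    # Toy function with a keyword default, like the serialized Lambda's.
    return x

# The 4th positional argument of FunctionType is `defaults`; a list (what
# a JSON round trip makes of a tuple) triggers a TypeError like the one
# in the issue's traceback.
try:
    types.FunctionType(f.__code__, globals(), f.__name__, [None])
except TypeError as e:
    print(e)  # wording differs across Python versions

# Coercing back to a tuple, as the PR does, lets the function be rebuilt.
g = types.FunctionType(f.__code__, globals(), f.__name__, (None,))
print(g(3))  # -> 3
```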