keras: weights saved/loaded incorrectly when weights frozen
Given this code:
from keras.applications.vgg16 import VGG16
from keras.layers import Input, Flatten, Dense
from keras.models import Model

vgg = VGG16(weights='imagenet', include_top=False, input_shape=(360, 480, 3))
# freeze everything except the last four VGG layers
for layer in vgg.layers[:-4]:
    layer.trainable = False
inputs = Input(shape=(360, 480, 3))
encoder = vgg(inputs)
encoder = Flatten()(encoder)
encoder = Dense(16)(encoder)
encoder = Model(inputs, encoder)
in1 = Input(shape=(360, 480, 3))
comparator = Model(in1, encoder(in1))
comparator.save_weights('h.h5')
comparator.load_weights('h.h5')
I get this error:
---------------------------------------------------------------------------
IndexError Traceback (most recent call last)
<ipython-input-54-a45f35902442> in <module>()
16
17 comparator.save_weights('h.h5')
---> 18 comparator.load_weights('h.h5')
c:\users\seanh\appdata\local\programs\python\python36\lib\site-packages\keras\engine\network.py in load_weights(self, filepath, by_name, skip_mismatch, reshape)
1178 else:
1179 saving.load_weights_from_hdf5_group(
-> 1180 f, self.layers, reshape=reshape)
1181
1182 def _updated_config(self):
c:\users\seanh\appdata\local\programs\python\python36\lib\site-packages\keras\engine\saving.py in load_weights_from_hdf5_group(f, layers, reshape)
914 original_keras_version,
915 original_backend,
--> 916 reshape=reshape)
917 if len(weight_values) != len(symbolic_weights):
918 raise ValueError('Layer #' + str(k) +
c:\users\seanh\appdata\local\programs\python\python36\lib\site-packages\keras\engine\saving.py in preprocess_weights_for_loading(layer, weights, original_keras_version, original_backend, reshape)
555 weights = convert_nested_time_distributed(weights)
556 elif layer.__class__.__name__ in ['Model', 'Sequential']:
--> 557 weights = convert_nested_model(weights)
558
559 if original_keras_version == '1':
c:\users\seanh\appdata\local\programs\python\python36\lib\site-packages\keras\engine\saving.py in convert_nested_model(weights)
531 weights=weights[:num_weights],
532 original_keras_version=original_keras_version,
--> 533 original_backend=original_backend))
534 weights = weights[num_weights:]
535
c:\users\seanh\appdata\local\programs\python\python36\lib\site-packages\keras\engine\saving.py in preprocess_weights_for_loading(layer, weights, original_keras_version, original_backend, reshape)
555 weights = convert_nested_time_distributed(weights)
556 elif layer.__class__.__name__ in ['Model', 'Sequential']:
--> 557 weights = convert_nested_model(weights)
558
559 if original_keras_version == '1':
c:\users\seanh\appdata\local\programs\python\python36\lib\site-packages\keras\engine\saving.py in convert_nested_model(weights)
543 weights=weights[:num_weights],
544 original_keras_version=original_keras_version,
--> 545 original_backend=original_backend))
546 weights = weights[num_weights:]
547 return new_weights
c:\users\seanh\appdata\local\programs\python\python36\lib\site-packages\keras\engine\saving.py in preprocess_weights_for_loading(layer, weights, original_keras_version, original_backend, reshape)
672 str(weights[0].size) + '. ')
673 weights[0] = np.reshape(weights[0], layer_weights_shape)
--> 674 elif layer_weights_shape != weights[0].shape:
675 weights[0] = np.transpose(weights[0], (3, 2, 0, 1))
676 if layer.__class__.__name__ == 'ConvLSTM2D':
IndexError: list index out of range
But if the trainable = False part is taken out:
vgg = VGG16(weights='imagenet', include_top=False, input_shape=(360, 480, 3))
# for layer in vgg.layers[:-4]:
#     layer.trainable = False
inputs = Input(shape=(360, 480, 3))
encoder = vgg(inputs)
encoder = Flatten()(encoder)
encoder = Dense(16)(encoder)
encoder = Model(inputs, encoder)
in1 = Input(shape=(360, 480, 3))
comparator = Model(in1, encoder(in1))
comparator.save_weights('h.h5')
comparator.load_weights('h.h5')
No error occurs and it works. It also works if the comparator is omitted and just the encoder is saved and loaded:
vgg = VGG16(weights='imagenet', include_top=False, input_shape=(360, 480, 3))
for layer in vgg.layers[:-4]:
    layer.trainable = False
inputs = Input(shape=(360, 480, 3))
encoder = vgg(inputs)
encoder = Flatten()(encoder)
encoder = Dense(16)(encoder)
encoder = Model(inputs, encoder)
encoder.save_weights('h.h5')
encoder.load_weights('h.h5')
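A quick sanity check (a sketch using numpy; get_weights() returns plain numpy arrays) can be used to confirm whether the encoder-only round trip really preserves the values:
import numpy as np

before = encoder.get_weights()
encoder.save_weights('h.h5')
encoder.load_weights('h.h5')
after = encoder.get_weights()
# every weight array should come back identical after the round trip
assert len(before) == len(after)
assert all(np.array_equal(b, a) for b, a in zip(before, after))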
It seems that in the failing case the non-trainable weights are not being saved or loaded correctly, but I don't see why that would be.
To put it as a question: what is the cause of this error, and how can I work around it without dropping the comparator?
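One possible workaround, building only on the observation above that the encoder by itself round-trips cleanly (the file name encoder.h5 below is just an example, not a confirmed fix): since the comparator adds no weights of its own beyond the nested encoder, save and restore the encoder instead of the comparator.
# Workaround sketch: persist the nested encoder rather than the wrapper model.
encoder.save_weights('encoder.h5')

# ...later, after rebuilding the encoder with the same architecture and freezing...
encoder.load_weights('encoder.h5')      # loads without the IndexError
comparator = Model(in1, encoder(in1))   # the wrapper picks up the restored weights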
About this issue
- State: closed
- Created 6 years ago
- Comments: 19
I cannot make the code provided by @raymond-yuan work; it still ends with an IndexError. Would you know how to fix it?
Keras 2.2.4, TensorFlow 1.13.0-rc0, Python 3.6
@Sean-Hastings This issue should not be closed because, although the problem might be solved in the TensorFlow Keras fork, it is still an issue in the original Keras code.
I have the same versions as zikaadam and the same issue. I can’t load models that were saved with frozen weights using tf.keras.
I'm using TensorFlow as well; can you try doing it with tf.keras? That doesn't seem to have any issues.
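As an illustration of that suggestion (a minimal sketch, not the commenter's original snippet; it assumes the tf.keras API bundled with TensorFlow 1.13+), the same construction under tf.keras would look like this:
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Input, Flatten, Dense
from tensorflow.keras.models import Model

vgg = VGG16(weights='imagenet', include_top=False, input_shape=(360, 480, 3))
for layer in vgg.layers[:-4]:
    layer.trainable = False
inputs = Input(shape=(360, 480, 3))
x = Flatten()(vgg(inputs))
x = Dense(16)(x)
encoder = Model(inputs, x)
in1 = Input(shape=(360, 480, 3))
comparator = Model(in1, encoder(in1))
comparator.save_weights('h.h5')
comparator.load_weights('h.h5')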