Keras: Error when loading combined MobileNet model

I used the code below to combine two MobileNet models. After combining them I save the result as combined_test.hdf5. Keras version: 2.2.0, using the TensorFlow backend. TensorFlow version: 1.8.0.

import keras
from keras.applications import mobilenet
from keras.layers import Input
from keras.models import Model, load_model

# Load the two pre-trained branch models.
model_A = load_model('mobilenet_A.hdf5', custom_objects={'relu6': mobilenet.relu6})
model_B = load_model('mobilenet_B.hdf5', custom_objects={'relu6': mobilenet.relu6})

# Feed a shared input to both branches and average their predictions.
inputs = Input(shape=(224, 224, 3))
pred_A = model_A(inputs)
pred_B = model_B(inputs)
pred_average = keras.layers.Average()([pred_A, pred_B])

model_combined = Model(inputs=inputs, outputs=pred_average)
model_combined.save('combined_test.hdf5')

# Reloading the saved combined model fails.
model_x = load_model('combined_test.hdf5', custom_objects={'relu6': mobilenet.relu6})

When I load the saved model, I get the error below:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-23-3df9104863a8> in <module>()
      1 model_x = load_model('combined_test.hdf5', 
----> 2                      custom_objects={'relu6': mobilenet.relu6})

~/virtualenvs/AILab/lib/python3.5/site-packages/keras/engine/saving.py in load_model(filepath, custom_objects, compile)
    262 
    263         # set weights
--> 264         load_weights_from_hdf5_group(f['model_weights'], model.layers)
    265 
    266         if compile:

~/virtualenvs/AILab/lib/python3.5/site-packages/keras/engine/saving.py in load_weights_from_hdf5_group(f, layers, reshape)
    914                                                        original_keras_version,
    915                                                        original_backend,
--> 916                                                        reshape=reshape)
    917         if len(weight_values) != len(symbolic_weights):
    918             raise ValueError('Layer #' + str(k) +

~/virtualenvs/AILab/lib/python3.5/site-packages/keras/engine/saving.py in preprocess_weights_for_loading(layer, weights, original_keras_version, original_backend, reshape)
    555         weights = convert_nested_time_distributed(weights)
    556     elif layer.__class__.__name__ in ['Model', 'Sequential']:
--> 557         weights = convert_nested_model(weights)
    558 
    559     if original_keras_version == '1':

~/virtualenvs/AILab/lib/python3.5/site-packages/keras/engine/saving.py in convert_nested_model(weights)
    543                     weights=weights[:num_weights],
    544                     original_keras_version=original_keras_version,
--> 545                     original_backend=original_backend))
    546                 weights = weights[num_weights:]
    547         return new_weights

~/virtualenvs/AILab/lib/python3.5/site-packages/keras/engine/saving.py in preprocess_weights_for_loading(layer, weights, original_keras_version, original_backend, reshape)
    555         weights = convert_nested_time_distributed(weights)
    556     elif layer.__class__.__name__ in ['Model', 'Sequential']:
--> 557         weights = convert_nested_model(weights)
    558 
    559     if original_keras_version == '1':

~/virtualenvs/AILab/lib/python3.5/site-packages/keras/engine/saving.py in convert_nested_model(weights)
    531                     weights=weights[:num_weights],
    532                     original_keras_version=original_keras_version,
--> 533                     original_backend=original_backend))
    534                 weights = weights[num_weights:]
    535 

~/virtualenvs/AILab/lib/python3.5/site-packages/keras/engine/saving.py in preprocess_weights_for_loading(layer, weights, original_keras_version, original_backend, reshape)
    673             weights[0] = np.reshape(weights[0], layer_weights_shape)
    674         elif layer_weights_shape != weights[0].shape:
--> 675             weights[0] = np.transpose(weights[0], (3, 2, 0, 1))
    676             if layer.__class__.__name__ == 'ConvLSTM2D':
    677                 weights[1] = np.transpose(weights[1], (3, 2, 0, 1))

~/virtualenvs/AILab/lib/python3.5/site-packages/numpy/core/fromnumeric.py in transpose(a, axes)
    548 
    549     """
--> 550     return _wrapfunc(a, 'transpose', axes)
    551 
    552 

~/virtualenvs/AILab/lib/python3.5/site-packages/numpy/core/fromnumeric.py in _wrapfunc(obj, method, *args, **kwds)
     55 def _wrapfunc(obj, method, *args, **kwds):
     56     try:
---> 57         return getattr(obj, method)(*args, **kwds)
     58 
     59     # An AttributeError occurs if the object does not have

ValueError: axes don't match array

Most upvoted comments

I believe this is related to #10784. When you have a nested model, there is a bug in how trainable and non-trainable weights (from BatchNorm layers?) are loaded.

I just ran into this issue. I have a Siamese model that consists of two branch models and a head model. If I use a custom branch model that doesn’t contain BatchNorm, saving and loading the model works fine. But if I use something like ResNet18, loading the saved model fails with “axes don't match array”.
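Until the underlying bug is fixed, one way to sidestep it (a sketch based on the code in the question, not a confirmed fix) is to avoid deserializing the nested combined model at all: since the two branch files load without errors on their own, the averaged ensemble can simply be rebuilt from them whenever it is needed. The file names and the build_combined helper below are taken from / invented for this example.

import keras
from keras.applications import mobilenet
from keras.layers import Input
from keras.models import Model, load_model

def build_combined(path_a='mobilenet_A.hdf5', path_b='mobilenet_B.hdf5'):
    """Rebuild the averaged ensemble from the two branch files instead of
    loading the saved combined model, which triggers the nested-weight bug."""
    model_a = load_model(path_a, custom_objects={'relu6': mobilenet.relu6})
    model_b = load_model(path_b, custom_objects={'relu6': mobilenet.relu6})
    inputs = Input(shape=(224, 224, 3))
    outputs = keras.layers.Average()([model_a(inputs), model_b(inputs)])
    return Model(inputs=inputs, outputs=outputs)

# Use this in place of load_model('combined_test.hdf5', ...).
model_x = build_combined()

The trade-off is that the combined topology is reconstructed in code rather than read back from combined_test.hdf5, so the branch .hdf5 files must stay available.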