keras: ValueError: Error when checking target: expected dense_14 to have shape (None, 2) but got array with shape (928, 1)

I am working through the Keras transfer learning tutorial here: https://blog.keras.io/building-powerful-image-classification-models-using-very-little-data.html, using Keras with a TensorFlow backend. My data is made up of training data (499 and 443 images of class 0 and 1) and validation data (101 and 103 images of class 0 and 1).

When I try to run the block of code below, I receive the error

ValueError: Error when checking target: expected dense_14 to have shape (None, 2) but got array with shape (928, 1)

My understanding of the structure is that my input of shape (928, 4, 4, 512) sets the input shape of the flatten_9 layer to (None, 8192), but I am confused as to why this causes an error at the dense_14 layer, since the sizes of the hidden layers are already defined.

my model configuration is

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_9 (Flatten)          (None, 8192)              0         
_________________________________________________________________
dense_13 (Dense)             (None, 256)               2097408   
_________________________________________________________________
dropout_7 (Dropout)          (None, 256)               0         
_________________________________________________________________
dense_14 (Dense)             (None, 2)                 514       
=================================================================
def train_top_model():
    train_data = np.load(open('bottleneck_features_train.npy','rb'))
    train_labels = np.array(
        [0] * 499 + [1] * 443)

    validation_data = np.load(open('bottleneck_features_validation.npy','rb'))
    validation_labels = np.array(
        [0] * 101 + [1] * 103)
    
    model = Sequential()
    model.add(Flatten(input_shape=train_data.shape[1:]))
    model.add(Dense(256, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(2, activation='sigmoid'))

    model.compile(optimizer='rmsprop',
                  loss='binary_crossentropy', metrics=['accuracy'])

    model.fit(train_data, train_labels,
              epochs=epochs,
              batch_size=batch_size,
              validation_data=(validation_data, validation_labels))
    model.save_weights(top_model_weights_path)

About this issue

  • Original URL
  • State: closed
  • Created 7 years ago
  • Reactions: 7
  • Comments: 22

Most upvoted comments

Use "keras.utils.np_utils.to_categorical" to convert your train_labels to categorical one-hot vectors.
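For the binary case in this question, the conversion produces one one-hot row per label, so the target shape becomes (N, 2) and matches the Dense(2) output. A minimal numpy illustration of what to_categorical produces (np.eye stands in for the Keras helper here, so the sketch runs without TensorFlow):

```python
import numpy as np

# Integer class labels, as in the question's train_labels array.
labels = np.array([0, 1, 1, 0])          # shape (4,)

# keras.utils.np_utils.to_categorical(labels) builds one-hot rows;
# indexing an identity matrix is the numpy equivalent.
one_hot = np.eye(2)[labels]              # shape (4, 2)
```

With labels of shape (N,), the one-hot result has shape (N, 2), which is exactly what a final Dense(2) layer expects as its target.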

In my case, the loss parameter when compiling the model was specified as "sparse_categorical_crossentropy". When I changed it to "categorical_crossentropy", the error was fixed.
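The two losses compute the same quantity and differ only in the label format they expect: sparse_categorical_crossentropy takes integer labels of shape (N,), while categorical_crossentropy takes one-hot labels of shape (N, num_classes). A hand-rolled numpy sketch (for illustration only, not the Keras implementation) shows they agree when the labels describe the same targets:

```python
import numpy as np

probs = np.array([[0.9, 0.1],
                  [0.2, 0.8]])                 # softmax outputs for 2 samples

int_labels = np.array([0, 1])                  # sparse_categorical_crossentropy format
one_hot = np.eye(2)[int_labels]                # categorical_crossentropy format

# Sparse form: pick out the probability of the true class per sample.
sparse_loss = -np.mean(np.log(probs[np.arange(2), int_labels]))
# Categorical form: dot each one-hot row with the log-probabilities.
cat_loss = -np.mean(np.sum(one_hot * np.log(probs), axis=1))
```

Mixing the formats (one-hot labels with the sparse loss, or integer labels with the categorical loss) is what triggers shape errors like the one in this issue.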

I have a similar problem. This is my code; my training dataset is images belonging to 10 classes, and my test/validation dataset is 7542 images belonging to 10 classes.

import glob
import os
import matplotlib.pyplot as plt

from keras.applications.inception_v3 import InceptionV3, preprocess_input
from keras.preprocessing.image import ImageDataGenerator
from keras.optimizers import SGD
from keras.models import Model
from keras.layers import Dense, GlobalAveragePooling2D

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

def get_num_files(path):
    if not os.path.exists(path):
        return 0
    return sum([len(files) for r, d, files in os.walk(path)])

def get_num_subfolders(path):
    if not os.path.exists(path):
        return 0
    return sum([len(files) for r, d, files in os.walk(path)])

def create_img_generator():
    return ImageDataGenerator(
        preprocessing_function=preprocess_input,
        rotation_range=30,
        width_shift_range=0.2,
        height_shift_range=0.2,
        shear_range=0.2,
        zoom_range=0.2,
        horizontal_flip=True)

Image_width, Image_height = 299, 299
Training_Epochs = 2
Batch_Size = 32
Number_FC_Neurons = 1024

train_dir = r'C:\Users\User\Desktop\tomato\train'
validate_dir = r'C:\Users\User\Desktop\tomato\validate'

num_train_samples = get_num_files(train_dir)
num_classes = get_num_subfolders(train_dir)
num_validate_samples = get_num_files(validate_dir)

num_epoch = Training_Epochs
batch_size = Batch_Size

train_image_gen = create_img_generator()
test_image_gen = create_img_generator()

train_generator = train_image_gen.flow_from_directory(
    train_dir,
    target_size=(Image_width, Image_height),
    batch_size=batch_size,
    seed=42)

validation_generator = train_image_gen.flow_from_directory(
    validate_dir,
    target_size=(Image_width, Image_height),
    batch_size=batch_size,
    seed=42)

InceptionV3_base_model = InceptionV3(weights='imagenet', include_top=False)
print('Inception v3 base model without last FC loaded')
x = InceptionV3_base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(Number_FC_Neurons, activation='relu')(x)
predictions = Dense(num_classes, activation='softmax')(x)

model = Model(inputs=InceptionV3_base_model.input, outputs=predictions)

print(model.summary())

Layer (type)                    Output Shape        Param #     Connected to
=================================================================
(InceptionV3 layer listing omitted: the Output Shape column was truncated in the original paste)
...
global_average_pooling2d_5 (Glo (None, 2048)        0           mixed10[0][0]
_________________________________________________________________
dense_9 (Dense)                 (None, 1024)        2098176     global_average_pooling2d_5[0][0]
_________________________________________________________________
dense_10 (Dense)                (None, 10618)       10883450    dense_9[0][0]
=================================================================
Total params: 34,784,410
Trainable params: 34,749,978
Non-trainable params: 34,432

history_transfer_learning = model.fit_generator(
    train_generator,
    epochs=num_epoch,
    steps_per_epoch=num_train_samples,
    validation_data=validation_generator,
    validation_steps=num_validate_samples,
    class_weight='auto')

ValueError: Error when checking target: expected dense_10 to have shape (10618,) but got array with shape (10,)

If you are using one-hot encoded labels (output variables), use 'categorical_crossentropy'; if you are using integer labels, use 'sparse_categorical_crossentropy'. In my case, this helped.

ValueError: Error when checking target: expected dense_14 to have shape (None, 2) but got array with shape (928, 1)

My solution: change model.add(Dense(2, activation='sigmoid')) to model.add(Dense(1, activation='sigmoid')). If the labels have shape (928, 1), set the Dense units to 1; in general, for labels of shape (x, y), set the Dense units to y.

Yes, during my troubleshooting I had mistakenly specified the last dense layer as 2 when it should have been 1. This resolved the issue.
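In other words, the final Dense layer's units must match the trailing dimension of the label array. A tiny shape check makes the rule concrete (target_matches_output is a hypothetical helper written for this sketch, not part of Keras):

```python
import numpy as np

def target_matches_output(labels, units):
    """Return True when a label array is compatible with a final
    Dense(units) layer: (N,) or (N, 1) labels pair with units == 1,
    while (N, k) one-hot labels pair with units == k."""
    trailing = 1 if labels.ndim == 1 else labels.shape[1]
    return trailing == units

binary = np.zeros((928, 1))                      # integer 0/1 labels, as in the question
one_hot = np.eye(2)[np.zeros(928, dtype=int)]    # to_categorical-style labels, shape (928, 2)

# Dense(2) rejects the (928, 1) labels -- the error in this issue --
# while Dense(1) accepts them; one-hot labels instead need Dense(2).
```

So the question's code has two consistent fixes: keep the integer labels and use Dense(1), or one-hot encode the labels and keep Dense(2).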

@camhodges101 Can you please specify what you did again? As I am facing the exact same issue

I had the same issue. It was due to the one-hot encoding that the keras.utils.to_categorical(labels) line did. My labels were ASCII values of {a, b, c, …, z}, i.e. {97, 98, 99, …, 122}, so the one-hot vectors were of size 123.

To solve this I subtracted 97 from each label.
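A quick numpy sketch of that remapping (the labels array here is a made-up three-letter sample, and np.eye stands in for to_categorical, which sizes its vectors by the largest label plus one):

```python
import numpy as np

# Hypothetical labels: ASCII codes of lowercase letters.
labels = np.array([ord(c) for c in "zab"])       # [122, 97, 98]

# Naive one-hot: with raw ASCII labels the vectors span 0..122,
# i.e. 123 classes instead of 26.
naive = np.eye(labels.max() + 1)[labels]         # shape (3, 123)

# Fix: shift the labels so they start at 0 before one-hot encoding.
shifted = labels - ord('a')                      # [25, 0, 1]
one_hot = np.eye(26)[shifted]                    # shape (3, 26)
```

Using ord('a') instead of a literal 97 makes the intent of the shift explicit.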