
While retraining a pretrained model, getting: ValueError: Input 0 is incompatible with layer flatten_1: expected min_ndim=3, found ndim=2

Data Science Asked on January 2, 2022

My model summary is:

Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_1 (Conv2D)            (None, 62, 62, 32)        896       
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 31, 31, 32)        0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 29, 29, 32)        9248      
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 14, 14, 32)        0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 6272)              0         
_________________________________________________________________
dense_1 (Dense)              (None, 128)               802944    
_________________________________________________________________
dense_2 (Dense)              (None, 4)                 516       

While re-training this model using the function below:

[image of the re-training code; not included in this copy]

I'm facing this error:

ValueError: Input 0 is incompatible with layer flatten_1: expected min_ndim=3, found ndim=2

One Answer

The reason for this error is that you are trying to flatten an output that is already flat. The output of your model has shape (batch_size, 4), which cannot be flattened further. To fix the error, simply remove the extra Flatten layer from your code.
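
For illustration, this is the pattern that raises the error (assuming model is the loaded pretrained model and a Keras version whose Flatten layer still enforces min_ndim=3):

from keras.layers import Flatten

x = model.output   # shape (None, 4): already 2-D, i.e. flat
x = Flatten()(x)   # ValueError: expected min_ndim=3, found ndim=2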

However, when fine-tuning a pretrained model, you should first remove the top layers of that model before adding your own, because those layers were trained for classification on a different task than yours.

If I were you, I'd drop the last two Dense layers of your pretrained model:

from keras.layers import Dense, Dropout   # or tensorflow.keras.layers, to match the rest of your code
from keras.models import Model

# code same as before ...

x = model.layers[-3].output  # output of the Flatten layer -- don't add Flatten again

# add the new fully-connected head
for fc in fc_layers:
    x = Dense(fc, activation='relu')(x)
    x = Dropout(dropout)(x)

predictions = Dense(num_classes, activation='softmax')(x)

finetune_model = Model(inputs=model.input, outputs=predictions)
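
If you only want to train the new head at first (a common choice when fine-tuning; the optimizer and loss below are just placeholder settings), you can freeze the pretrained layers before compiling:

# freeze the pretrained convolutional base so only the new Dense head is updated
for layer in model.layers:
    layer.trainable = False

finetune_model.compile(optimizer='adam',
                       loss='categorical_crossentropy',
                       metrics=['accuracy'])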

Answered by Djib2011 on January 2, 2022
