
I have a set of 120x120 input images with 3 channels. I want to build a basic CNN to predict the value of each pixel. I have 2 doubts. One is regarding the last layer - should it be a Dense layer, or a Conv2D?

from tensorflow import keras
from tensorflow.keras.layers import (Conv2D, BatchNormalization, LeakyReLU,
                                     MaxPooling2D, Flatten, Dense)

pixel = 120  # images are 120x120

model_CNN4 = keras.models.Sequential([
    Conv2D(32, (3, 3), input_shape=[pixel, pixel, 3]),
    BatchNormalization(),
    LeakyReLU(alpha=0.1),  # You can adjust the alpha parameter for the leaky ReLU
    MaxPooling2D(),

    Conv2D(64, (2, 2)),
    BatchNormalization(),
    LeakyReLU(alpha=0.1),
    MaxPooling2D(),

    Conv2D(64, (2, 2)),
    LeakyReLU(alpha=0.1),

    Flatten(),
    Dense(200),
    BatchNormalization(),
    LeakyReLU(alpha=0.1),

    Dense(pixel * pixel, activation='linear')
])

The other question is: assuming the model is correct - or rather, that it makes sense - it only runs if my y_train is of shape:

y_train shape : [number of samples, 120*120]

But shouldn't it instead be:

y_train shape: [number of samples, 120, 120, 1]

Thanks

1 Answer


The last layer should match the dimensionality of your target. A Dense layer returns a 1D output per sample, whereas conv layers return 2D feature maps.
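As a concrete aside (the helper functions here are my own, not from the question): tracing the spatial size through the question's network, with 'valid' convolutions and Keras's default 2x2 max pooling, shows what Flatten() actually sees before the Dense layers:

```python
def conv_out(n, k):
    # 'valid' Conv2D with stride 1: output size shrinks by k - 1
    return n - k + 1

def pool_out(n, p=2):
    # MaxPooling2D default: pool size 2, stride 2
    return n // p

n = 120
n = conv_out(n, 3)   # after Conv2D(32, (3, 3)): 118
n = pool_out(n)      # after MaxPooling2D():     59
n = conv_out(n, 2)   # after Conv2D(64, (2, 2)): 58
n = pool_out(n)      # after MaxPooling2D():     29
n = conv_out(n, 2)   # after Conv2D(64, (2, 2)): 28

print(n, n * n * 64)  # 28 50176 -- the Flatten() output length
```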

Based on your question, you want to predict a value for every pixel in the image, right? Then you have two options:

  1. A Dense layer, the way you already do it: you basically flatten the image into a vector, which is why you specify Dense(pixel * pixel, activation='linear') and why y_train must then be of shape [None, pixel * pixel] (so that the output of the dense layer matches your y_train).

  2. Use a Reshape layer to bring the output into the desired shape:

    # Reshape must also be imported from tensorflow.keras.layers
    model_CNN4 = keras.models.Sequential([
        Conv2D(32, (3, 3), input_shape=[pixel, pixel, 3]),
        BatchNormalization(),
        LeakyReLU(alpha=0.1),  # You can adjust the alpha parameter for the leaky ReLU
        MaxPooling2D(),

        Conv2D(64, (2, 2)),
        BatchNormalization(),
        LeakyReLU(alpha=0.1),
        MaxPooling2D(),

        Conv2D(64, (2, 2)),
        LeakyReLU(alpha=0.1),

        Flatten(),
        Dense(200),
        BatchNormalization(),
        LeakyReLU(alpha=0.1),

        Dense(pixel * pixel, activation='linear'),
        Reshape([pixel, pixel, 1])
    ])
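Whichever option you pick, y_train just has to match the model's output shape. A minimal numpy sketch (the zero array is a dummy placeholder for real targets) of preparing the targets for each option:

```python
import numpy as np

pixel = 120
n_samples = 8  # hypothetical number of samples

# Per-pixel targets stored as images: shape [n_samples, pixel, pixel, 1]
y = np.zeros((n_samples, pixel, pixel, 1), dtype=np.float32)

# Option 1: flatten the targets to match Dense(pixel * pixel)
y_flat = y.reshape(n_samples, pixel * pixel)
print(y_flat.shape)  # (8, 14400)

# Option 2: keep them 4D to match the Reshape([pixel, pixel, 1]) output
print(y.shape)       # (8, 120, 120, 1)
```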

Max