
I cannot find anywhere how exactly backpropagation is done in Keras. Let me explain:

Let's say I have a network:

from keras.layers import Input, Conv2D, Flatten, Dense
from keras.models import Model
import keras

inputs = Input(shape=(X, X, Y))
x = Conv2D(32, (3, 3), padding="same")(inputs)
x = Conv2D(64, (3, 3), padding="same")(x)
x = Conv2D(128, (3, 3), padding="same")(x)
x = Conv2D(64, (3, 3), padding="same")(x)
x = Flatten()(x)                  # Flatten takes no size argument
Output = Dense(1024)(x)           # the layer sizes belong on Dense layers
Output = Dense(6)(Output)
model = Model(inputs, Output)
model.compile(loss="mean_squared_error",
              optimizer=keras.optimizers.Adam(),
              metrics=['accuracy'])
model.fit(trainingData, trainingLabels)

The output of the last layer is compared to trainingLabels, the mean squared error is computed, and backpropagation happens based on the value of the mean squared error.

However, what if I wanted to do something more? For example, I want to try every permutation of the output vector, and the one that results in the minimal mean squared error should be treated as the output, so that backpropagation happens based on the permutation with the least error.

Is something like this possible in Keras? If so, how can I achieve it?

Darlyn

1 Answer


The loss argument of the model.compile method accepts a Python function. You could compute the minimum over the set of permutations in a custom loss function:

def custom_loss(y_true, y_predicted):
    ''' code here '''
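
As a concrete illustration, here is a minimal sketch of such a loss written with TensorFlow operations, assuming the 6-unit output from your model. The function name permutation_min_mse and the precomputed PERMS table are illustrative choices of mine, not part of the Keras API:

import itertools
import tensorflow as tf

# All 720 permutations of the 6 output indices, precomputed once.
PERMS = tf.constant(list(itertools.permutations(range(6))), dtype=tf.int32)

def permutation_min_mse(y_true, y_pred):
    # Gather every permutation of the predicted vector: (batch, 720, 6).
    permuted = tf.gather(y_pred, PERMS, axis=-1)
    # Squared error against the targets, broadcast over all permutations.
    se = tf.square(tf.expand_dims(y_true, 1) - permuted)
    # Per-permutation MSE for each sample: (batch, 720).
    mse = tf.reduce_mean(se, axis=-1)
    # Keep only the best permutation per sample; the gradient flows
    # through the min to the selected permutation only.
    return tf.reduce_min(mse, axis=-1)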

and then pass it to model.compile:

model.compile(loss=custom_loss, 
              optimizer=keras.optimizers.Adam(),
              metrics=['accuracy'])
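
Note that the custom loss must be built from TensorFlow/backend operations (as in the sketch above) so that Keras can differentiate through it; tf.reduce_min propagates the gradient only through the permutation that achieved the minimum for each sample, which is exactly the behaviour you describe.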

See the Keras documentation on writing custom losses for reference.

Yuri Lifanov