Different loss function for validation set in Keras
I have an unbalanced training dataset, which is why I built a custom weighted categorical cross-entropy loss function. But my validation set is balanced, and there I want to use the regular categorical cross-entropy loss. So can I pass a different loss function for the validation set within Keras? I mean, the weighted one for the training set and the regular one for the validation set?

def weighted_loss(y_true, y_pred):
    ...  # weighted categorical cross entropy computed here
    return loss

model.compile(loss=weighted_loss, metrics=['accuracy'])
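For concreteness, the kind of weighting the question describes can be sketched in plain NumPy (the class weights below are hypothetical, and NumPy arrays stand in for Keras backend tensors):

```python
import numpy as np

# Hypothetical per-class weights for an unbalanced 3-class problem:
# the rare class gets a larger weight.
CLASS_WEIGHTS = np.array([0.5, 1.0, 3.0])

def weighted_categorical_crossentropy(y_true, y_pred, weights=CLASS_WEIGHTS, eps=1e-7):
    """Per-sample crossentropy, scaled by the weight of each sample's true class.

    y_true: one-hot labels, shape (batch, n_classes)
    y_pred: predicted probabilities, shape (batch, n_classes)
    """
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    # standard categorical crossentropy per sample
    ce = -np.sum(y_true * np.log(y_pred), axis=-1)
    # weight of the true class for each sample
    sample_weights = np.sum(y_true * weights, axis=-1)
    return ce * sample_weights
```

With all weights set to 1 this reduces to the ordinary categorical crossentropy, which is exactly the quantity the question wants reported on the balanced validation set.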
Joettejoey answered 31/8, 2018 at 2:8 Comment(0)

You can try the backend function K.in_train_phase(), which is used by the Dropout and BatchNormalization layers to implement different behaviors in training and validation.

from keras import backend as K

def custom_loss(y_true, y_pred):
    weighted_loss = ...  # your implementation of weighted crossentropy loss
    unweighted_loss = K.sparse_categorical_crossentropy(y_true, y_pred)
    return K.in_train_phase(weighted_loss, unweighted_loss)

The first argument of K.in_train_phase() is the tensor used in training phase, and the second is the one used in test phase.

For example, if we set weighted_loss to 0 (just to verify the effect of K.in_train_phase() function):

def custom_loss(y_true, y_pred):
    weighted_loss = 0 * K.sparse_categorical_crossentropy(y_true, y_pred)
    unweighted_loss = K.sparse_categorical_crossentropy(y_true, y_pred)
    return K.in_train_phase(weighted_loss, unweighted_loss)

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

model = Sequential([Dense(100, activation='relu', input_shape=(100,)),
                    Dense(1000, activation='softmax')])
model.compile(optimizer='adam', loss=custom_loss)
model.outputs[0]._uses_learning_phase = True  # required if no dropout or batch norm in the model

X = np.random.rand(1000, 100)
y = np.random.randint(1000, size=1000)
model.fit(X, y, validation_split=0.1)

Epoch 1/10
900/900 [==============================] - 1s 868us/step - loss: 0.0000e+00 - val_loss: 6.9438

As you can see, the loss in training phase is indeed the one multiplied by 0.

Note that if there's no dropout or batch norm in your model, you'll need to manually "turn on" the _uses_learning_phase boolean switch, otherwise K.in_train_phase() will have no effect by default.
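To make the selection mechanism concrete, here is a simplified NumPy sketch of the idea (not the actual Keras internals; the class weights are hypothetical): the loss function computes both values and picks one based on a phase flag, just as K.in_train_phase() does with the learning-phase variable:

```python
import numpy as np

def in_train_phase(train_value, test_value, training):
    """Simplified stand-in for K.in_train_phase: return the training-phase
    value when the learning-phase flag is set, the test-phase value otherwise."""
    return train_value if training else test_value

def custom_loss(y_true, y_pred, training, weights, eps=1e-7):
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    # unweighted categorical crossentropy per sample
    ce = -np.sum(y_true * np.log(y_pred), axis=-1)
    # same loss scaled by the weight of each sample's true class
    weighted = ce * np.sum(y_true * weights, axis=-1)
    return in_train_phase(weighted, ce, training)
```

In real Keras the flag is a symbolic learning-phase tensor rather than a Python bool, which is why the `_uses_learning_phase` switch discussed above matters.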

Independence answered 1/9, 2018 at 18:44 Comment(11)
That's what I am looking for. Thank you Yu-Yang. Just one question about the last thing you said, about turning on '_uses_learning_phase'. I think the default value is '0' for test and '1' for training. My model has batch normalization and dropout layers, so do I need to turn it on manually?Joettejoey
And do you mean the 'training' flag? Because I didn't find a '_uses_learning_phase' flag. I mean, in tf.keras.backend.in_train_phase there is only ( x, alt, training=None )Joettejoey
The _uses_learning_phase I've mentioned is a different thing. It's a boolean variable that controls whether the "learning phase" variable (i.e., the one you've mentioned -- 0 for test, and 1 for training) will have any effect in model training. If you have dropout in your model, then you shouldn't need to turn it on manually.Independence
_uses_learning_phase is an internal variable that will be attached to model outputs if there's any component (e.g., dropout, batch norm) that acts differently in training/validation.Independence
I understand. Thank you Yu-YangJoettejoey
Sure. Glad to help.Independence
Sorry, one more last thing. If I don't have a batch normalization or dropout layer and need to turn it on, should I do it by setting it via 'set_variable()'? I didn't find any documentation about this variable in either TensorFlow or KerasJoettejoey
You can choose any output tensor of your model and set its _uses_learning_phase = True, like what I've done in the example in this answer (model.outputs[0]._uses_learning_phase = True). It's an implementation detail so I think it's unlikely that it would be documented anywhere.Independence
Sorry I didn't see that line where you turn it on. My mistake. I understand now. ThanksJoettejoey
You seem so qualified with loss functions. Can you look at my question here and see if you can help me? I would appreciate that. #52270103Joettejoey
@Independence Thanks for this answer. Sir, is there any way we could use the sampled_softmax_loss function in K.in_train_phase()? Because sampled_softmax expects (y_true, inputs), whereas sparse_softmax expects (y_true, y_pred). Is there any way I could implement it?Selfrestraint

The validation loss function is just a metric and is actually not needed for training. It's there because it makes sense to compare against the metric your network is actually optimizing. So you can add any other loss function as a metric during compilation and you'll see it during training.
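As a sketch of this point (plain NumPy standing in for Keras tensors; the batch data is made up): a Keras metric is just a callable taking (y_true, y_pred), so an unweighted crossentropy function can be reported alongside accuracy without affecting what the optimizer minimizes:

```python
import numpy as np

def unweighted_ce(y_true, y_pred, eps=1e-7):
    """Mean categorical crossentropy, the quantity one would report as a metric."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return float(np.mean(-np.sum(y_true * np.log(y_pred), axis=-1)))

def accuracy(y_true, y_pred):
    """Fraction of samples whose argmax prediction matches the label."""
    return float(np.mean(np.argmax(y_true, -1) == np.argmax(y_pred, -1)))

# Keras-style: each metric is just a callable evaluated on (y_true, y_pred).
metrics = [accuracy, unweighted_ce]
y_true = np.array([[0., 1.], [1., 0.]])
y_pred = np.array([[0.2, 0.8], [0.6, 0.4]])
report = {m.__name__: m(y_true, y_pred) for m in metrics}
```

In actual Keras this corresponds to passing the extra callable in the `metrics` list of `model.compile`, so it is evaluated on both the training and validation sets at each epoch.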

Johen answered 31/8, 2018 at 8:19 Comment(4)
I know the value reported as validation loss at the end of each epoch is just for monitoring purposes and to see how good your model is. But when the validation set is balanced, that means the validation loss value reported at each epoch is the wrong number to look at when tuning the model, because it is based on the unbalanced training weighting. Am I right? And I don't understand when you say I can add any other loss function as a metric. Can you explain more? I need a loss function that has different weights in training than in validation.Joettejoey
Sounds right to me. As for the metric: Keras model.compile has a metrics parameter in which you can pass metric functions like accuracy. Those metrics will be evaluated at epoch end on both the training and validation sets. So you can add your custom weighted loss function using different weights. If this isn't possible, please show some code on how you pass your custom loss function as the model loss function.Johen
I modified the post to include simple code. I think I understand what you mean: passing the normal categorical cross-entropy loss as a metric in order to report an accurate validation loss value. But then what about the accuracy metric that I want for model evaluation? Can I pass two metrics for evaluation?Joettejoey
Yes, you can pass an array of metrics with as many as you want.Johen
