While training a U-Net, the dice coefficient and IoU sometimes become greater than 1 (with IoU > dice); then, after several batches, they return to normal, as shown in the picture.
I have defined them as follows:
from tensorflow.keras import backend as K

def dice_coef(y_true, y_pred, smooth=1):
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    return (2. * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)

def iou(y_true, y_pred, smooth=1):
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    union = K.sum(y_true_f) + K.sum(y_pred_f) - intersection
    return (intersection + smooth) / (union + smooth)

def dice_loss(y_true, y_pred):
    return 1. - dice_coef(y_true, y_pred)
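To show how these formulas can exceed 1, here is a NumPy sketch of the same dice computation (an illustration, not the Keras code itself; `K.flatten`/`K.sum` are replaced by their NumPy analogues). If the ground-truth mask is not normalized, e.g. contains 255 instead of 1, the intersection term is scaled up while the prediction sum is not, so the ratio can blow past 1:

```python
import numpy as np

def dice_coef_np(y_true, y_pred, smooth=1):
    # Same formula as the Keras dice_coef above, in NumPy.
    y_true_f = y_true.ravel()
    y_pred_f = y_pred.ravel()
    intersection = np.sum(y_true_f * y_pred_f)
    return (2. * intersection + smooth) / (np.sum(y_true_f) + np.sum(y_pred_f) + smooth)

y_pred = np.array([0.9, 0.8, 0.1, 0.2])     # sigmoid-style outputs in (0, 1)

y_true01 = np.array([1., 1., 0., 0.])       # normalized mask: dice stays <= 1
d_normalized = dice_coef_np(y_true01, y_pred)

y_true255 = y_true01 * 255.                 # 8-bit mask values: dice can exceed 1
d_unnormalized = dice_coef_np(y_true255, y_pred)

print(d_normalized, d_unnormalized)
```

With the normalized mask the result is 0.88, while the 0/255 mask pushes it above 1.6, matching the symptom described above.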
I have tried adding K.abs() to y_pred, but that results in worse performance. Since the output is sigmoid-activated, shouldn't adding K.abs() or not give the same result? Also, as you can see, my accuracy is strange; I have been relying on dice to judge model performance, so it would be great if someone could point out the issue.
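The intuition about the sigmoid is correct: a standard sigmoid outputs values strictly in (0, 1), so applying abs() to it is an identity operation and cannot change the metric values. A quick NumPy check (illustrative only):

```python
import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid: always maps to (0, 1).
    return 1. / (1. + np.exp(-x))

logits = np.array([-10., -1., 0., 1., 10.])
probs = sigmoid(logits)

all_positive = bool(np.all(probs > 0) and np.all(probs < 1))
abs_is_noop = bool(np.array_equal(np.abs(probs), probs))
print(all_positive, abs_is_noop)
```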
Comments:
Are your y_true images really between 0 and 1? Check this. – Ferrocene
Show model.summary(), the definition of your layers (at least the final layers), and the measured shapes and ranges of y_true. (I know you said it's ok, but sometimes we get mistaken, every one of us :p) – Ferrocene
When I checked the y_true values they were not normalized, so just make sure you normalize your labels. – Dictate
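Following Dictate's point, a minimal sketch of the normalization fix, assuming the masks are loaded as 8-bit images with values 0/255 (the exact loading code is not shown in the question):

```python
import numpy as np

# Hypothetical mask as loaded from an 8-bit image file.
y_true = np.array([[0, 255], [255, 0]], dtype=np.float32)

# Scale to {0, 1} before feeding it to the dice/IoU metrics.
y_true = y_true / 255.0

print(np.unique(y_true))
```

For masks with antialiased edges, thresholding such as `(y_true > 127).astype("float32")` may be safer than plain division.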