Semantic Segmentation Loss Functions

Does it make sense to combine cross-entropy loss and dice-score in a weighted fashion for a binary segmentation problem?

Optimizing the dice-score produces over-segmented regions, while cross-entropy loss produces under-segmented regions for my application.

Joellejoellen answered 24/5, 2018 at 20:24 Comment(1)
In case someone needs a binary segmentation toy problem: osf.io/snb6p – Pincus

I suppose there's no harm in combining the two losses, as they are quite "orthogonal" to each other: while cross-entropy treats every pixel as an independent prediction, the dice-score looks at the resulting mask in a more "holistic" way.
Moreover, considering that these two losses yield significantly different masks, each with its own merits and errors, combining this complementary information should be beneficial.
Make sure you weight the losses such that the gradients from the two losses are roughly on the same scale, so you can benefit equally from both.
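
For reference, here is a minimal sketch of such a combined loss. The question doesn't name a framework, so this assumes PyTorch; the alpha weight and smooth constant are illustrative choices, and targets are assumed to be float masks with the same shape as the logits:

    import torch
    import torch.nn.functional as F

    def dice_loss(logits, targets, smooth=1.0):
        # Soft Dice loss computed on sigmoid probabilities for binary masks.
        probs = torch.sigmoid(logits).reshape(logits.size(0), -1)
        targets = targets.reshape(targets.size(0), -1)
        intersection = (probs * targets).sum(dim=1)
        union = probs.sum(dim=1) + targets.sum(dim=1)
        # 1 - Dice coefficient, averaged over the batch; `smooth` avoids 0/0.
        return 1.0 - ((2.0 * intersection + smooth) / (union + smooth)).mean()

    def combined_loss(logits, targets, alpha=0.5):
        # Weighted sum: alpha * BCE + (1 - alpha) * soft Dice.
        bce = F.binary_cross_entropy_with_logits(logits, targets)
        return alpha * bce + (1.0 - alpha) * dice_loss(logits, targets)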

If you make it work, I'd be interested to hear about your experiments and conclusions ;)

Moser answered 31/5, 2018 at 6:47 Comment(3)
How would one scale the two losses so that their gradients are on the same scale? I am doing something similar. I just passed in a class_weights vector based on the imbalance of the dataset, so the weighted loss becomes something like weight*(BCE-DICE), where the weight changes for every class. But is there a better way to do it? – Bacon
How do you make sure the losses are weighted such that the gradients from the two losses are roughly on the same scale, assuming loss = alpha * bce + beta * dice? – Welladvised
Hi @Shai, what do you mean when you say the loss functions are "orthogonal"? Is that to do with gradient flow, because you assume they each guide the network to learn different, independent features (for example, dice loss would train the network to learn holistic structure, while the entropy loss would aim for pixel-wise consistency)? – Unlive
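
Regarding the gradient-scale question in the comments: one hypothetical way to pick the weights is to measure the gradient norm each loss produces on its own and scale one term so both contribute at a comparable magnitude. This is a sketch under stated assumptions, not an established recipe; model, images, and targets are placeholders for your own setup, and dice_loss is the helper from the sketch above:

    import torch
    import torch.nn.functional as F

    def grad_norm(loss, model):
        # L2 norm of d(loss)/d(parameters); retain_graph so the graph can be
        # reused for the second loss and for the final backward pass.
        grads = torch.autograd.grad(loss, model.parameters(), retain_graph=True)
        return torch.sqrt(sum(g.pow(2).sum() for g in grads))

    # `model`, `images`, `targets` are placeholders, not names from the thread.
    logits = model(images)
    bce = F.binary_cross_entropy_with_logits(logits, targets)
    dice = dice_loss(logits, targets)  # helper from the sketch above

    # Scale the dice term so its gradient magnitude matches the BCE term's;
    # detach the ratio so it is treated as a constant, not differentiated through.
    beta = (grad_norm(bce, model) / grad_norm(dice, model)).detach()
    loss = bce + beta * dice
    loss.backward()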
