Is Cross Entropy With Softmax Proper for Multi-label Classification?

As mentioned here, cross entropy is not a proper loss function for multi-label classification. My question is: is this also true for cross entropy with softmax? If it is, how can that be reconciled with this part of the documentation?

I should mention that the scope of my question is CNTK.

Fixate answered 17/1, 2017 at 12:55 Comment(0)

Multilabel classification typically means "many binary labels". With that definition in mind, cross entropy with softmax is not appropriate for multilabel classification. The document in the second link you provided talks about multiclass problems, not multilabel problems. Cross entropy with softmax is appropriate for multiclass classification. For multilabel classification, a common choice is the sum of the binary cross entropies of the individual labels. The binary cross entropy can be computed with Logistic in BrainScript or with binary_cross_entropy in Python (a minimal sketch follows below).
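
To make this concrete, here is a minimal sketch of the multilabel setup in CNTK's Python API. The input dimension, label count, and the one-layer model are invented for illustration; the point is the loss wiring with a sigmoid output and binary_cross_entropy:

```python
import cntk as C

# Hypothetical dimensions; only the loss wiring matters here.
input_dim = 100
num_labels = 5

features = C.input_variable(input_dim)
targets = C.input_variable(num_labels)   # multi-hot vector of 0/1 labels

# Sigmoid gives an independent probability per label.
z = C.layers.Dense(num_labels)(features)
probs = C.sigmoid(z)

# Sum of the per-label binary cross entropies.
loss = C.binary_cross_entropy(probs, targets)
```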

If, on the other hand, you have a problem with many multiclass labels, then you can use cross_entropy_with_softmax for each of them and CNTK will automatically sum all these loss values (see the sketch below).
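
As a rough sketch of that case (the label names, class counts, and network shape are made up), each multiclass label gets its own cross_entropy_with_softmax, and the training criterion is their sum:

```python
import cntk as C

# Hypothetical example: two independent multiclass labels,
# e.g. "color" with 3 classes and "shape" with 4 classes.
features = C.input_variable(100)
color = C.input_variable(3)    # one-hot target
shape = C.input_variable(4)    # one-hot target

h = C.layers.Dense(64, activation=C.relu)(features)
color_logits = C.layers.Dense(3)(h)
shape_logits = C.layers.Dense(4)(h)

# One cross entropy with softmax per multiclass label;
# the total loss is their sum.
loss = C.cross_entropy_with_softmax(color_logits, color) \
     + C.cross_entropy_with_softmax(shape_logits, shape)
```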

Preconcerted answered 18/1, 2017 at 6:0 Comment(2)
The title of the linked document is "Train a multilabel classifier", so how can you say "The document you link to talks about multiclass problems, not multilabel problems"?Fixate
You have two links in your question; I was referring to the other one. I have updated the answer to clarify this.Preconcerted
