Get Confusion Matrix From a Keras Multiclass Model [duplicate]
I am building a multiclass model with Keras.

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, batch_size=batch_size, epochs=epochs, verbose=1, callbacks=[checkpoint], validation_data=(X_test, y_test))  # starts training
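
For context, model and checkpoint are not defined above; here is a minimal sketch of the kind of setup those calls imply (every layer size and the checkpoint path are hypothetical, chosen only to match the 3-class shape of y_test):

from keras.models import Sequential
from keras.layers import Embedding, GlobalAveragePooling1D, Dense
from keras.callbacks import ModelCheckpoint

# Hypothetical text model: padded integer sequences in, 3-class softmax out
model = Sequential([
    Embedding(input_dim=10000, output_dim=64),  # vocabulary size is a guess
    GlobalAveragePooling1D(),
    Dense(3, activation='softmax'),             # 3 classes, matching y_test
])

# Hypothetical checkpoint callback saving the best weights during training
checkpoint = ModelCheckpoint('weights.h5', save_best_only=True)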

Here is what my test data looks like (it's text data).

X_test
Out[25]: 
array([[621, 139, 549, ...,   0,   0,   0],
       [621, 139, 543, ...,   0,   0,   0]])

y_test
Out[26]: 
array([[0, 0, 1],
       [0, 1, 0]])

After generating predictions...

predictions = model.predict(X_test)
predictions
Out[27]: 
array([[ 0.29071924,  0.2483743 ,  0.46090645],
       [ 0.29566404,  0.45295066,  0.25138539]], dtype=float32)

I did the following to get the confusion matrix.

y_pred = (predictions > 0.5)

confusion_matrix(y_test, y_pred)
Traceback (most recent call last):

  File "<ipython-input-38-430e012b2078>", line 1, in <module>
    confusion_matrix(y_test, y_pred)

  File "/Users/abrahammathew/anaconda3/lib/python3.6/site-packages/sklearn/metrics/classification.py", line 252, in confusion_matrix
    raise ValueError("%s is not supported" % y_type)

ValueError: multilabel-indicator is not supported

However, it fails with the ValueError shown above.

How can I get a confusion matrix for a multiclass neural network in Keras?

Meristic answered 19/6, 2018 at 4:58

Your input to confusion_matrix must be arrays of integer class labels, not one-hot encodings. Convert both the ground truth and the predictions with argmax:

from sklearn import metrics
matrix = metrics.confusion_matrix(y_test.argmax(axis=1), y_pred.argmax(axis=1))
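
For completeness, a minimal sketch of the same fix end to end, assuming predictions is the raw output of model.predict (the variable names below are illustrative):

from sklearn.metrics import confusion_matrix, classification_report

# Collapse one-hot ground-truth rows to integer class labels
y_true_labels = y_test.argmax(axis=1)

# Take the highest-probability class per row; no 0.5 threshold needed
y_pred_labels = predictions.argmax(axis=1)

print(confusion_matrix(y_true_labels, y_pred_labels))
print(classification_report(y_true_labels, y_pred_labels))  # optional per-class summary

Note that (predictions > 0.5) can produce an all-False row whenever no class clears 0.5 (as in the first example row, where the maximum is about 0.46), so taking argmax of the raw probabilities is the safer conversion.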
Tallow answered 19/6, 2018 at 6:04
In case it's too subtle, this answer clarifies that the question was asked about sklearn.metrics.confusion_matrix(), not tensorflow.math.confusion_matrix(), which might be expected given the keras tag. – Reporter
How can I use this for an image dataset? – Armil
@JakeStevens-Haas Thanks: that is actually a material flaw in the answer; it should be rewritten. – Engird
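
As the first comment points out, the answer above uses sklearn.metrics.confusion_matrix. TensorFlow ships an equivalent that accepts the same integer label arrays; a sketch, assuming the same y_test and predictions as in the question:

import tensorflow as tf

# Same conversion, using TensorFlow's built-in; returns a tf.Tensor
cm = tf.math.confusion_matrix(
    labels=y_test.argmax(axis=1),
    predictions=predictions.argmax(axis=1),
)
print(cm.numpy())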
