What is a Confusion Matrix in Machine Learning

Make the Confusion Matrix Less Confusing.

A confusion matrix is a technique for summarizing the performance of a classification algorithm.

Classification accuracy alone can be misleading if you have an unequal number of observations in each class or if you have more than two classes in your dataset.

Calculating a confusion matrix can give you a better idea of what your classification model is getting right and what types of errors it is making.

In this post, you will discover the confusion matrix for use in machine learning.

After reading this post you will know:

  • What the confusion matrix is and why you need to use it.
  • How to calculate a confusion matrix for a 2-class classification problem from scratch.
  • How to create a confusion matrix in Weka, Python and R.

Kick-start your project with my new book Machine Learning Algorithms From Scratch, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started.

  • Update Oct/2017: Fixed a small bug in the worked example (thanks Raktim).
  • Update Dec/2017: Fixed a small bug in accuracy calculation (thanks Robson Pastor Alexandre).
What is a Confusion Matrix in Machine Learning. Photo by Maximiliano Kolus, some rights reserved.

Classification Accuracy and its Limitations

Classification accuracy is the ratio of correct predictions to total predictions made.

It is often presented as a percentage by multiplying the result by 100.

Classification accuracy can also easily be turned into a misclassification rate or error rate by inverting the value, such as:
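    error rate = (1 - (correct predictions / total predictions)) * 100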

Classification accuracy is a great place to start, but often encounters problems in practice.

The main problem with classification accuracy is that it hides the detail you need to better understand the performance of your classification model. There are two examples where you are most likely to encounter this problem:

  1. When your data has more than 2 classes. With 3 or more classes you may get a classification accuracy of 80%, but you don’t know if that is because all classes are being predicted equally well or whether one or two classes are being neglected by the model.
  2. When your data does not have an even number of observations in each class. You may achieve an accuracy of 90% or more, but this is not a good score if 90 records in every 100 belong to one class, because you can achieve this score by always predicting the most common class value.

Classification accuracy can hide the detail you need to diagnose the performance of your model. But thankfully we can tease apart this detail by using a confusion matrix.

What is a Confusion Matrix?

A confusion matrix is a summary of prediction results on a classification problem.

The numbers of correct and incorrect predictions are summarized with count values and broken down by each class. This is the key to the confusion matrix.

The confusion matrix shows the ways in which your classification model is confused when it makes predictions.

It gives you insight not only into the errors being made by your classifier but more importantly the types of errors that are being made.

It is this breakdown that overcomes the limitation of using classification accuracy alone.

How to Calculate a Confusion Matrix

Below is the process for calculating a confusion matrix.

  1. You need a test dataset or a validation dataset with expected outcome values.
  2. Make a prediction for each row in your test dataset.
  3. From the expected outcomes and predictions count:
    1. The number of correct predictions for each class.
    2. The number of incorrect predictions for each class, organized by the class that was predicted.

These numbers are then organized into a table, or a matrix as follows:

  • Predicted down the side: Each row of the matrix corresponds to a predicted class.
  • Expected across the top: Each column of the matrix corresponds to an actual (expected) class.

The counts of correct and incorrect classifications are then filled into the table.

The total number of correct predictions for a class goes into the row for that predicted class value and the column for the matching expected class value; these counts fall on the diagonal of the matrix.

In the same way, the total number of incorrect predictions for a class goes into the column for that expected class value and the row for the (different) class value that was actually predicted.

In practice, a binary classifier such as this one can make two types of errors: it can incorrectly assign an individual who defaults to the no default category, or it can incorrectly assign an individual who does not default to the default category. It is often of interest to determine which of these two types of errors are being made. A confusion matrix […] is a convenient way to display this information.

— Page 145, An Introduction to Statistical Learning: with Applications in R, 2014

This matrix is easiest to understand for 2-class problems, but it applies just as well to problems with 3 or more class values, by adding more rows and columns to the confusion matrix.

Let’s make this explanation of creating a confusion matrix concrete with an example.

2-Class Confusion Matrix Case Study

Let’s pretend we have a two-class classification problem of predicting whether a photograph contains a man or a woman.

We have a test dataset of 10 records with expected outcomes and a set of predictions from our classification algorithm.
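For illustration, assume the 10 expected values and predictions came out as follows (a contrived set of records chosen to match the counts worked through below; any ordering with the same counts gives the same matrix):

    Expected    Predicted
    man         woman
    man         man
    woman       woman
    man         man
    woman       man
    man         man
    woman       woman
    woman       woman
    man         woman
    woman       woman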

Let’s start off and calculate the classification accuracy for this set of predictions.
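Accuracy is the count of correct predictions divided by the total predictions made, multiplied by 100 to give a percentage:

    accuracy = total correct predictions / total predictions made * 100
    accuracy = 7 / 10 * 100
    accuracy = 70%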

The algorithm made 7 of the 10 predictions correct with an accuracy of 70%.

But what type of errors were made?

Let’s turn our results into a confusion matrix.

First, we must calculate the number of correct predictions for each class.
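Using the example records above:

    men classified as men: 3
    women classified as women: 4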

Now, we can calculate the number of incorrect predictions for each class, organized by the predicted value.
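Again using the example records above:

    men classified as women: 2
    women classified as men: 1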

We can now arrange these values into the 2-class confusion matrix:
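With predictions down the side (rows) and expected values across the top (columns), this gives:

             men   women
    men       3      1
    women     2      4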

We can learn a lot from this table.

  • The total actual men in the dataset is the sum of the values in the men column (3 + 2).
  • The total actual women in the dataset is the sum of the values in the women column (1 + 4).
  • The correct values are organized on a diagonal line from top-left to bottom-right of the matrix (3 + 4).
  • More errors were made by predicting men as women than predicting women as men.
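To make the counting procedure concrete, here is a minimal from-scratch sketch (assuming Python; the function name is illustrative rather than from any library):

    # A minimal from-scratch confusion matrix for lists of class labels.
    def confusion_matrix_scratch(expected, predicted):
        labels = sorted(set(expected) | set(predicted))
        # matrix[predicted_class][expected_class] -> count, matching the
        # predictions-down-the-side, expected-across-the-top layout above.
        matrix = {p: {e: 0 for e in labels} for p in labels}
        for e, p in zip(expected, predicted):
            matrix[p][e] += 1
        return matrix

    expected = ['man', 'man', 'woman', 'man', 'woman',
                'man', 'woman', 'woman', 'man', 'woman']
    predicted = ['woman', 'man', 'woman', 'man', 'man',
                 'man', 'woman', 'woman', 'woman', 'woman']

    for row, counts in confusion_matrix_scratch(expected, predicted).items():
        print(row, counts)
    # man {'man': 3, 'woman': 1}
    # woman {'man': 2, 'woman': 4}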

Two-Class Problems Are Special

In a two-class problem, we are often looking to discriminate between observations with a specific outcome and normal observations: for example, a disease state or event versus no disease state or no event.

In this way, we can assign the event class as “positive” and the no-event class as “negative”. A prediction is then “true” when it matches the expected value and “false” when it does not.

This gives us:

  • “true positive” (TP): event values correctly predicted as events.
  • “false positive” (FP): no-event values incorrectly predicted as events.
  • “true negative” (TN): no-event values correctly predicted as no-events.
  • “false negative” (FN): event values incorrectly predicted as no-events.

We can summarize this in the confusion matrix as follows:
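Keeping the same layout as before, with predictions down the side and expected values across the top:

                 event             no-event
    event        true positive     false positive
    no-event     false negative    true negative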

This can help in calculating more advanced classification metrics such as the precision, recall (also known as sensitivity) and specificity of our classifier.

For example, classification accuracy is calculated as the sum of the true positives and true negatives divided by the total number of predictions made.
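In terms of the four counts:

    accuracy = (TP + TN) / (TP + TN + FP + FN)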

Consider the case where there are two classes. […] The top row of the table corresponds to samples predicted to be events. Some are predicted correctly (the true positives, or TP) while others are inaccurately classified (false positives or FP). Similarly, the second row contains the predicted negatives with true negatives (TN) and false negatives (FN).

— Page 256, Applied Predictive Modeling, 2013

Now that we have worked through a simple 2-class confusion matrix case study, let’s see how we might calculate a confusion matrix in modern machine learning tools.

Code Examples of the Confusion Matrix

This section provides some examples of confusion matrices using top machine learning platforms.

These examples will give you a context for what you have learned about the confusion matrix for when you use them in practice with real data and tools.

Example Confusion Matrix in Weka

The Weka machine learning workbench will display a confusion matrix automatically when estimating the skill of a model in the Explorer interface.

Below is a screenshot from the Weka Explorer interface after training a k-nearest neighbor algorithm on the Pima Indians Diabetes dataset.

The confusion matrix is listed at the bottom, and you can see that a wealth of classification statistics are also presented.

The confusion matrix assigns the letters a and b to the class values and shows expected class values in the rows and predicted class values (“classified as”) in the columns. Note that this is the transpose of the layout used earlier in this post; both orientations are in common use, so always check how the rows and columns are labeled.

Weka Confusion Matrix and Classification Statistics

You can learn more about the Weka Machine Learning Workbench here.

Example Confusion Matrix in Python with scikit-learn

The scikit-learn library for machine learning in Python can calculate a confusion matrix.

Given an array or list of expected values and a list of predictions from your machine learning model, the confusion_matrix() function will calculate a confusion matrix and return the result as an array. You can then print this array and interpret the results.
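Below is a minimal sketch of that usage; the two contrived lists of 0/1 values are assumptions chosen for illustration:

    # Example of calculating a confusion matrix with scikit-learn.
    from sklearn.metrics import confusion_matrix

    expected = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0]   # ground-truth class labels
    predicted = [1, 0, 0, 1, 0, 0, 1, 1, 1, 0]  # model predictions

    # scikit-learn places actual classes in rows and predicted classes in
    # columns, with class labels sorted in ascending order (0, then 1).
    results = confusion_matrix(expected, predicted)
    print(results)
    # [[4 2]
    #  [1 3]]

Note that scikit-learn puts expected values in the rows, the transpose of the layout used earlier in this post.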

Running this example prints the confusion matrix array summarizing the results for the contrived 2-class problem.

Learn more about the confusion_matrix() function in the scikit-learn API documentation.

Example Confusion Matrix in R with caret

The caret library for machine learning in R can calculate a confusion matrix.

Given a list of expected values and a list of predictions from your machine learning model, the confusionMatrix() function will calculate a confusion matrix and return the result as a detailed report. You can then print this report and interpret the results.
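Below is a minimal sketch, assuming the same contrived 0/1 values as the Python example above; note that confusionMatrix() expects factor vectors:

    # Example of calculating a confusion matrix with caret.
    library(caret)

    expected <- factor(c(1, 1, 0, 1, 0, 0, 1, 0, 0, 0))
    predicted <- factor(c(1, 0, 0, 1, 0, 0, 1, 1, 1, 0))

    # Returns the confusion matrix plus accuracy, sensitivity, specificity
    # and other statistics; by default the first factor level ("0") is
    # treated as the positive class.
    results <- confusionMatrix(data = predicted, reference = expected)
    print(results)

If you want class 1 treated as the event of interest instead, pass positive = "1" to confusionMatrix().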

Running this example calculates a confusion matrix report and related statistics and prints the results.

There is a wealth of information in this report, not least the confusion matrix itself.

Learn more about the confusionMatrix() function in the caret API documentation [PDF].


Summary

In this post, you discovered the confusion matrix for machine learning.

Specifically, you learned about:

  • The limitations of classification accuracy and when it can hide important details.
  • The confusion matrix and how to calculate it from scratch and interpret the results.
  • How to calculate a confusion matrix with Weka, the Python scikit-learn library and the R caret package.

Do you have any questions?
Ask your questions in the comments below and I will do my best to answer them.


159 Responses to What is a Confusion Matrix in Machine Learning

  1. Vinay November 18, 2016 at 9:42 pm #

    Nice example. I have two single dimensional array:one is predicted and other is expected. It is not a binary classification. It is a five class classification problem. How to compute confusion matrix and true positive, true negative, false positive, false negative.

    • Jason Brownlee November 19, 2016 at 8:47 am #

      Hi Vinay, you can extrapolate from the examples above.

    • Avinash November 3, 2018 at 3:16 am #

      Hey Vinay did you got the solution for the problem ?? I’m facing the similar problem right now.

    • zinash April 11, 2024 at 2:33 am #

      hi it is very helpful explanation for me.my question is the confusion matrix predicted and actual value placement different for WEKA and python?

  2. Shai March 19, 2017 at 7:19 pm #

    Nice , very good explanation.

  3. Ananya Mohapatra March 31, 2017 at 9:45 pm #

    hello sir,
    Can we implement confusion matrix in multi-class neural network program using K-fold cross validation??

    • Jason Brownlee April 1, 2017 at 5:55 am #

      Yes, but you would have one matrix for each fold of your cross validation.

      It would be better method for a train/test split.

  4. pakperchum May 3, 2017 at 2:56 pm #

    Using classification Learner app of MATLAB and I obtained the confusion matrix, Can I show the classification results in image? how? Please guide

  5. shafaq May 3, 2017 at 2:58 pm #

    Using Weka and Tanagra, naive Bayes classification leads to a confusion matrix, How I can show the classification results in the form of image instead of confusion matrix?
    Guide please

  6. Shafaq May 6, 2017 at 3:40 pm #

    “Lena” noisy image taken as base on which noise detection feature applied after that matrix of features passed as training set. Now I want to take output in the form of image (Lena) but Tanagra and weka shows confusion matrix or ROC curve (can show scatter plot) through naive Bayes classification. Help plz

  7. cc May 8, 2017 at 8:50 pm #

    how to write confusion matrix for n image in one table

    • Jason Brownlee May 9, 2017 at 7:41 am #

      You have one row/column for each class, not each input (e.g. each image).

  8. Giorgos May 20, 2017 at 7:11 am #

    Hello Jason, I have a 3 and a 4 class problem, and I have made their confusion matrix but I cant understand which of the cells represents the true positive,false positive,false negative, in the binary class problem its more easy to understand it, can you help me?

  9. Amanze Chibuike May 28, 2017 at 7:16 am #

    I need a mathematical model for fraud detection.

  10. Nathan June 20, 2017 at 2:37 am #

    Jason Brownlee. very poor answer

    • Jason Brownlee June 20, 2017 at 6:40 am #

      Which answer and how so Nathan?

      • Anthony The Koala February 11, 2018 at 8:52 pm #

        Dear Dr Jason,
        I fully agree with you. These resources on this website are like ‘bare bones’. It is up to you to apply the model. The general concept of a confusion matrix is summarized in “2 class confusion matrix case study”, particularly the table at the end of the section. Follow from the beginning of the section.

        Since this is a 2 class confusion matrix, you have “fraud”/ “non-fraud” rows and columns instead of “men”/”women” rows and columns.

        There is a page at http://web.stanford.edu/~rjohari/teaching/notes/226_lecture8_prediction.pdf which talks about fraud detection and spam detection. Is it the bees-knees of study? I cannot comment but suffice to say don’t expect a fully exhaustive discussion of all the minutiae on webpages/blogs

        In addition, even though I have Dr Jason’s book “Machine Learning from Scratch”, I always seek ideas from this webpage.

        Anthony from exciting Belfield

  11. ALTAFF July 8, 2017 at 2:11 pm #

    nice explanation

  12. Sai July 18, 2017 at 5:25 am #

    Hi! Thank you for the great post!
    I have one doubt though……….For the 2 class problem, where you discussed about false positives etc should’nt false positive be the entry below true positive in the matrix?

  13. elahe August 16, 2017 at 4:53 pm #

    hi
    Is the confusion matrix defined only for nominal variables?

  14. Andre September 5, 2017 at 9:38 am #

    Is there anything like a confusion matrix also available for regression.
    There are deviations there too.

    • Jason Brownlee September 7, 2017 at 12:40 pm #

      No. You could look at the variance of the predictions.

  15. Chandana September 25, 2017 at 9:01 pm #

    Hi,
    I hope to get a reply soon. How do we compute confusion matrix for the multilabel multiclass classification case? Please give an example.
    As far as I understand:
    If
    y_pred = [1,1,0,0] and y_true = [0,0,1,1]; the confusion matrix is:

    C1 C2 C3 C4
    C1 0 0 0 0
    C2 0 0 0 0
    C3 1 1 0 0
    C4 1 1 0 0

    Is that right? If so, why is this a correct way to compute it (since we don’t know if class-4 is confused with class 1 or class 2, Same goes with the case of class-3)?

  16. Raktim October 21, 2017 at 11:52 pm #

    Hi Dr. Brownlee,
    In your given confusion matrix, False Positive and False Negative has become opposite. I got really confused by seeing that confusion matrix. Event that incorrectly predicted as no event should be False Negative on the other hand no-event that incorrectly predicted as event should be False Positive. Thats what I have learnt from the following reference.

    Waiting for your explanation.

    Reference: http://www.dataschool.io/simple-guide-to-confusion-matrix-terminology/
    Youtube Video: https://www.youtube.com/watch?v=4Xw19NpQCGA
    Wikipedia: https://en.wikipedia.org/wiki/Confusion_matrix

  17. Raktim October 23, 2017 at 12:14 am #

    Hi Dr Brownlee,

    “We can summarize this in the confusion matrix as follows:”

    After the above line the table is still there and showing the FP and FN in opposite way.

    Regards,
    Raktim

  18. Raktim October 26, 2017 at 12:23 am #

    Dear Sir,

    Will you please look at this because wiki has written opposite way? Therefore your table does not match.

    https://drive.google.com/open?id=0B8RkeH8XSyArWldzdjFGYW1teTA

  19. Robson Pastor Alexandre December 6, 2017 at 1:59 am #

    There’s an error in the accuracy’s formula.
    It’s:
    accuracy = 7 / 10 * 100

    Instead of:
    accuracy = 7 / 100 * 100

  20. Vishnu Priya January 28, 2018 at 10:04 pm #

    Hello!Could you please explain how to find parameters for multiclass confusion matrix like 3*3 order or more?

    • Jason Brownlee January 29, 2018 at 8:16 am #

      Sorry, what do you mean find parameters for a confusion matrix?

  21. Jemz February 21, 2018 at 1:00 pm #

    Could you please explain why confusion matrix is better than other for the evaluation model classification, especially for Naive Bayes. thankyou

    • Jason Brownlee February 22, 2018 at 11:14 am #

      It may be no better or worse, just another way to review model skill.

  22. alvi February 21, 2018 at 1:19 pm #

    Could you please explain why confusion matrix is good or recommended for evalution model ?

    • Jason Brownlee February 22, 2018 at 11:15 am #

      It can help you see the types of errors made by the model when making predictions. E.g. Class A is mostly predicted as class B instead of class C.

  23. Mukrimah March 13, 2018 at 1:02 pm #

    Hi Sir Jason Brownlee

    Do you have example of source code (java) for multi-class to calculate confusion matrix?
    Let say i have 4 class(dos, normal,worms,shellcode) then i want to make a confusion matrix where usually diagonal is true positive value. Accuracy by class(dos)= predicted dos/actual dos and so on then later on accuracy= all the diagonal (tp value)/ total number of instances

  24. Krishnaprasad Challuru March 17, 2018 at 10:48 pm #

    Concepts explained well but in the example, it is wrongly computed:

    Sensitivity should be = TPR = TP/(TP+FN) = 3/(3+2) = 0.6 and
    Specificity should be = TNR = TN/(TN+FP) = 4/(4+1) = 0.8.

    However Sensitivity is wrongly computed as 0.06667 and Specificity is wrongly computed as 0.75.

    • Jason Brownlee March 18, 2018 at 6:04 am #

      I do not believe there is a bug in the R implementation.

      • Luc G January 30, 2019 at 5:56 pm #

        If the ‘event’ is 1, then it should be:

        Sensitivity = TPR = TP/(TP+FN) = 3/(3+1) = 0.75 and
        Specificity = TNR = TN/(TN+FP) = 4/(4+2) = 0.06667

        The confusion comes because the ”Positive’ Class : 0′ in the R code. The ‘event’ should be specified in the command:

        results <- confusionMatrix(data=predicted, reference=expected, positive='1')

        In Python, you can use this code to find the values to put in the above formulas:

        tn, fp, fn, tp = confusion_matrix(expected, predicted).ravel()
        (tn, fp, fn, tp)

  25. Nipa March 23, 2018 at 5:43 pm #

    hi! i am working on a binary classification problem but the confusion matrix i am getting is something like
    [12, 0, 0],
    [ 1, 16, 0],
    [ 0, 7, 0]
    I don’t understand what does the 7 mean? can you please explain?
    N.B. It should be
    [13, 0],
    [0, 23]

  26. Nipa March 26, 2018 at 4:26 pm #

    Actually there is no bug in the code. The code works fine with other datasets.

    So I changed the target vector of the dataset from 2 to 3 and it works better now but the problem remains the same.

    Now it looks like this:
    [[17, 0, 0, 0],
    [ 0, 12, 0, 0],
    [ 0, 0, 8, 0],
    [ 0, 0, 0, 2]]
    Is it because the ANN could not link the 2 values (4th row) with any of the other classes?

  27. iamai May 31, 2018 at 6:24 am #

    There is a typo mistake:

    If
    men classified as women: 2
    woman classified as men: 1

    How can confusion matrix be:
    men women
    men 3 1
    women 2 4

    The correction:
    men classified as women: 1
    woman classified as men: 2

    • Jason Brownlee May 31, 2018 at 6:31 am #

      I believe it is correct, remember that columns are actual and rows are predicted.

      • Lindsay Peters July 18, 2018 at 2:27 pm #

        Weka seems to do the opposite. if you do a simple J48 classification on the Iris tutorial data, you get the following
        a b c <– classified as
        49 1 0 | a = Iris-setosa
        0 47 3 | b = Iris-versicolor
        0 2 48 | c = Iris-virginica
        where we know that there are actually 50 of each type. So for Weka's confusion matrix, the actual count is the sum of entries in a row, not a column. So I'm still confused!

        • Jason Brownlee July 18, 2018 at 2:49 pm #

          The meaning is the same if the matrix is transposed. It is all about explaining what types of errors were made.

          Does that help?

          • Lindsay Peters July 20, 2018 at 10:41 am #

            Yes that helps, thanks. Confirms that for the Weka confusion matrix, columns are predicted and rows are actual – the transpose of the definition you are using, as you point out. I hadn’t realised that both formats are in common use.

  28. hafez amad June 7, 2018 at 10:08 pm #

    thank you man! simple explanation

  29. Ibrar hussain July 18, 2018 at 4:37 pm #

    hy

    i am using Weka tool and apply DecisionTable model and get following confusion matrix

    any one Label it as a TP, TN, FP and FN

    Please help me

  30. Bilal Süt August 2, 2018 at 11:16 pm #

    Thank you for these website, i am an intern my superiors gave me some tasks about machine learning and a.ı and your web site helped me very well thanks a lot Jason

  31. Varad Pimpalkhute September 26, 2018 at 9:18 pm #

    Hi, can confusion matrix be used for a large dataset of images?

    • Jason Brownlee September 27, 2018 at 6:00 am #

      A confusion matrix summarizes the class outputs, not the images.

      It can be used for binary or multi-class classification problems.

  32. S.Khan November 18, 2018 at 3:11 am #

    hi Sir

    Amazing information

    Sir is there any machine learning method with which I can do analysis of Survey results.

    • Jason Brownlee November 18, 2018 at 6:43 am #

Yes, start with a question you have about the data, then use the data and models to answer it.

  33. srivalli November 28, 2018 at 5:04 am #

    Very nice document , really useful for creating the test case.

  34. Doaa Mohammed December 24, 2018 at 12:55 am #

    Hi there, I need help.. I’m using weka and the spam base data set from UCI, and the used one of the meta classifiers which is the stacking classifier; which gave 60.59 % accuracy, but the essiue is the true positive TP and the false positive were 0.
    What does it mean?

    • Jason Brownlee December 24, 2018 at 5:30 am #

      Perhaps try other methods?
      Perhaps try transforming the data prior to modeling?
      Perhaps try alternate configurations of your algorithm?

  35. Anam March 7, 2019 at 3:24 am #

    Dear Jason, Thanks for an informative article.I have a query that in the given confusion matrix 0 value in FP cell is acceptable or not?

    [[ 8 9]
    [ 0 15]]

    Thanks in advance.

    • Jason Brownlee March 7, 2019 at 6:56 am #

      It depends on the goals of your project.

      • Baraka October 19, 2020 at 4:16 pm #

        please i have a question i run a code for classification problem
        i found good accuracy in training and testing data
        i use the confusion matrix but the clasification i found in confusion matrix for classification the number is few than the number of my dataset
        m question why? i should find the number of my sample in the confusion matrix by then determine the acual ad predicted value please respond me im confuse
        to know the your classification what we us e

        • Jason Brownlee October 20, 2020 at 6:22 am #

          Well done!

          The total count in the confusion matrix will match the total number of rows in the test set. If this is not the case, ensure you counted correctly in both cases.

  36. pRANGYA March 29, 2019 at 1:25 am #

    Hi Jason,

    It will be great if you could interpret the confusionMatrix() i.e.the below parameters.

    Accuracy : 0.7
    95% CI : (0.3475, 0.9333)
    No Information Rate : 0.6
    P-Value [Acc > NIR] : 0.3823

    Kappa : 0.4
    Mcnemar’s Test P-Value : 1.0000

    Sensitivity : 0.6667
    Specificity : 0.7500
    Pos Pred Value : 0.8000
    Neg Pred Value : 0.6000
    Prevalence : 0.6000
    Detection Rate : 0.4000
    Detection Prevalence : 0.5000
    Balanced Accuracy : 0.7083

    ‘Positive’ Class : 0

    • Jason Brownlee March 29, 2019 at 8:39 am #

      What problem are you having interpreting it yourself exactly?

  37. himagaran April 26, 2019 at 2:59 am #

    hello how can i visualize the confusion matrix info displayed in weka results, is it possible to generate the diagram just like python?

    • Jason Brownlee April 26, 2019 at 8:36 am #

      Weka will generate an ASCII confusion matrix that you can copy paste into your document.

  38. Aniket June 15, 2019 at 12:57 pm #

    Hi,
    What are counters in confusion matrix?

    • Jason Brownlee June 16, 2019 at 7:08 am #

      They are the count of the number of samples classified as each class.

      Does that help?

  39. Elshrif July 7, 2019 at 3:11 pm #

    Hi,
    If the dataset containing as positive and negative reviews. Can we identify Fake Positive Reviews Rate, Fake Negative Reviews Rate, Real Positive Reviews Rate and Real Negative Reviews Rate using a confusion matrix after applying sentiment classification algorithms on a dataset?

    • Jason Brownlee July 8, 2019 at 8:38 am #

      Yes, you could train a model to classify a given review as real or fake – whatever that means.

  40. subhash August 20, 2019 at 3:02 pm #

    can we change the positive class to 1 instead of 0 in confusion matrix

    • Jason Brownlee August 21, 2019 at 6:33 am #

      Sure, you can present the data any way you wish.

  41. Mike Kelly November 12, 2019 at 1:28 pm #

    It seems that there is no standard on how the predicted vs. reference values are represented in the rows and columns in the matrix. The carat docs and wikipedia have reference in the columns whereas many blogs show the opposite. I guess it doesn’t matter as long as you know what the library is doing. In your article though, you state:

    Expected down the side: Each row of the matrix corresponds to a predicted class.
    Predicted across the top: Each column of the matrix corresponds to an actual class.

    Is that correct? Shouldn’t it say:

    Expected down the side: Each row of the matrix corresponds to an actual class.
    Predicted across the top: Each column of the matrix corresponds to a predicted class.

    • Jason Brownlee November 12, 2019 at 2:08 pm #

      Yes, I have seen both ways and very angry people argue both sides. As long as it’s labeled, it’s okay by me.

  42. rahul December 7, 2019 at 11:41 pm #

    hi sir, thank you for such a wonderful explanation.
    but I have one doubt in between precision and recall.
    can you please explain me in any general example

  43. PRADEEP PANICKER February 15, 2020 at 6:29 pm #

    Explanation of CONFUSION MATRIX – So simply done !!! Beautiful !!! SUPERB !!!

    Thanks a LOT – since this is the basis to understand further MODEL PERFORMANCES.

    Please advise if you have similar literature on MODEL PERFORMAMNCES ??

  44. Rick Garibay February 19, 2020 at 9:03 am #

    Great article, thank you.

    In the Python confusion matrix example, you pass in the expected array followed by the predictions array:

    results = confusion_matrix(expected, predicted)

    When I was reviewing the results in the matrix, it seemed wrong, as I was expecting the following based on manually calculating each bucket in my head as follows:

    [TP , FP]
    [FN, TN]

    [[3 2]
    [1 4]]

    But, I ran your code as-is and go the same result as you did:

    [[4 2]
    [1 3]]

    This didn’t make sense to me because looking at the data set, there are clearly 3 TPs, 2 FPs; 1 FN and 4 TNs.

    So, I tried flipping the parameters to:

    results = confusion_matrix(predicted, expected)

    And now I have the results I expect.

    Are there different conventions for presenting a confusion matrix? Thanks.

  45. HSA February 25, 2020 at 11:58 pm #

    I plot confusion matrix of a classification model on unbalancing dataset the bias is zero labels and I got this plot

    https://files.fm/u/nynwed55

    then I plot the result of the same model but on dataset unbalanced and the bias label is one and I got this plot

    https://files.fm/u/ghqxhkx3

    after that, I plot it on the balanced dataset

    https://files.fm/u/v26g5mbs

    what I noticed is that the model tends to classify the bias label good, otherwise It does not classify well. is this a good thing there is something wrong?

    • Jason Brownlee February 26, 2020 at 8:22 am #

      I don’t understand. What do you mean exactly?

  46. HSA February 26, 2020 at 10:09 pm #

    Ok I have the first dataset who has unbalanced labels (0 for neutral,1 for hate) the number of 0 labels is much larger than 1 label, the confusion matrix is in the first link, the darker color is in the section of 0 is classified as 0, which means the model performs well on the label who has the bias (more) in the dataset, when I test on dataset with 1 is more than zero label the darker color also in the 1 label in second link.
    the third link for the balanced dataset
    NOW the model performance is acceptable or there is something wrong?

  47. ARUN KUMAR SHARMA March 11, 2020 at 1:48 am #

    Confusion Matrix Very nicely explained. I have a query, your expert help is required. I did binary prediction through XGboost model and when I obtain the confusion matrix, I get MacNemar Test with p<0.05. How should I interpret it? Is it showing a significant difference between the 1st model when the algorithm started and the last model when the algorithm stopped? If it is not so, then what is it actually telling or if yes, please share academic reference. I obtained the confusion matrix from the caret package in R.

    • Jason Brownlee March 11, 2020 at 5:25 am #

      A p<0.05 suggests the difference in sample means is probably real.

  48. Jana April 17, 2020 at 6:47 am #

    Hi Jason,

    Thanks for the wonderful post.
    Please clarify my doubt. I have plotted the confusion matrix for the test data belonging to two classes of an individual subject. I would like to know if there are multiple subjects, do I have to plot the confusion matrices for all of them individually?
    In a journal paper, If I have to project a single confusion matrix representing all the subjects (n=15), how should I go about?
    Kindly help me in this regard.
    Thanks in advance.

    • Jason Brownlee April 17, 2020 at 7:45 am #

      Thanks!

      A confusion matrix is used to compare the frequency of predicted classes to expected classes.

      If by “subject” you mean classes, then yes.

      • Jana April 17, 2020 at 7:33 pm #

        Dear Dr.Jason,

        Thanks for your swift reply!
        I mean an individual/a participant of a research study by ‘subject’. I have two classes.
        My question is how do I have to plot a single confusion matrix representing all the participants?
        Hope I have made my question clearer now.
        Thanks in advance.

        • Jason Brownlee April 18, 2020 at 5:46 am #

          Perhaps combine all of the outcomes together regardless of participant.

          • Jana April 18, 2020 at 6:25 am #

            Thanks so much, Dr.Jason.

          • Jason Brownlee April 18, 2020 at 6:42 am #

            You’re welcome.

  49. Schrodinger April 21, 2020 at 1:21 am #

    Hello Jason,

    I have one question related to TP, FP, FN, TN. it can only define with the binary classification( true or false) or it also can define in multi-class classification.

    • Jason Brownlee April 21, 2020 at 6:00 am #

      Typically, yes, but you can also define multiple classes as “positive” and “negative” in order to generalize the idea.

      • Schrodinger April 21, 2020 at 7:38 pm #

        What does it mean exactly? Can you clarify it a little bit more detail? Thanks

        • Jason Brownlee April 22, 2020 at 5:54 am #

          I don’t have a worked example sorry. Perhaps I will write a tutorial on this topic.

  50. Mitesh April 21, 2020 at 9:45 pm #

    Where will I get the code of confusionmatrix() function?? Step by step code in R..??

    • Jason Brownlee April 22, 2020 at 5:55 am #

      See the above example for a confusion matrix in R.

  51. shivanof April 28, 2020 at 8:45 am #

    Hello sir,
    what a problem if i do not obtain a total number of confusion mat. ( i used 1000 images for classification) but in confusion matrix result i only get about 300 for (TP, TN, FP AND FN).
    what is the reason of that?
    thanks

    • Jason Brownlee April 28, 2020 at 1:23 pm #

      The total predictions in the confusion matrix must match the total predictions made by the model.

      If the numbers do not match, perhaps there is a bug in your code.

      • shivanof May 1, 2020 at 9:37 am #

        if we split our dataset into (train and validation set)
        the output of confusion matrix depends on validation set? or what?
        thanks sir.

        • Jason Brownlee May 1, 2020 at 2:02 pm #

          Correct. The confusion matrix is calculated on predictions made for the hold out set not used during training.

          • Mark May 24, 2021 at 12:56 am #

            Hi, do we take prediction on the validation set for the confusion matrix just from the last epoch?

          • Jason Brownlee May 24, 2021 at 5:46 am #

            You can if you want.

  52. aaqib May 3, 2020 at 4:13 pm #

    is it possible to draw confusion matrix for one class???

  53. Ale May 27, 2020 at 6:16 am #

    Hello Jason,

    I hope you can enlighten me with this doubt:

    I have a multi-class problem of 9 classes, when I use logistic regression the accuracy score is 0.3. The predictor classifies apparently well when looking at the confusion matrix, but it has trouble defining which neighbor to choose (For example when actual value is class #3 it predicts classes 2 , 3 or 4) , same for the rest of the 9 classes. I think the accuracy score is too rigid for my problem, and that is why I am getting it too low . Do you think I should use other metric, which does not penalize so heavily or what would you recommend???

    Please let me know if I am not being clear.

    Thankyou for your help!

  54. ysohbi July 13, 2020 at 3:52 am #

    Hello Jason,

    suppose the case where the predict value is not a man nor a woman but just the silence.

    In this case it is a FN (False Negative).

    How do you represent this fact in the predictive list (not 1 and not 0).

    Thank you!

    Yassine

    PS: the problem can also occur in the case where there are multiple classes

    • Jason Brownlee July 13, 2020 at 6:07 am #

      It would be a three class problem: man, woman, unknown.

  55. Bindhu J S August 21, 2020 at 4:32 am #

    Can you please help me? I have generated a confusion matrix of a satellite image where the number of misclassified pixels are exactly the same in some other classes…Can it be happen

  56. Kevin November 10, 2020 at 2:38 am #

    Dear Dr. Jason, after performing a 10-fold stratified cross-validation for a binary classification, we will usually proceed to generate the confusion matrix. So, is it a best practice to combine all the 10 confusion matrices as a single confusion matrix for reporting, rather than just generating a single confusion matrix independently that I have seen widely in those Kaggle Notebooks?

    If it is best to combine all the 10 confusion matrix, should we calculate the average of these four metrics,True Negative, True positive, False Negative and False Positive, rather than summing them up? What is the notion behind the combination? Also, can you provide a hint on the Python coding for getting the average from the confusion_matrix() method for these 10 confusion matrix? Thank you.

    • Jason Brownlee November 10, 2020 at 6:46 am #

      CV and confusion matrix are not compatible.

      You either use CV to estimate model performance on unseen data or use a train/test split with a confusion matrix.

  57. Cemiloglu December 2, 2020 at 8:24 am #

    Hello Sir, fist of all thank you very much for this great explanation. I would like to ask you that is there the caret library in python? Because I need to calculate specificity and sensitivity. Please help me

  58. Mustafa December 16, 2020 at 1:36 am #

    Hello Sir.
    I have an image class consisting of 7 categories. (37000 train and 2800 test images) How can I get information about which images were incorrectly predicted as a result of the confusion matrix? For example

    image id – — predicted class —- real class
    1.jpg———– Class A—————Class B.

    Is there a way to do that?

    • Jason Brownlee December 16, 2020 at 7:52 am #

      Make predictions manually and inspect the results that were predicted incorrectly. The confusion matrix won’t help.

  59. Nisarg Patel February 20, 2021 at 7:26 am #

    tell me to two different scenarios where the confusion matrix works and don’t works

  60. Nisarg Patel February 20, 2021 at 10:38 am #

    yes I can not find the downside of confusion matric

    • Jason Brownlee February 20, 2021 at 1:17 pm #

      It cannot be used to summarise multiple runs, such as k-fold cross-validation.

      It is unreadable with more than 5-10 classes.

  61. fereshteh February 27, 2021 at 2:10 am #

    Dear Dr. Jason
    Is the confusion matrix formed only when the test data is executed?

    • Jason Brownlee February 27, 2021 at 6:06 am #

      You can calculate the confusion matrix on any dataset you like, most commonly it is the test set.

      • Mark May 23, 2021 at 10:30 pm #

        Hi, regarding @feereshteh’s question – is it better to create a confusion matrix in the last epoch of training in which we call the validation part (and to have one “if statement” in validation to check if that is the last epoch and then save predicted and expected values) or to save the final model of training and again push all dataset through it? Do you have a link about that regarding neural network training for multiclass classification in TensorFlow or Pytorch?
        thx

        • Jason Brownlee May 24, 2021 at 5:46 am #

          Same result, it does not matter.

          I prefer to save the model, load it and evaluate later.

  62. Bonface March 27, 2021 at 6:22 pm #

    Hello.
    I am using Weka to build my model but i keep getting a large confusion matrix (41×41) and i just want a 2×2 matrix. is there a way to reduce the matrix using weka to a 2×2 matrix ?
    kindly help if you can.

    • Jason Brownlee March 29, 2021 at 6:01 am #

      If you have 2 classes in your data, you will get a 2×2 confusion matrix.

      • Bonface April 2, 2021 at 3:38 am #

        so basically the size of a confusion matrix is based on the number of classes in your data ?.

  63. Bonface April 5, 2021 at 10:24 pm #

    thank you so much. You’ve really helped me a lot I was really struggling with this issue.

  64. loryn May 23, 2021 at 3:34 am #

    hi sir,
    I hope you can answer me as soon as possible. I want to comibine the result of multi-class confusion matrix
    my problem :

    malin1 malin2 malin3 malin4 benin
    53 5 2 3 0
    7 38 5 0 1
    4 6 54 2 11
    0 0 3 42 5
    0 3 8 3 444

    the resulti want to show is

    benin malin
    444 17
    14 219

    • Jason Brownlee May 23, 2021 at 5:25 am #

      Perhaps sum the malin?
      Perhaps change your class labels in the dataset?

  65. Francisco Galdos August 4, 2021 at 2:57 am #

    Hi,

    I have a classifier where I coded in an “unclassified” category for predictions that fall below a certain probability value. There reference data, however, does not have any instances in that category. When calculating the confusion matrix should I only calculate it based on the predictions that returned a value (i.e. that were NOT labeled as unclassified), or should I be including the unclassified category in the confusion matrix?

    Thank you for your help!

    • Jason Brownlee August 4, 2021 at 5:17 am #

      Tough question. I guess it is up to you and how you choose to evaluate the model / present performance to stakeholders.

  66. Jomy October 5, 2021 at 4:10 pm #

    When I go through the confusion matrix in Weka, what I understood is

    Each row of the matrix corresponds to a actual class.
    Each column of the matrix corresponds to an predicted class.

    Please clarify.

    • Adrian Tam October 6, 2021 at 10:28 am #

      You’re correct.

  67. javeria October 27, 2021 at 12:05 am #

    hello Juson Sir, hope you are doing well.

    I read the recommended files regarding the initial stage of Machine learning which is really very helpful to understanding the initial concepts.

    I have a question regarding the PLS-DA model. In the PLS-DA model, which information we get from the ROC curve. I really didn’t get the concept of the ROC curve exactly, please tell me about it,

    I am looking forward to your kind response.

    THANKS IN ADVANCE.

  68. rey July 8, 2022 at 6:05 am #

    Thanks for the article.
    What is a possible problem with using confusion matrix to calculate performance of a classification model?
