What is a Confusion Matrix in Machine Learning

Make the Confusion Matrix Less Confusing.

A confusion matrix is a technique for summarizing the performance of a classification algorithm.

Classification accuracy alone can be misleading if you have an unequal number of observations in each class or if you have more than two classes in your dataset.

Calculating a confusion matrix can give you a better idea of what your classification model is getting right and what types of errors it is making.

In this post, you will discover the confusion matrix for use in machine learning.

After reading this post you will know:

  • What the confusion matrix is and why you need to use it.
  • How to calculate a confusion matrix for a 2-class classification problem from scratch.
  • How to create a confusion matrix in Weka, Python and R.

Let’s get started.

  • Update Oct/2017: Fixed a small bug in the worked example (thanks Raktim).
  • Update Dec/2017: Fixed a small bug in accuracy calculation (thanks Robson Pastor Alexandre)
What is a Confusion Matrix in Machine Learning. Photo by Maximiliano Kolus, some rights reserved.

Classification Accuracy and its Limitations

Classification accuracy is the ratio of correct predictions to total predictions made.

It is often presented as a percentage by multiplying the result by 100.

Classification accuracy can also easily be turned into a misclassification rate or error rate by inverting the value, such as:
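
A minimal sketch of both calculations, using purely illustrative counts (the exact listing from the original post is not reproduced here):

```python
# Classification accuracy and error rate, expressed as percentages.
# The counts below are illustrative.
correct_predictions = 75
total_predictions = 100

accuracy = correct_predictions / total_predictions * 100          # 75.0
error_rate = (1 - correct_predictions / total_predictions) * 100  # 25.0
print(accuracy, error_rate)
```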

Classification accuracy is a great place to start, but often encounters problems in practice.

The main problem with classification accuracy is that it hides the detail you need to better understand the performance of your classification model. There are two examples where you are most likely to encounter this problem:

  1. When your data has more than 2 classes. With 3 or more classes you may get a classification accuracy of 80%, but you don’t know if that is because all classes are being predicted equally well or whether one or two classes are being neglected by the model.
  2. When your data does not have an even distribution of classes. You may achieve an accuracy of 90% or more, but this is not a good score if 90 records in every 100 belong to one class, because you can achieve that score simply by always predicting the most common class value (see the sketch after this list).
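
To make the second case concrete, here is a minimal sketch with contrived data showing how a model that always predicts the majority class reaches 90% accuracy:

```python
# Contrived imbalanced dataset: 90 of 100 records belong to the "no-event" class.
actual = ["no-event"] * 90 + ["event"] * 10
predicted = ["no-event"] * 100  # a degenerate "model" that always predicts the majority class

correct = sum(1 for a, p in zip(actual, predicted) if a == p)
print(correct / len(actual) * 100)  # 90.0
```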

Classification accuracy can hide the detail you need to diagnose the performance of your model. But thankfully we can tease apart this detail by using a confusion matrix.

What is a Confusion Matrix?

A confusion matrix is a summary of prediction results on a classification problem.

The number of correct and incorrect predictions is summarized with count values and broken down by class. This is the key to the confusion matrix.

The confusion matrix shows the ways in which your classification model is confused when it makes predictions.

It gives you insight not only into the errors being made by your classifier but more importantly the types of errors that are being made.

It is this breakdown that overcomes the limitation of using classification accuracy alone.

How to Calculate a Confusion Matrix

Below is the process for calculating a confusion matrix.

  1. You need a test dataset or a validation dataset with expected outcome values.
  2. Make a prediction for each row in your test dataset.
  3. From the expected outcomes and predictions, count:
    1. The number of correct predictions for each class.
    2. The number of incorrect predictions for each class, organized by the class that was predicted.

These numbers are then organized into a table, or a matrix as follows:

  • Predicted down the side: Each row of the matrix corresponds to a predicted class.
  • Expected across the top: Each column of the matrix corresponds to an actual class.

The counts of correct and incorrect classification are then filled into the table.

The total number of correct predictions for a class goes into the predicted row for that class value and the expected column for that class value, that is, along the diagonal of the matrix.

In the same way, the total number of incorrect predictions for a class goes into the expected column for that class value and the predicted row for the class value that the model mistakenly predicted.

In practice, a binary classifier such as this one can make two types of errors: it can incorrectly assign an individual who defaults to the no default category, or it can incorrectly assign an individual who does not default to the default category. It is often of interest to determine which of these two types of errors are being made. A confusion matrix […] is a convenient way to display this information.

— Page 145, An Introduction to Statistical Learning: with Applications in R, 2014

This matrix is easiest to understand for a 2-class problem, but it can just as easily be applied to problems with 3 or more class values by adding more rows and columns to the confusion matrix.

Let’s make this explanation of creating a confusion matrix concrete with an example.

2-Class Confusion Matrix Case Study

Let’s pretend we have a two-class classification problem of predicting whether a photograph contains a man or a woman.

We have a test dataset of 10 records with expected outcomes and a set of predictions from our classification algorithm.

Let’s start off and calculate the classification accuracy for this set of predictions.

The algorithm made 7 of the 10 predictions correct with an accuracy of 70%.
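
The arithmetic behind that figure (the original listing is not reproduced here):

```python
# Accuracy for the worked example: 7 correct predictions out of 10.
accuracy = 7 / 10 * 100
print(accuracy)  # 70.0
```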

But what type of errors were made?

Let’s turn our results into a confusion matrix.

First, we must calculate the number of correct predictions for each class.

Now, we can calculate the number of incorrect predictions for each class, organized by the predicted value.

We can now arrange these values into the 2-class confusion matrix:
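
The table from the original post is not reproduced here, so below is a minimal from-scratch sketch. The expected and predicted lists are hypothetical, but they are chosen to be consistent with the counts discussed next: 3 men and 4 women classified correctly, 2 men classified as women, and 1 woman classified as a man.

```python
# Hypothetical expected and predicted values, consistent with the case study:
# 5 actual men and 5 actual women, with 7 correct predictions in total.
expected = ["man"] * 5 + ["woman"] * 5
predicted = ["man", "man", "man", "woman", "woman",      # 3 men correct, 2 men predicted as women
             "man", "woman", "woman", "woman", "woman"]  # 1 woman predicted as a man, 4 women correct

labels = ["man", "woman"]
# matrix[i][j] counts records predicted as labels[i] whose actual class is labels[j],
# i.e. rows are predicted classes and columns are actual classes, as in this post.
matrix = [[0 for _ in labels] for _ in labels]
for actual, pred in zip(expected, predicted):
    matrix[labels.index(pred)][labels.index(actual)] += 1

print("columns (actual):", labels)
for label, row in zip(labels, matrix):
    print("predicted", label, row)
# columns (actual): ['man', 'woman']
# predicted man [3, 1]
# predicted woman [2, 4]
```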

We can learn a lot from this table.

  • The total number of actual men in the dataset is the sum of the values in the men column (3 + 2).
  • The total number of actual women in the dataset is the sum of the values in the women column (1 + 4).
  • The correct predictions lie on the diagonal from the top-left to the bottom-right of the matrix (3 + 4).
  • More errors were made by predicting men as women (2) than by predicting women as men (1).

Two-Class Problems Are Special

In a two-class problem, we are often looking to discriminate observations with a specific outcome from normal observations.

For example, a disease state or event versus no disease state or no event.

In this way, we can assign the event class as “positive” and the no-event class as “negative”. Whether a prediction is “true” or “false” then depends on whether the predicted class matches the actual class.

This gives us:

  • “true positive” for an event value that is correctly predicted as an event.
  • “false positive” for a no-event value that is incorrectly predicted as an event.
  • “true negative” for a no-event value that is correctly predicted as a no-event.
  • “false negative” for an event value that is incorrectly predicted as a no-event.

We can summarize this in the confusion matrix as follows:
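
The summary table from the original post is not reproduced here; the sketch below prints the generic layout, keeping this post’s convention of predicted classes down the side and actual classes across the top:

```python
# Generic 2-class confusion matrix layout: rows are predicted classes,
# columns are actual classes.
columns = ["event (actual)", "no-event (actual)"]
rows = {
    "event (predicted)":    ["true positive",  "false positive"],
    "no-event (predicted)": ["false negative", "true negative"],
}
print(" " * 22, " | ".join(columns))
for name, cells in rows.items():
    print(f"{name:<22}", " | ".join(f"{cell:<16}" for cell in cells))
```

Note that some references lay this table out with actual classes as rows; the meaning is unchanged when the matrix is transposed.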

This can help in calculating more advanced classification metrics such as precision, recall, specificity and sensitivity of our classifier.

For example, classification accuracy is calculated as the number of true positives plus true negatives, divided by the total number of predictions (true positives + true negatives + false positives + false negatives).
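
As a hedged illustration, the snippet below computes these metrics from the four cell counts of the case study above, treating “man” as the positive class purely for the sake of the example:

```python
# Cell counts from the case study: 3 true positives, 1 false positive,
# 2 false negatives, 4 true negatives ("man" treated as the positive class).
tp, fp, fn, tn = 3, 1, 2, 4

accuracy    = (tp + tn) / (tp + tn + fp + fn)  # 0.7
precision   = tp / (tp + fp)                   # 0.75
sensitivity = tp / (tp + fn)                   # recall / true positive rate, 0.6
specificity = tn / (tn + fp)                   # true negative rate, 0.8
print(accuracy, precision, sensitivity, specificity)
```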

Consider the case where there are two classes. […] The top row of the table corresponds to samples predicted to be events. Some are predicted correctly (the true positives, or TP) while others are inaccurately classified (false positives or FP). Similarly, the second row contains the predicted negatives with true negatives (TN) and false negatives (FN).

— Page 256, Applied Predictive Modeling, 2013

Now that we have worked through a simple 2-class confusion matrix case study, let’s see how we might calculate a confusion matrix in modern machine learning tools.

Code Examples of the Confusion Matrix

This section provides some examples of confusion matrices using top machine learning platforms.

These examples will give you context for what you have learned about the confusion matrix when you use it in practice with real data and tools.

Example Confusion Matrix in Weka

The Weka machine learning workbench will display a confusion matrix automatically when estimating the skill of a model in the Explorer interface.

Below is a screenshot from the Weka Explorer interface after training a k-nearest neighbor algorithm on the Pima Indians Diabetes dataset.

The confusion matrix is listed at the bottom, and you can see that a wealth of classification statistics are also presented.

The confusion matrix assigns the letters a and b to the class values and shows expected class values in rows and predicted class values (“classified as”) in columns. Note that this is the transpose of the convention used earlier in this post; the meaning is unchanged.

Weka Confusion Matrix and Classification Statistics

You can learn more about the Weka Machine Learning Workbench here.

Example Confusion Matrix in Python with scikit-learn

The scikit-learn library for machine learning in Python can calculate a confusion matrix.

Given an array or list of expected values and a list of predictions from your machine learning model, the confusion_matrix() function will calculate a confusion matrix and return the result as an array. You can then print this array and interpret the results.
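
The original listing is not reproduced here; the following is a minimal sketch along those lines, using contrived expected and predicted values:

```python
# A contrived 2-class example with the scikit-learn confusion_matrix() function.
from sklearn.metrics import confusion_matrix

expected  = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0]
predicted = [1, 0, 0, 1, 0, 0, 1, 1, 1, 0]

# Note: scikit-learn places actual classes in rows and predicted classes in columns.
matrix = confusion_matrix(expected, predicted)
print(matrix)
# [[4 2]
#  [1 3]]
```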

Running this example prints the confusion matrix array summarizing the results for the contrived 2-class problem.

Learn more about the confusion_matrix() function in the scikit-learn API documentation.

Example Confusion Matrix in R with caret

The caret library for machine learning in R can calculate a confusion matrix.

Given a list of expected values and a list of predictions from your machine learning model, the confusionMatrix() function will calculate a confusion matrix and return the result as a detailed report. You can then print this report and interpret the results.

Running this example calculates a confusion matrix report and related statistics and prints the results.

There is a wealth of information in this report, not least the confusion matrix itself.

Learn more about the confusionMatrix() function in the caret API documentation [PDF].

Further Reading

There is not a lot written about the confusion matrix, but this section lists some additional resources that you may be interested in reading.

Summary

In this post, you discovered the confusion matrix for machine learning.

Specifically, you learned about:

  • The limitations of classification accuracy and when it can hide important details.
  • The confusion matrix and how to calculate it from scratch and interpret the results.
  • How to calculate a confusion matrix with the Weka, Python scikit-learn and R caret libraries.

Do you have any questions?
Ask your question in the comments below and I will do my best to answer them.



70 Responses to What is a Confusion Matrix in Machine Learning

  1. Vinay November 18, 2016 at 9:42 pm #

    Nice example. I have two single dimensional array:one is predicted and other is expected. It is not a binary classification. It is a five class classification problem. How to compute confusion matrix and true positive, true negative, false positive, false negative.

    • Jason Brownlee November 19, 2016 at 8:47 am #

      Hi Vinay, you can extrapolate from the examples above.

    • Avinash November 3, 2018 at 3:16 am #

      Hey Vinay did you got the solution for the problem ?? I’m facing the similar problem right now.

  2. Shai March 19, 2017 at 7:19 pm #

    Nice , very good explanation.

  3. Ananya Mohapatra March 31, 2017 at 9:45 pm #

    hello sir,
    Can we implement confusion matrix in multi-class neural network program using K-fold cross validation??

    • Jason Brownlee April 1, 2017 at 5:55 am #

      Yes, but you would have one matrix for each fold of your cross validation.

      It would be better method for a train/test split.

  4. pakperchum May 3, 2017 at 2:56 pm #

    Using classification Learner app of MATLAB and I obtained the confusion matrix, Can I show the classification results in image? how? Please guide

  5. shafaq May 3, 2017 at 2:58 pm #

    Using Weka and Tanagra, naive Bayes classification leads to a confusion matrix, How I can show the classification results in the form of image instead of confusion matrix?
    Guide please

  6. Shafaq May 6, 2017 at 3:40 pm #

    “Lena” noisy image taken as base on which noise detection feature applied after that matrix of features passed as training set. Now I want to take output in the form of image (Lena) but Tanagra and weka shows confusion matrix or ROC curve (can show scatter plot) through naive Bayes classification. Help plz

  7. cc May 8, 2017 at 8:50 pm #

    how to write confusion matrix for n image in one table

    • Jason Brownlee May 9, 2017 at 7:41 am #

      You have one row/column for each class, not each input (e.g. each image).

  8. Giorgos May 20, 2017 at 7:11 am #

    Hello Jason, I have a 3 and a 4 class problem, and I have made their confusion matrix but I cant understand which of the cells represents the true positive,false positive,false negative, in the binary class problem its more easy to understand it, can you help me?

  9. Amanze Chibuike May 28, 2017 at 7:16 am #

    I need a mathematical model for fraud detection.

  10. Nathan June 20, 2017 at 2:37 am #

    Jason Brownlee. very poor answer

    • Jason Brownlee June 20, 2017 at 6:40 am #

      Which answer and how so Nathan?

      • Anthony The Koala February 11, 2018 at 8:52 pm #

        Dear Dr Jason,
        I fully agree with you. These resources on this website are like ‘bare bones’. It is up to you to apply the model. The general concept of a confusion matrix is summarized in “2 class confusion matrix case study”, particularly the table at the end of the section. Follow from the beginning of the section.

        Since this is a 2 class confusion matrix, you have “fraud”/ “non-fraud” rows and columns instead of “men”/”women” rows and columns.

        There is a page at http://web.stanford.edu/~rjohari/teaching/notes/226_lecture8_prediction.pdf which talks about fraud detection and spam detection. Is it the bees-knees of study? I cannot comment but suffice to say don’t expect a fully exhaustive discussion of all the minutiae on webpages/blogs

        In addition, even though I have Dr Jason’s book “Machine Learning from Scratch”, I always seek ideas from this webpage.

        Anthony from exciting Belfield

  11. ALTAFF July 8, 2017 at 2:11 pm #

    nice explanation

  12. Sai July 18, 2017 at 5:25 am #

    Hi! Thank you for the great post!
    I have one doubt though……….For the 2 class problem, where you discussed about false positives etc should’nt false positive be the entry below true positive in the matrix?

  13. elahe August 16, 2017 at 4:53 pm #

    hi
    Is the confusion matrix defined only for nominal variables?

  14. Andre September 5, 2017 at 9:38 am #

    Is there anything like a confusion matrix also available for regression.
    There are deviations there too.

    • Jason Brownlee September 7, 2017 at 12:40 pm #

      No. You could look at the variance of the predictions.

  15. Chandana September 25, 2017 at 9:01 pm #

    Hi,
    I hope to get a reply soon. How do we compute confusion matrix for the multilabel multiclass classification case? Please give an example.
    As far as I understand:
    If
    y_pred = [1,1,0,0] and y_true = [0,0,1,1]; the confusion matrix is:

    C1 C2 C3 C4
    C1 0 0 0 0
    C2 0 0 0 0
    C3 1 1 0 0
    C4 1 1 0 0

    Is that right? If so, why is this a correct way to compute it (since we don’t know if class-4 is confused with class 1 or class 2, Same goes with the case of class-3)?

  16. Raktim October 21, 2017 at 11:52 pm #

    Hi Dr. Brownlee,
    In your given confusion matrix, False Positive and False Negative has become opposite. I got really confused by seeing that confusion matrix. Event that incorrectly predicted as no event should be False Negative on the other hand no-event that incorrectly predicted as event should be False Positive. Thats what I have learnt from the following reference.

    Waiting for your explanation.

    Reference: http://www.dataschool.io/simple-guide-to-confusion-matrix-terminology/
    Youtube Video: https://www.youtube.com/watch?v=4Xw19NpQCGA
    Wikipedia: https://en.wikipedia.org/wiki/Confusion_matrix

  17. Raktim October 23, 2017 at 12:14 am #

    Hi Dr Brownlee,

    “We can summarize this in the confusion matrix as follows:”

    After the above line the table is still there and showing the FP and FN in opposite way.

    Regards,
    Raktim

  18. Raktim October 26, 2017 at 12:23 am #

    Dear Sir,

    Will you please look at this because wiki has written opposite way? Therefore your table does not match.

    https://drive.google.com/open?id=0B8RkeH8XSyArWldzdjFGYW1teTA

  19. Robson Pastor Alexandre December 6, 2017 at 1:59 am #

    There’s an error in the accuracy’s formula.
    It’s:
    accuracy = 7 / 10 * 100

    Instead of:
    accuracy = 7 / 100 * 100

  20. Vishnu Priya January 28, 2018 at 10:04 pm #

    Hello!Could you please explain how to find parameters for multiclass confusion matrix like 3*3 order or more?

    • Jason Brownlee January 29, 2018 at 8:16 am #

      Sorry, what do you mean find parameters for a confusion matrix?

  21. Jemz February 21, 2018 at 1:00 pm #

    Could you please explain why confusion matrix is better than other for the evaluation model classification, especially for Naive Bayes. thankyou

    • Jason Brownlee February 22, 2018 at 11:14 am #

      It may be no better or worse, just another way to review model skill.

  22. alvi February 21, 2018 at 1:19 pm #

    Could you please explain why confusion matrix is good or recommended for evalution model ?

    • Jason Brownlee February 22, 2018 at 11:15 am #

      It can help you see the types of errors made by the model when making predictions. E.g. Class A is mostly predicted as class B instead of class C.

  23. Mukrimah March 13, 2018 at 1:02 pm #

    Hi Sir Jason Brownlee

    Do you have example of source code (java) for multi-class to calculate confusion matrix?
    Let say i have 4 class(dos, normal,worms,shellcode) then i want to make a confusion matrix where usually diagonal is true positive value. Accuracy by class(dos)= predicted dos/actual dos and so on then later on accuracy= all the diagonal (tp value)/ total number of instances

  24. Krishnaprasad Challuru March 17, 2018 at 10:48 pm #

    Concepts explained well but in the example, it is wrongly computed:

    Sensitivity should be = TPR = TP/(TP+FN) = 3/(3+2) = 0.6 and
    Specificity should be = TNR = TN/(TN+FP) = 4/(4+1) = 0.8.

    However Sensitivity is wrongly computed as 0.06667 and Specificity is wrongly computed as 0.75.

    • Jason Brownlee March 18, 2018 at 6:04 am #

      I do not believe there is a bug in the R implementation.

  25. Nipa March 23, 2018 at 5:43 pm #

    hi! i am working on a binary classification problem but the confusion matrix i am getting is something like
    [12, 0, 0],
    [ 1, 16, 0],
    [ 0, 7, 0]
    I don’t understand what does the 7 mean? can you please explain?
    N.B. It should be
    [13, 0],
    [0, 23]

  26. Nipa March 26, 2018 at 4:26 pm #

    Actually there is no bug in the code. The code works fine with other datasets.

    So I changed the target vector of the dataset from 2 to 3 and it works better now but the problem remains the same.

    Now it looks like this:
    [[17, 0, 0, 0],
    [ 0, 12, 0, 0],
    [ 0, 0, 8, 0],
    [ 0, 0, 0, 2]]
    Is it because the ANN could not link the 2 values (4th row) with any of the other classes?

  27. iamai May 31, 2018 at 6:24 am #

    There is a typo mistake:

    If
    men classified as women: 2
    woman classified as men: 1

    How can confusion matrix be:
    men women
    men 3 1
    women 2 4

    The correction:
    men classified as women: 1
    woman classified as men: 2

    • Jason Brownlee May 31, 2018 at 6:31 am #

      I believe it is correct, remember that columns are actual and rows are predicted.

      • Lindsay Peters July 18, 2018 at 2:27 pm #

        Weka seems to do the opposite. if you do a simple J48 classification on the Iris tutorial data, you get the following
        a b c <– classified as
        49 1 0 | a = Iris-setosa
        0 47 3 | b = Iris-versicolor
        0 2 48 | c = Iris-virginica
        where we know that there are actually 50 of each type. So for Weka's confusion matrix, the actual count is the sum of entries in a row, not a column. So I'm still confused!

        • Jason Brownlee July 18, 2018 at 2:49 pm #

          The meaning is the same if the matrix is transposed. It is all about explaining what types of errors were made.

          Does that help?

          • Lindsay Peters July 20, 2018 at 10:41 am #

            Yes that helps, thanks. Confirms that for the Weka confusion matrix, columns are predicted and rows are actual – the transpose of the definition you are using, as you point out. I hadn’t realised that both formats are in common use.

  28. hafez amad June 7, 2018 at 10:08 pm #

    thank you man! simple explanation

  29. Ibrar hussain July 18, 2018 at 4:37 pm #

    hy

    i am using Weka tool and apply DecisionTable model and get following confusion matrix

    any one Label it as a TP, TN, FP and FN

    Please help me

  30. Bilal Süt August 2, 2018 at 11:16 pm #

    Thank you for these website, i am an intern my superiors gave me some tasks about machine learning and a.ı and your web site helped me very well thanks a lot Jason

  31. Varad Pimpalkhute September 26, 2018 at 9:18 pm #

    Hi, can confusion matrix be used for a large dataset of images?

    • Jason Brownlee September 27, 2018 at 6:00 am #

      A confusion matrix summarizes the class outputs, not the images.

      It can be used for binary or multi-class classification problems.

  32. S.Khan November 18, 2018 at 3:11 am #

    hi Sir

    Amazing information

    Sir is there any machine learning method with which I can do analysis of Survey results.

    • Jason Brownlee November 18, 2018 at 6:43 am #

      Yes, start with a question you have about the data, then use the data and models to answer it.

  33. srivalli November 28, 2018 at 5:04 am #

    Very nice document , really useful for creating the test case.
