What Is a Confusion Matrix?

What is a Confusion Matrix? Dive in with me below as we explore the meaning and use of Confusion Matrices.

Data Classification

One of the most well-known and well-studied problems in Machine Learning is that of data classification.

The problem is simple — we are given a set of input data, and we have a class associated with each data point. Now, we are given a new data point and must predict the class associated with it.

Example 1

Let’s take an example. Say that we have a bunch of emails. Some of them are spam emails. Others are not spam (called “ham”). So each email is a data point, and the class associated with it is either spam or ham. We can now train a Machine Learning-based classifier on this data.

We are given a new data point (a new email) and need to find out if it is spam or non-spam (ham) email. Clearly, this is a classification problem. More specifically, this is a binary classification problem.
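
To make this concrete, here’s a minimal sketch of what training such a classifier could look like in Python with scikit-learn. The tiny email data set and the choice of a Naive Bayes model are made up purely for illustration:

    # A minimal sketch of training a spam/ham classifier with scikit-learn.
    # The tiny email data set below is invented purely for illustration.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    emails = [
        "win a free prize now",        # spam
        "meeting at 10 am tomorrow",   # ham
        "cheap meds, limited offer",   # spam
        "lunch later this week?",      # ham
    ]
    labels = ["spam", "ham", "spam", "ham"]

    # Bag-of-words features feeding a Naive Bayes classifier.
    classifier = make_pipeline(CountVectorizer(), MultinomialNB())
    classifier.fit(emails, labels)

    # Predict the class of a new, unseen email.
    print(classifier.predict(["claim your free prize"]))  # likely ['spam']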

Measuring the Accuracy of a Classifier

When scientists developed these classifiers, one obvious challenge they faced was how to measure a classifier’s performance. One obvious metric is accuracy: the fraction of new data points that the algorithm classifies correctly.

For instance, if the algorithm is tested on 100 new data points and correctly classifies 86 of them, then we know that the accuracy is 86%.

However, measuring the performance of a classifier this way is not recommended. Consider the following example:

    Suppose we have test data where, out of 100 test emails, 90 are non-spam and only 10 are spam. Now imagine a classifier that simply labels every email it receives as non-spam. Run on the above 100 test emails, it will classify 90 out of 100 correctly, so measuring accuracy tells us that this classifier is 90% accurate. However, this makes no sense, since the classifier never actually identifies any email as spam!
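
Here’s that scenario as a quick sketch in Python, just to make the numbers concrete:

    # The degenerate classifier from the example above: it ignores its input
    # and always predicts "ham", yet still scores 90% accuracy on the skewed set.
    y_true = ["ham"] * 90 + ["spam"] * 10  # 90 non-spam, 10 spam
    y_pred = ["ham"] * 100                 # every email predicted as non-spam

    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    print(accuracy)  # 0.9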

So Now What? Enter Confusion Matrix!

Scientists therefore developed a more robust way of measuring the performance of such classifiers: the Confusion Matrix. A Confusion Matrix breaks a classifier’s predictions down by class, showing how many objects of each actual class were assigned to each predicted class. Here is an example of a Confusion Matrix of a classifier:

                          Predicted spam     Predicted non-spam
    Actual spam                       12                      3
    Actual non-spam                    4                     81

As you can see, there are 12 + 3 = 15 spam emails, and 4 + 81 = 85 non-spam emails in the data set. The interpretation of the above matrix is as follows:

  • Of the 12 + 3 = 15 spam emails, 12 have been correctly classified as spam and the remaining 3 have been misclassified as non-spam.
  • Of the 4 + 81 = 85 non-spam emails, 4 have been misclassified as spam while 81 have been correctly classified as non-spam.
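
If you have the true labels and the classifier’s predictions on hand, a matrix like the one above can be computed directly. Here’s a small sketch using scikit-learn, with the label lists reconstructed from the counts above:

    # Sketch: computing the matrix above with scikit-learn, with the true and
    # predicted labels reconstructed from the counts (12, 3, 4, 81).
    from sklearn.metrics import confusion_matrix

    y_true = ["spam"] * 15 + ["ham"] * 85             # 15 spam, 85 non-spam
    y_pred = (["spam"] * 12 + ["ham"] * 3 +           # 12 caught, 3 missed
              ["spam"] * 4 + ["ham"] * 81)            # 4 false alarms, 81 correct

    # Rows are the actual classes, columns the predicted classes.
    print(confusion_matrix(y_true, y_pred, labels=["spam", "ham"]))
    # [[12  3]
    #  [ 4 81]]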

Ideal Classifiers

In an ideal classifier (one that classifies every email correctly, without any mistakes), the Confusion Matrix would look something like this:

                          Predicted spam     Predicted non-spam
    Actual spam                       15                      0
    Actual non-spam                    0                     85

Basically, all 15 spam emails have been correctly identified as spam, and all 85 non-spam emails have been correctly identified as non-spam. Now let’s revisit our original Confusion Matrix, which looked like this:

                          Predicted spam     Predicted non-spam
    Actual spam                       12                      3
    Actual non-spam                    4                     81

We know that of the 15 spam emails, 12 have been correctly classified as spam. So 12/(12 + 3) = 0.800 of the spam emails are correctly classified, while 3/(12 + 3) = 0.200 are incorrectly classified.

Similarly, 4/(4 + 81) = 0.047 of the non-spam emails are incorrectly classified, while 81/(4 + 81) = 0.953 are correctly classified. Replacing the absolute numbers with these fractions gives an updated matrix that looks like this:

                          Predicted spam     Predicted non-spam
    Actual spam                    0.800                  0.200
    Actual non-spam                0.047                  0.953
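
One way to obtain these fractions is to divide each row of the count matrix by its row total. Here’s a small sketch using NumPy:

    # Sketch: converting the count matrix to row fractions by dividing each row
    # (actual class) by its total. Recent scikit-learn versions can also produce
    # this directly via confusion_matrix(..., normalize="true").
    import numpy as np

    counts = np.array([[12, 3],
                       [4, 81]])
    fractions = counts / counts.sum(axis=1, keepdims=True)
    print(fractions.round(3))
    # [[0.8   0.2  ]
    #  [0.047 0.953]]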

A Few Numbers of Interest

Now, let’s call non-spam class 0, and spam class 1.

A few numbers are of interest here:

  • False positive rate: The false positive rate is the fraction of objects of class 0 that were incorrectly flagged as class 1. In this case, the false positive rate is 0.047: basically, 4.7% of the objects (emails) of class 0 (non-spam) were flagged as belonging to class 1 (spam).
  • False negative rate: The false negative rate is the fraction of objects of class 1 that were incorrectly flagged as belonging to class 0. In this case, the false negative rate is 0.200: basically, 20% of the spam emails (class 1) were not identified as spam by the classifier.
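
Both rates can be read straight off the count matrix. Here’s a quick sketch using the counts from our example:

    # Sketch: reading both rates directly off the count matrix, with
    # class 0 = non-spam and class 1 = spam as above.
    tp, fn = 12, 3   # actual spam: correctly caught vs. missed
    fp, tn = 4, 81   # actual non-spam: falsely flagged vs. correctly passed

    false_positive_rate = fp / (fp + tn)  # 4 / 85  ≈ 0.047
    false_negative_rate = fn / (fn + tp)  # 3 / 15  = 0.200
    print(round(false_positive_rate, 3), round(false_negative_rate, 3))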

Identity Matrix

In an ideal classifier, the Confusion Matrix would look like the following:

                          Predicted spam     Predicted non-spam
    Actual spam                    1.000                  0.000
    Actual non-spam                0.000                  1.000

Basically, for an ideal classifier, the Confusion Matrix expressed as a fraction is an Identity Matrix. Here, 100% of spam emails were identified as spam and 100% of non-spam emails were identified as non-spam.
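
As a quick numerical check of this claim:

    # Row-normalizing the ideal classifier's count matrix (15 spam and 85
    # non-spam emails, all classified correctly) gives the identity matrix.
    import numpy as np

    ideal_counts = np.array([[15, 0],
                             [0, 85]])
    print(ideal_counts / ideal_counts.sum(axis=1, keepdims=True))
    # [[1. 0.]
    #  [0. 1.]]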

Clearly, the Confusion Matrix is a much more informative performance indicator, since it captures cases where the data set is skewed, that is, where objects of one class far outnumber objects of the other classes.

Confusion Matrix for Three Classes

The concept of a Confusion Matrix is generic and extends to more than two classes as well. Take a look at a Confusion Matrix for three classes:

[Image: Confusion Matrix for three classes]
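
The specific numbers in the matrix above aren’t important here, but as a sketch, the same scikit-learn call used earlier works for any number of classes. The three classes and labels below are invented purely for illustration:

    # Sketch: the same confusion_matrix call handles three (or more) classes.
    # The classes and toy labels below are made up purely for illustration.
    from sklearn.metrics import confusion_matrix

    y_true = ["cat", "dog", "bird", "cat", "dog", "bird", "cat", "dog"]
    y_pred = ["cat", "dog", "bird", "dog", "dog", "cat", "cat", "dog"]

    # One row and one column per class; rows are actual, columns are predicted.
    print(confusion_matrix(y_true, y_pred, labels=["cat", "dog", "bird"]))
    # [[2 1 0]
    #  [0 3 0]
    #  [1 0 1]]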

As I hope you can see, the Confusion Matrix is a great tool to measure the performance of a classifier, be it a binary classifier or a multi-class classifier. It gives a far more accurate picture even when data sets are skewed, and is therefore an ideal tool for evaluating a classifier.
