
Confusion Matrix: TP, TN, FP, FN #37

Open
sophryu99 opened this issue Jan 27, 2022 · 1 comment
sophryu99 commented Jan 27, 2022

Confusion Matrix

A confusion matrix describes the performance of a classification model: it summarizes how the classifier's predictions compare to the actual classes. The following figure shows a basic representation of a confusion matrix:

[Figure: basic 2×2 confusion matrix, predicted class vs. actual class]

It’s based on comparing the class predicted by the classifier with the actual class for each observation.
  • True Negative (TN): the predicted class is negative and matches the actual class, which is also negative. In the example, we predicted class 0 and the actual class of that observation is 0, so the prediction is correct.
  • True Positive (TP): the predicted class is positive and matches the actual class. In the example, the predicted value is 1 and coincides with the actual class of that observation.
  • False Negative (FN): the predicted class is negative but the actual class is positive. In the example, the predicted value is 0 but the actual class of that observation is 1, so the prediction is wrong.
  • False Positive (FP): the predicted class is positive but the actual class is negative. In the example, the predicted class is 1 and the actual class of that observation is 0. The prediction is again wrong.

If the predicted value and the actual value are the same, the prediction is True; otherwise it is False.
If the predicted value is 1, the prediction is Positive; otherwise it is Negative.
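The two rules above can be sketched directly in plain Python, counting each of the four outcomes by comparing every prediction with its actual label (illustrative only; the labels here are the same toy example used with sklearn below):

```python
# Count TP, TN, FP, FN by pairing each prediction with its actual label.
y_true = [0, 1, 0, 1]
y_pred = [1, 1, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 1)  # predicted 1, actual 1
tn = sum(1 for t, p in zip(y_true, y_pred) if p == 0 and t == 0)  # predicted 0, actual 0
fp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 0)  # predicted 1, actual 0
fn = sum(1 for t, p in zip(y_true, y_pred) if p == 0 and t == 1)  # predicted 0, actual 1

print(tn, fp, fn, tp)  # 0 2 1 1
```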

>>> from sklearn.metrics import confusion_matrix
>>> tn, fp, fn, tp = confusion_matrix([0, 1, 0, 1], [1, 1, 1, 0]).ravel()
>>> (tn, fp, fn, tp)
(0, 2, 1, 1)
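Once the four counts are unpacked, the usual summary metrics fall out of them directly. A minimal sketch, reusing the counts from the example above (tn=0, fp=2, fn=1, tp=1):

```python
# Derive standard metrics from the four confusion-matrix counts.
tn, fp, fn, tp = 0, 2, 1, 1

accuracy = (tp + tn) / (tp + tn + fp + fn)  # fraction of all predictions that are correct
precision = tp / (tp + fp)                  # fraction of positive predictions that are correct
recall = tp / (tp + fn)                     # fraction of actual positives that were found

print(round(accuracy, 4), round(precision, 4), round(recall, 4))  # 0.25 0.3333 0.5
```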

False Discovery Rate (FDR):

  • the proportion of all ‘discoveries’ that are false, i.e. FDR = FP / (FP + TP)
  • any time a null hypothesis is rejected, it can be considered a ‘discovery’
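A small sketch of that definition, reusing the counts from the earlier example (tn=0, fp=2, fn=1, tp=1): the FDR is the fraction of positive predictions (“discoveries”) whose actual class is negative, which also makes it the complement of precision.

```python
# FDR = false discoveries / all discoveries = FP / (FP + TP)
tn, fp, fn, tp = 0, 2, 1, 1

fdr = fp / (fp + tp)
assert abs(fdr - (1 - tp / (tp + fp))) < 1e-12  # FDR = 1 - precision

print(round(fdr, 4))  # 0.6667
```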
