Created December 11, 2018 10:23
Accuracy, Recall, Precision and F1 score with sklearn.
# To be reminded
# 1) Classifying a single point can result in a true positive (truth = 1, guess = 1), a true negative (truth = 0, guess = 0), a false positive (truth = 0, guess = 1), or a false negative (truth = 1, guess = 0).
# 2) Accuracy measures how many classifications your algorithm got correct out of every classification it made.
# 3) Recall measures the percentage of the relevant items your classifier was able to successfully find.
# 4) Precision measures the percentage of items your classifier found that were actually relevant.
# 5) Precision and recall typically trade off: tuning a classifier to improve one often lowers the other.
# 6) F1 score is the harmonic mean of precision and recall.
# 7) F1 score will be low if either precision or recall is low.
from sklearn.metrics import accuracy_score, recall_score, precision_score, f1_score

labels = [1, 0, 0, 1, 1, 1, 0, 1, 1, 1]
guesses = [0, 1, 1, 1, 1, 0, 1, 0, 1, 0]

print(accuracy_score(labels, guesses))   # 0.3  -> (TP + TN) / total = 3/10
print(recall_score(labels, guesses))     # ~0.4286 -> TP / (TP + FN) = 3/7
print(precision_score(labels, guesses))  # 0.5  -> TP / (TP + FP) = 3/6
print(f1_score(labels, guesses))         # ~0.4615 -> harmonic mean = 6/13
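A quick way to verify these numbers is to derive the same metrics by hand from the four confusion-matrix counts listed in the reminders above. This is a minimal sketch; the variable names (`tp`, `fp`, `fn`, `tn`) are ours, not part of sklearn.

```python
# Count the four outcomes for each (truth, guess) pair.
labels = [1, 0, 0, 1, 1, 1, 0, 1, 1, 1]
guesses = [0, 1, 1, 1, 1, 0, 1, 0, 1, 0]

tp = sum(1 for t, g in zip(labels, guesses) if t == 1 and g == 1)  # true positives
fp = sum(1 for t, g in zip(labels, guesses) if t == 0 and g == 1)  # false positives
fn = sum(1 for t, g in zip(labels, guesses) if t == 1 and g == 0)  # false negatives
tn = sum(1 for t, g in zip(labels, guesses) if t == 0 and g == 0)  # true negatives

accuracy = (tp + tn) / len(labels)                   # 3/10 = 0.3
recall = tp / (tp + fn)                              # 3/7
precision = tp / (tp + fp)                           # 3/6 = 0.5
f1 = 2 * precision * recall / (precision + recall)   # 6/13

print(accuracy, recall, precision, f1)
```

The hand-computed values match sklearn's output, which is a useful sanity check when you are unsure which of `labels` and `guesses` each scorer treats as ground truth (it is the first argument).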