L5 Part 3: Classification Evaluation

Last updated 6:38 PM on 4/8/26
19 Terms

1

What is Confusion Matrix?

Summarizes the relationship between predicted and actual classes by displaying the predicted class in the columns and the true class in the rows

2

What is the True Positive (TP) of the Confusion Matrix?

Both predicted class and true class are positive

3

What is False Positive (FP) of the Confusion Matrix?

Predicted class is positive, but true class is negative

4

What is True Negative (TN) of the Confusion Matrix?

Both predicted class and true class are negative

5

What is False Negative (FN) of the Confusion Matrix?

Predicted class is negative, but true class is positive
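The four cells above can be counted directly from predictions. A minimal sketch in plain Python, using small hypothetical label lists (1 = positive, 0 = negative):

```python
# Hypothetical example data: actual classes and a model's predictions.
y_true = [1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives

print(tp, fp, tn, fn)  # → 3 1 3 1
```

Arranged as a matrix with true classes in the rows and predicted classes in the columns, these counts are the confusion matrix.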

6

What Does Training Error Rate Indicate?

What percentage of observations in the training set have been incorrectly classified

7

What is the Null Classifier?

A naive classifier that predicts the same outcome (typically the majority class) for every observation

8

What is Accuracy?

What percentage of observations have been correctly classified
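Both accuracy and the error rate fall out of the confusion-matrix cells. A short sketch with hypothetical cell counts:

```python
# Hypothetical confusion-matrix cell counts.
tp, fp, tn, fn = 3, 1, 3, 1
total = tp + fp + tn + fn

accuracy = (tp + tn) / total    # fraction classified correctly
error_rate = (fp + fn) / total  # fraction classified incorrectly

print(accuracy, error_rate)  # → 0.75 0.25
```

Note that the error rate is simply 1 minus the accuracy.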

9

What is Sensitivity?

Measures a model’s ability to correctly identify positive instances, calculated as the proportion of actual positives correctly identified: TP / (TP + FN)

10

What is Specificity?

Measures a classification model's ability to correctly identify negative instances (e.g., healthy patients, non-fraudulent transactions) out of all actual negatives: TN / (TN + FP)
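Sensitivity and specificity are the per-class recall rates for positives and negatives respectively. A minimal sketch using hypothetical confusion-matrix counts:

```python
# Hypothetical confusion-matrix cell counts.
tp, fp, tn, fn = 3, 1, 3, 1

sensitivity = tp / (tp + fn)  # true positive rate: recall on actual positives
specificity = tn / (tn + fp)  # true negative rate: recall on actual negatives

print(sensitivity, specificity)  # → 0.75 0.75
```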

11

What is the Receiver Operating Characteristics (ROC) Curve?

Shows the relationship between true positive rate and false positive rate for many thresholds
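Each point on an ROC curve comes from one classification threshold. A sketch, with hypothetical predicted probabilities, that sweeps thresholds and computes the (FPR, TPR) pair at each:

```python
# Hypothetical actual classes and predicted probabilities, sorted by score.
y_true = [1, 1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]

def roc_point(threshold):
    """Return the (FPR, TPR) point the given threshold produces."""
    pred = [1 if s >= threshold else 0 for s in scores]
    tp = sum(p and t for p, t in zip(pred, y_true))
    fp = sum(p and not t for p, t in zip(pred, y_true))
    fn = sum((not p) and t for p, t in zip(pred, y_true))
    tn = sum((not p) and (not t) for p, t in zip(pred, y_true))
    return fp / (fp + tn), tp / (tp + fn)

# A very low threshold predicts everything positive → (1, 1);
# a very high threshold predicts everything negative → (0, 0).
for t in (0.1, 0.5, 0.95):
    print(t, roc_point(t))
```

Plotting FPR on the x-axis against TPR on the y-axis over all thresholds traces out the ROC curve.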

12

What is the ideal ROC shape?

One that hugs the top-left corner, indicating a high true positive rate and a low false positive rate

13

What is the Area under the ROC Curve (ROC AUC)?

A popular way of summarizing the predictive ability of a model to estimate a binary variable

14

What does an ROC AUC of 1 indicate?

The model is a perfect fit

15

What Does an ROC AUC of 0.5 indicate?

The model has no predictive ability; it performs no better than random guessing

16

What Does an AUC ROC of <0.5 indicate?

The model has worse-than-random predictive ability; inverting its predictions would perform better than chance

17

What is the Precision Recall Trade-off?

Raising recall (the true positive rate) by lowering the classification threshold reduces false negatives but tends to introduce false positives, which lowers precision, and vice versa
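The trade-off can be seen by evaluating the same hypothetical scores at two thresholds: the lower threshold catches every positive (recall 1.0) but at the cost of precision.

```python
# Hypothetical actual classes and predicted probabilities.
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
scores = [0.95, 0.85, 0.75, 0.65, 0.55, 0.45, 0.35, 0.25]

def precision_recall(threshold):
    """Precision and recall when predicting positive above the threshold."""
    pred = [1 if s >= threshold else 0 for s in scores]
    tp = sum(p and t for p, t in zip(pred, y_true))
    fp = sum(p and not t for p, t in zip(pred, y_true))
    fn = sum((not p) and t for p, t in zip(pred, y_true))
    return tp / (tp + fp), tp / (tp + fn)

for t in (0.8, 0.3):
    p, r = precision_recall(t)
    print(t, round(p, 2), round(r, 2))
# → 0.8 1.0 0.5   (strict threshold: high precision, low recall)
# → 0.3 0.57 1.0  (lenient threshold: high recall, lower precision)
```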

18

What is an F1 Score?

The harmonic mean of precision and recall

19

Why does a high F1 score require both recall and precision to be high?

The harmonic mean assigns more weight to the lower of the two values, so a single low value drags the score down; both precision and recall must be high to achieve a high F1
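A quick sketch of the F1 formula showing how one low input dominates the harmonic mean (the input values here are arbitrary illustrations):

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall: 2PR / (P + R)."""
    return 2 * precision * recall / (precision + recall)

print(f1(0.9, 0.9))  # both high → F1 ≈ 0.9
print(f1(0.9, 0.1))  # one low  → F1 ≈ 0.18, far below the arithmetic mean of 0.5
```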