What is a Confusion Matrix?
A table summarizing the relationship between predicted and actual classes, displaying the predicted class in the columns and the true class in the rows
What is the True Positive (TP) of the Confusion Matrix?
Both predicted class and true class are positive
What is the False Positive (FP) of the Confusion Matrix?
Predicted class is positive, but true class is negative
What is the True Negative (TN) of the Confusion Matrix?
Both predicted class and true class are negative
What is the False Negative (FN) of the Confusion Matrix?
Predicted class is negative, but true class is positive
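The four cells defined above can be tallied directly from paired true labels and predictions. A minimal pure-Python sketch, using made-up labels where 1 is the positive class:

```python
# Illustrative data: 1 = positive class, 0 = negative class.
y_true = [1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]

# Count each confusion-matrix cell by comparing true and predicted labels.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # both positive
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # predicted pos, truly neg
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # both negative
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # predicted neg, truly pos

print(tp, fp, tn, fn)  # → 3 1 3 1
```

The four counts always sum to the number of observations, which is a quick sanity check on any confusion-matrix code.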
What does the Training Error Rate indicate?
What percentage of observations in the training set have been incorrectly classified
What is the Null Classifier?
A naive classifier that predicts the same class (typically the majority class) for every observation
What is Accuracy?
What percentage of observations in the training set have been correctly classified
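Accuracy and error rate are complements, both computable from the confusion-matrix counts. A short sketch with illustrative counts:

```python
# Illustrative confusion-matrix counts.
tp, fp, tn, fn = 3, 1, 3, 1
total = tp + fp + tn + fn

accuracy = (tp + tn) / total    # fraction classified correctly
error_rate = (fp + fn) / total  # fraction classified incorrectly

# The two always sum to 1.
print(accuracy, error_rate)  # → 0.75 0.25
```

Note that accuracy alone can be misleading on imbalanced data: a null classifier that always predicts the majority class can score high accuracy while having zero sensitivity.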
What is Sensitivity?
Measures a model’s ability to correctly identify positive instances, calculated as the proportion of actual positives correctly identified
What is Specificity?
Measures a classification model's ability to correctly identify negative instances (e.g., healthy patients, non-fraudulent transactions) out of all actual negatives
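Sensitivity and specificity follow directly from the confusion-matrix counts: each conditions on the *actual* class rather than the predicted one. A minimal sketch with illustrative counts:

```python
# Illustrative confusion-matrix counts.
tp, fp, tn, fn = 3, 1, 3, 1

sensitivity = tp / (tp + fn)  # true positive rate: share of actual positives found
specificity = tn / (tn + fp)  # true negative rate: share of actual negatives found

print(sensitivity, specificity)  # → 0.75 0.75
```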
What is the Receiver Operating Characteristics (ROC) Curve?
Shows the relationship between true positive rate and false positive rate for many thresholds
What is the ideal ROC shape?
One that hugs the top-left corner, indicating a high true positive rate and a low false positive rate
What is the Area under the ROC Curve (ROC AUC)?
A popular single-number summary of a model's ability to predict a binary variable, measured across all thresholds
What does an ROC AUC of 1 indicate?
The model is a perfect fit
What does an ROC AUC of 0.5 indicate?
The model has no predictive ability; it performs no better than random guessing
What does an ROC AUC of <0.5 indicate?
The model has negative predictive ability: its predictions are systematically inverted, so it performs worse than random guessing
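The ROC curve and its AUC can be traced by sweeping a decision threshold over predicted scores and integrating with the trapezoidal rule. A pure-Python sketch with made-up scores (not from any real model):

```python
# Illustrative labels and predicted scores (higher score = more positive).
y_true = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]

pos = sum(y_true)
neg = len(y_true) - pos

# Sweep thresholds from high to low; predict positive when score >= threshold.
points = [(0.0, 0.0)]  # start at the origin (nothing predicted positive)
for thr in sorted(set(scores), reverse=True):
    tp = sum(1 for t, s in zip(y_true, scores) if t == 1 and s >= thr)
    fp = sum(1 for t, s in zip(y_true, scores) if t == 0 and s >= thr)
    points.append((fp / neg, tp / pos))  # (FPR, TPR) at this threshold
points.append((1.0, 1.0))  # end with everything predicted positive

# Trapezoidal area under the (FPR, TPR) curve.
auc = sum((x2 - x1) * (y1 + y2) / 2
          for (x1, y1), (x2, y2) in zip(points, points[1:]))
print(auc)  # → 0.75
```

A model that ranks every positive above every negative would trace through (0, 1) and score an AUC of 1; reversing the scores of this model would push the curve below the diagonal and the AUC below 0.5.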
What is the Precision-Recall Trade-off?
Raising recall (e.g., by lowering the decision threshold to catch more positives) tends to admit more false positives and thus lower precision, and vice versa
What is an F1 Score?
The harmonic mean of precision and recall
Why does a high F1 score require both recall and precision to be high?
The harmonic mean weights the lower of the two values more heavily, so only high precision and high recall together can produce a high F1 score
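The pull of the harmonic mean toward the smaller value is easy to see numerically. A sketch with illustrative counts:

```python
# Illustrative confusion-matrix counts.
tp, fp, fn = 3, 1, 1

precision = tp / (tp + fp)  # share of positive predictions that are correct
recall = tp / (tp + fn)     # share of actual positives that are found
f1 = 2 * precision * recall / (precision + recall)
print(f1)  # → 0.75

# The harmonic mean is dragged toward the smaller value: precision 1.0
# with recall 0.1 gives F1 ≈ 0.18, far below the 0.55 an arithmetic
# mean would report.
p, r = 1.0, 0.1
f1_lopsided = 2 * p * r / (p + r)
print(round(f1_lopsided, 3))  # → 0.182
```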