Bias
Error from overly simplistic modeling assumptions; the model is systematically wrong.
Variance
Error from sensitivity to training-data fluctuations; model too complex.
Overfitting
Model learns training noise; poor generalization.
Underfitting
Model too simple to capture true patterns; poor training and test performance.
Bias-Variance Tradeoff
The tension between model simplicity (high bias) and flexibility (high variance).
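The tradeoff can be made concrete by repeatedly drawing training sets and comparing a high-bias model (predict the mean of y, ignoring x) against a high-variance one (1-nearest neighbour). A minimal sketch in plain Python; the data-generating function, sample sizes, and test point are all illustrative assumptions:

```python
import random

random.seed(0)

def true_f(x):
    return 2.0 * x  # assumed underlying relationship

def sample_dataset(n=20, noise=1.0):
    xs = [random.uniform(0, 1) for _ in range(n)]
    ys = [true_f(x) + random.gauss(0, noise) for x in xs]
    return xs, ys

def fit_mean(xs, ys):
    # High bias, low variance: ignores x entirely
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_1nn(xs, ys):
    # Low bias, high variance: memorizes the training data
    pairs = list(zip(xs, ys))
    return lambda x: min(pairs, key=lambda p: abs(p[0] - x))[1]

x0 = 0.9  # fixed test point
preds_mean, preds_1nn = [], []
for _ in range(200):  # many training sets from the same distribution
    xs, ys = sample_dataset()
    preds_mean.append(fit_mean(xs, ys)(x0))
    preds_1nn.append(fit_1nn(xs, ys)(x0))

def avg(v):
    return sum(v) / len(v)

def var(v):
    m = avg(v)
    return sum((x - m) ** 2 for x in v) / len(v)

bias2_mean = (avg(preds_mean) - true_f(x0)) ** 2  # large: mean model misses the trend
bias2_1nn = (avg(preds_1nn) - true_f(x0)) ** 2    # small
var_mean = var(preds_mean)  # small: averaging is stable
var_1nn = var(preds_1nn)    # large: 1-NN tracks the noise
```

Across resampled training sets, the mean model is stably wrong (high bias, low variance) while 1-NN is right on average but erratic (low bias, high variance).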
Regularization
Technique to penalize model complexity to reduce overfitting (e.g., L1/L2).
Cross-Validation
Technique to estimate model generalization by splitting data into multiple train/test folds.
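The fold-splitting idea can be sketched in a few lines of plain Python (a hand-rolled illustration, not a library API; the sample count and fold count are arbitrary):

```python
import random

def k_fold_splits(n, k, seed=0):
    """Yield (train_indices, test_indices) for k-fold cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]  # k roughly equal folds
    for i in range(k):
        test = folds[i]
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, test

# Every sample lands in exactly one test fold, never in both sides at once
for train, test in k_fold_splits(10, 5):
    assert not set(train) & set(test)
```

Averaging the model's error over the k held-out folds gives the generalization estimate; each sample is tested exactly once.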
Irreducible Error
Noise inherent in the data that no model can remove.
Model Complexity
The degree of flexibility in a model (e.g., depth of a tree, number of parameters).
Learning Curve
A plot of training/validation error vs. training set size; used to diagnose bias/variance.
Validation Set
Data held out from training to evaluate hyperparameter choices.
L1 Regularization (Lasso)
Adds the absolute value of the weights to the loss; promotes sparsity.
L2 Regularization (Ridge)
Adds the squared weights to the loss; shrinks weights toward zero.
Dropout
Randomly zeroing neurons during neural network training to reduce overfitting.
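The mechanism can be sketched as an inverted-dropout function on a single layer's activations (the drop probability and layer size below are arbitrary; real frameworks apply this per layer inside the training loop):

```python
import random

def dropout(activations, p, training, seed=None):
    """Inverted dropout: zero each unit with probability p during training,
    scaling survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0.0:
        return list(activations)  # identity at inference time
    rng = random.Random(seed)
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

acts = [1.0] * 1000
out = dropout(acts, p=0.5, training=True, seed=0)
# Roughly half the units are zeroed; the layer's mean stays near 1.0
```

The 1/(1-p) rescaling at train time is what lets the network run unchanged at inference, with no mask and no scaling.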