Linear Regression v2 (Notes)


23 Terms

1
Linear Regression
A method to model the relationship between a dependent variable (y) and one or more independent variables (x1, x2,…, xn).
New cards
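A minimal sketch of fitting such a model with NumPy's least-squares solver (the data and true coefficient values are hypothetical):

```python
import numpy as np

# Hypothetical data: y depends exactly linearly on two features.
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
y = 1.0 + 2.0 * X[:, 0] + 3.0 * X[:, 1]   # true model: y = 1 + 2*x1 + 3*x2

# Prepend a column of ones so the intercept is estimated alongside the slopes.
X1 = np.column_stack([np.ones(len(X)), X])

# Ordinary least squares via NumPy's least-squares solver.
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
print(beta)  # ≈ [1. 2. 3.]
```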
2
Dependent Variable
The outcome variable that is being predicted or explained in a regression model, typically denoted as y.
3
Independent Variable
The input features used to predict the dependent variable in a regression model, typically denoted as x1, x2,…, xn.
4
Multiple Linear Regression
An extension of linear regression that uses multiple features to predict the dependent variable.
5
Mean Squared Error (MSE)
A loss function that measures how well a regression model predicts target values, defined as the average of the squared differences between actual and predicted values.
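As a quick sketch in plain Python (the sample values are made up):

```python
def mse(y_true, y_pred):
    # Average of squared differences between actual and predicted values.
    return sum((a - p) ** 2 for a, p in zip(y_true, y_pred)) / len(y_true)

print(mse([3.0, 5.0, 7.0], [2.0, 5.0, 9.0]))  # (1 + 0 + 4) / 3 ≈ 1.667
```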
6
Residuals
The differences between the observed values and the values predicted by a regression model.
7
Gradient Descent
An optimization algorithm used to minimize a loss function by iteratively adjusting model parameters.
8
Objective Function
The function to be minimized (or maximized) in an optimization problem; in regression, it typically refers to the sum of squared errors being minimized.
9
Ridge Regression
A linear regression technique that adds an L2 penalty to the loss function to prevent overfitting by shrinking coefficients.
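A sketch of the closed-form ridge solution, beta = (XᵀX + λI)⁻¹Xᵀy, on toy data (in practice the intercept is usually left unpenalized; the values here are illustrative):

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: beta = (X^T X + lam * I)^(-1) X^T y.
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# Toy data: as lam grows, the coefficients shrink toward zero.
X = np.array([[1.0, 1.0], [1.0, 2.0], [2.0, 2.0], [2.0, 3.0]])
y = np.array([6.0, 8.0, 9.0, 11.0])
print(ridge_fit(X, y, 0.0))    # ordinary least squares (no shrinkage)
print(ridge_fit(X, y, 10.0))   # shrunken coefficients
```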
10
Lasso Regression
A linear regression technique that adds an L1 penalty to the loss function, allowing for automatic feature selection by shrinking some coefficients to zero.
11
Elastic Net
A linear regression technique that combines both L1 and L2 penalties, useful for datasets with correlated predictors.
12
Regularization Parameter (λ)
Controls the strength of the penalty added to the loss function in Ridge and Lasso regressions.
13
Intercept (β0)
The expected value of the dependent variable when all independent variables are zero, represented as β0.
14
Coefficients (β1, β2, ... , βn)
The values that represent the influence of each independent variable on the dependent variable in a regression model.
15
Cross Validation
A technique for estimating the skill of machine learning models by dividing the dataset into training and test sets multiple times. It helps ensure the model generalizes well and avoids overfitting.
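A minimal sketch of building k-fold splits by hand (the fold count and dataset size are illustrative; libraries such as scikit-learn provide ready-made versions):

```python
def kfold_indices(n, k):
    # Split indices 0..n-1 into k contiguous, near-equal folds.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

# Each fold serves once as the test set; the remaining indices form the training set.
for test_fold in kfold_indices(10, 3):
    train = [i for i in range(10) if i not in test_fold]
    print("test:", test_fold, "train:", train)
```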
16
Feature Selection
The process of selecting a subset of relevant features for a regression model to improve its performance and interpretability.
17
Optimization Algorithm
A method or process used to minimize or maximize an objective function by adjusting parameters.
18
Learning Rate (α)
A hyperparameter that controls how much to change the model in response to the estimated error each time the model weights are updated.
19
Step-by-Step Workflow for Gradient Descent
Initialize the coefficients, compute predictions, calculate the loss, compute the gradients, and update the coefficients; repeat until convergence.
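The steps above can be sketched for linear regression with MSE loss (the data, learning rate, and iteration count are illustrative):

```python
import numpy as np

def gradient_descent(X, y, lr=0.05, n_iters=2000):
    n, d = X.shape
    beta = np.zeros(d)                     # 1. initialize coefficients
    for _ in range(n_iters):
        preds = X @ beta                   # 2. compute predictions
        error = preds - y                  # 3. loss is mean(error**2)
        grad = 2.0 / n * X.T @ error       # 4. gradient of MSE w.r.t. beta
        beta -= lr * grad                  # 5. update coefficients
    return beta

X = np.column_stack([np.ones(4), [1.0, 2.0, 3.0, 4.0]])  # intercept + one feature
y = np.array([3.0, 5.0, 7.0, 9.0])                        # true model: y = 1 + 2*x
print(gradient_descent(X, y))  # ≈ [1. 2.]
```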
20
Sum of Squared Errors (SSE)
A measure of the total deviation of the predicted values from the actual values, which is minimized in regression.
21
Kink at Zero
An important feature of the L1 penalty in Lasso Regression that can lead to feature selection by forcing some coefficients to be exactly zero.
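The kink can be seen in the soft-thresholding operator, which solves the one-dimensional Lasso subproblem used in coordinate-descent solvers; a minimal sketch (the input values are illustrative):

```python
def soft_threshold(z, lam):
    # Shrinks z toward zero by lam, and sets it exactly to zero
    # when |z| <= lam -- this is the "kink at zero" of the L1 penalty.
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

print(soft_threshold(3.0, 1.0))   # 2.0
print(soft_threshold(0.5, 1.0))   # 0.0 (coefficient eliminated)
print(soft_threshold(-2.5, 1.0))  # -1.5
```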
22
Impact of Alpha (α)
Refers to how changes in the regularization parameter affect the strength of penalization and model performance.
23
Sharp Corner in Lasso Regression
The feature of the L1 penalty that enables it to reduce some coefficients to zero, effectively performing feature selection.