Simple Linear Regression
A statistical technique used to model the relationship between a response variable and a single explanatory variable.
Linear Relationship
An assumption of simple linear regression that indicates the relationship can be represented by a straight line.
Regression Equation
The fitted line is expressed as ŷ = b0 + b1x, where ŷ is the predicted response, x is the explanatory variable, b0 is the estimated intercept, and b1 is the estimated slope; the underlying population model is y = β0 + β1x + ε.
Error Term (ε)
Accounts for deviations of observed values from the regression line, assumed to be normally distributed with a mean of zero.
Assumptions of Simple Linear Regression
Linearity, Independence, Normality, and Equal Variance (Homoscedasticity).
Ordinary Least Squares (OLS)
The method used to determine the regression line that minimizes the sum of squared vertical distances (residuals) between observed values and the values predicted by the line.
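A minimal R sketch of the closed-form OLS estimates (the x and y vectors here are made-up illustration data, not from the source):

```r
# Hypothetical data
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)

# Slope: b1 = Sxy / Sxx; intercept: b0 = ybar - b1 * xbar
b1 <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
b0 <- mean(y) - b1 * mean(x)
```

These match the coefficients `lm(y ~ x)` would return, since `lm()` minimizes the same sum of squares.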
Coefficient of Determination (R²)
Measures the proportion of variance in the response variable explained by the model, ranging from 0 to 1.
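A short R sketch of computing R² from its definition, 1 − SSE/SST (made-up data, hypothetical example):

```r
# Hypothetical data
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)
fit <- lm(y ~ x)

sse <- sum(residuals(fit)^2)    # error sum of squares
sst <- sum((y - mean(y))^2)     # total sum of squares
r2  <- 1 - sse / sst            # proportion of variance explained
```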
Residuals
The differences between the observed values and predicted values.
ANOVA Table
A table that organizes the decomposition of variance into regression and error components.
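In R, `anova()` on a fitted model prints this decomposition (made-up data for illustration):

```r
# Hypothetical data
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)
fit <- lm(y ~ x)

# One row for the regression term (x) and one for error (Residuals),
# each with its sum of squares, mean square, and the F statistic.
anova(fit)
```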
Gauss-Markov Theorem
States that, when the errors have mean zero, equal variance, and are uncorrelated, the OLS estimators are the best linear unbiased estimators (BLUE) of the parameters of the simple linear regression model.
Hypothesis Test for Slope
A t-test is used to determine if the slope of the regression line is significantly different from zero.
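In R, the slope's t statistic and p-value appear in the coefficient table of `summary()` (made-up data for illustration):

```r
# Hypothetical data
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)
fit <- lm(y ~ x)

# Row "x": slope estimate, std. error, t value, and p-value
# for H0: slope = 0 versus H1: slope != 0
summary(fit)$coefficients["x", ]
```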
Confidence Intervals
Ranges of plausible values for the slope and intercept parameters, constructed from the sampling distributions of their estimators.
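In R, `confint()` returns these intervals for a fitted model (made-up data for illustration):

```r
# Hypothetical data
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)
fit <- lm(y ~ x)

# 95% confidence intervals for the intercept and slope
confint(fit, level = 0.95)
```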
lm() Function in R
The base-R function for fitting linear models by ordinary least squares, using formula syntax such as lm(y ~ x).
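A minimal usage sketch (the data vectors are made up for illustration):

```r
# Hypothetical data
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)

fit <- lm(y ~ x)   # response ~ explanatory variable
coef(fit)          # intercept b0 and slope b1
summary(fit)       # coefficients, std. errors, t-tests, R^2
```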
Limitations of Simple Linear Regression
Assumes a linear relationship, is sensitive to outliers, can include only one explanatory variable, and treats that variable as fixed (measured without error).
Predicted Value (ŷ)
The estimated value of the response variable based on the regression line.
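In R, `predict()` returns ŷ for new values of the explanatory variable (made-up data; the new point x = 6 is hypothetical):

```r
# Hypothetical data
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)
fit <- lm(y ~ x)

# Predicted value of y at a new x, i.e. yhat = b0 + b1 * 6
yhat <- predict(fit, newdata = data.frame(x = 6))
```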