SST Formula
SSE + SSR = SST
What is SSR
sum of squares due to regression (the part of the variation that IS explained by the model)
What is SSE
sum of squared errors (the part of the variation that is NOT explained by the model)
R squared
coefficient of determination
R squared formula
SSR/SST
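A minimal numeric sketch (plain NumPy, made-up data) of the decomposition above: fit a least-squares line, then check that SSE + SSR = SST and that R squared = SSR/SST.

```python
import numpy as np

# Made-up data: y is roughly linear in x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Ordinary least-squares fit of y = b0 + b1*x
b1, b0 = np.polyfit(x, y, 1)          # polyfit returns [slope, intercept]
y_hat = b0 + b1 * x

sse = np.sum((y - y_hat) ** 2)        # unexplained variation
ssr = np.sum((y_hat - y.mean()) ** 2) # explained variation
sst = np.sum((y - y.mean()) ** 2)     # total variation

print(sse + ssr, sst)                 # the two agree (up to rounding)
print(ssr / sst)                      # R squared
```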
standard error formula
standard deviation / √n
Margin of Error Formula
critical value for the chosen confidence level (zα/2) × standard error
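A small sketch of these two formulas, assuming a made-up sample and a 95% confidence level (so the critical value is z₀.₀₂₅ ≈ 1.96):

```python
import numpy as np
from scipy import stats

sample = np.array([12.1, 9.8, 11.4, 10.6, 12.9, 9.5, 11.1, 10.2])  # made-up data
n = len(sample)

std_error = sample.std(ddof=1) / np.sqrt(n)   # standard deviation / sqrt(n)
z = stats.norm.ppf(0.975)                     # critical value z_{alpha/2} for 95% confidence
margin_of_error = z * std_error               # critical value * standard error

print(std_error, margin_of_error)
```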
dummy variable
A variable used to convey qualitative information in a regression model
The regression sum of squares (SSR)
The explained part of the variation in the dependent variable
Explained Variation formula
total variation - unexplained variation
r squared formula
(total variation - Unexplained Variation) / Total Variation
r squared (coefficient of determination)
The percentage of variation in the dependent variable that is explained by the independent variable
what does margin of error account for
Error due to random sampling chance only, not any specific source of bias
standard error
the variability of the observed y-values around the predicted y-values (the regression line)
What can happen to R squared when adding a variable
adding a variable cannot decrease the R squared
What happens when r squared = 1
SSR = SST (and SSE = 0): the model explains all of the variation
only valid dummy variables
0 and 1
The coefficient of a variable after we run a regression
our estimate of the average impact of that independent variable on the dependent variable, with all other variables held constant
SSE can never be
larger than SST (SSE is a part of SST)
Rsquared equation
(SST − SSE) / SST, which is the same as SSR/SST
The error term
the part of the regression equation that accounts for all variables not included in the model
What is true about a multiple regression analysis
you can only have one dependent variable
The Y intercept
The value of Y when all the X's are Zero
The residual
the difference between an observed value and its estimated (predicted) value
Why do we square the error?
positive and negative errors could cancel each other out
SSR formula
Σ(Ŷᵢ − Ȳ)²
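For reference, the three sums of squares written out in full (Ŷᵢ = predicted value, Ȳ = mean of Y):

```latex
\begin{aligned}
\text{SST} &= \sum_i (Y_i - \bar{Y})^2 \\
\text{SSR} &= \sum_i (\hat{Y}_i - \bar{Y})^2 \\
\text{SSE} &= \sum_i (Y_i - \hat{Y}_i)^2 \\
\text{SST} &= \text{SSR} + \text{SSE}
\end{aligned}
```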
What do we do with adjusted R squared?
We make determinations between models using adjusted R squared (pick the model with higher adjusted R squared)
What do we do with the sample size if we want to cut our margin of error to 1/3 of what it is?
Get a sample size 9 times as large
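The 9-times answer comes straight from the margin-of-error formula, since MOE is proportional to 1/√n:

```latex
\text{MOE} = z_{\alpha/2}\,\frac{s}{\sqrt{n}}
\qquad\Longrightarrow\qquad
\frac{1}{3}\,\text{MOE} = z_{\alpha/2}\,\frac{s}{\sqrt{9n}}
```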
What do we do with a confidence interval
We get a range of values within which we believe the population mean lies, at a given confidence level
Standard formula for a confidence interval
point estimate ± margin of error
What happens to margin of error when the level of confidence decreases?
The MOE becomes smaller (the critical value zα/2 in the formula shrinks)
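A short sketch (same kind of made-up sample) comparing 95% and 90% intervals; the lower confidence level has a smaller zα/2, so the margin of error and the interval shrink:

```python
import numpy as np
from scipy import stats

sample = np.array([12.1, 9.8, 11.4, 10.6, 12.9, 9.5, 11.1, 10.2])  # made-up data
se = sample.std(ddof=1) / np.sqrt(len(sample))

for conf in (0.95, 0.90):
    z = stats.norm.ppf(1 - (1 - conf) / 2)    # z_{alpha/2} shrinks as confidence drops
    moe = z * se
    print(conf, sample.mean() - moe, sample.mean() + moe)  # narrower interval at 90%
```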
Multiple regression equation
The mathematical equation that explains how the dependent variable Y is related to several independent variables X1, X2, …, Xk and an error term ε
In the model Y = β0 + β1 X + β2 D + β3 X*D + ε, the interaction variable causes
(The interaction variable is X*D; D by itself is the dummy)
a change in just the slope
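A quick simulation with made-up coefficients illustrating the point: the dummy D shifts the intercept, while the interaction term X*D changes the slope.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
d = rng.integers(0, 2, n)                       # dummy variable: 0 or 1
# Made-up true model: D shifts the intercept by 4, X*D shifts the slope by 1.5
y = 2 + 3 * x + 4 * d + 1.5 * x * d + rng.normal(0, 1, n)

# Design matrix for Y = b0 + b1*X + b2*D + b3*(X*D) + error
X = np.column_stack([np.ones(n), x, d, x * d])
b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]

print("group D=0: intercept", b0,      "slope", b1)
print("group D=1: intercept", b0 + b2, "slope", b1 + b3)  # interaction changes the slope
```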
How is adjusted R squared different from regular R squared?
Unlike the regular R², the adjusted R² will decrease if an additional independent variable doesn't explain enough additional variation in the dependent variable
(ONLY adjusted R² can decrease when more variables are added)
standard deviation
is the square root of the variance
When deciding between two models with a different number of independent variables
pick the one with the higher adjusted R squared (regular R squared can only go up when variables are added, so it always favors the bigger model)
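A sketch with made-up data contrasting the two measures: adding a pure-noise regressor can only raise (never lower) R squared, but adjusted R squared penalizes it.

```python
import numpy as np

def r2_and_adjusted_r2(X, y):
    """R^2 and adjusted R^2 for an OLS fit (X already contains the intercept column)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    sse = np.sum((y - X @ beta) ** 2)
    sst = np.sum((y - y.mean()) ** 2)
    n, k = X.shape[0], X.shape[1] - 1           # k = number of independent variables
    r2 = 1 - sse / sst
    adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)
    return r2, adj

rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
noise = rng.normal(size=n)                      # unrelated to y
y = 1 + 2 * x1 + rng.normal(size=n)

X_small = np.column_stack([np.ones(n), x1])
X_big = np.column_stack([np.ones(n), x1, noise])

print(r2_and_adjusted_r2(X_small, y))
# R^2 never decreases; adjusted R^2 typically falls (it drops whenever the added
# variable's t-statistic is below 1 in magnitude)
print(r2_and_adjusted_r2(X_big, y))
```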
What happens when you increase the sample size?
- Decrease the standard error (SE), i.e., the variability
- Increase the precision around the true mean
What becomes more likely when you increase the sample size?
You increase the likelihood that a sample mean will fall within some given distance of the true mean
What is the advantage of minimizing the sum of the squared errors rather than just the sum of the errors?
Larger errors are given more weight than smaller ones
(squaring the errors puts a larger relative weight on the larger errors instead of averaging the absolute value of errors)
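A two-number example of the cancellation and weighting points:

```latex
(+3) + (-3) = 0 \quad\text{but}\quad (+3)^2 + (-3)^2 = 18;
\qquad
1^2 + 5^2 = 26 \;>\; 3^2 + 3^2 = 18
```

So errors cannot cancel once squared, and the uneven split (1 and 5) is penalized more than the even split (3 and 3) even though both pairs of errors total 6.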
What is margin of error?
Variation in the random sample due to chance