Multiple Linear Regression

30 Terms

1

Simple linear regression

Has only one x and one y variable

2

What's the difference between multiple and simple linear regression?

The difference is the number of x variables we include

3

Multiple linear regression

Has one y and two or more x variables

4

5

6

Example of simple linear regression

We predict rent based on square feet alone

7

Example of multiple linear regression

Predict rent based on square feet and age of the building
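A minimal sketch of the two rent examples above, assuming Python with statsmodels; the rent, square-footage, and age numbers are made up for illustration, since the deck gives no dataset.

```python
# Simple:   rent = b0 + b1*sqft + error
# Multiple: rent = b0 + b1*sqft + b2*age + error
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
sqft = rng.uniform(400, 1500, size=n)              # square feet
age = rng.uniform(0, 50, size=n)                   # building age in years
rent = 500 + 1.2 * sqft - 8 * age + rng.normal(scale=100, size=n)

simple = sm.OLS(rent, sm.add_constant(sqft)).fit()                            # one x
multiple = sm.OLS(rent, sm.add_constant(np.column_stack([sqft, age]))).fit()  # two x's

print(simple.params)     # intercept and sqft coefficient
print(multiple.params)   # intercept, sqft, and age coefficients
```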

8

If more than one independent variable is to be used in the regression model, linear regression can be extended to

Multiple regression, which accommodates several independent variables for prediction

9

What happens when we add more variables?

  • Adds more predictive power to the model

  • R and R2 will either stay the same or improve

  • Residuals are generally closer to 0 (see the sketch below)
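A quick sketch of these three points with made-up data (variable names and numbers are illustrative, not from the slides): adding a second predictor leaves R2 the same or higher and shrinks the residuals.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 3 + 2 * x1 + 1.5 * x2 + rng.normal(scale=0.5, size=n)

m1 = sm.OLS(y, sm.add_constant(x1)).fit()                         # one predictor
m2 = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()  # two predictors

print(m1.rsquared, m2.rsquared)   # R2 stays the same or improves
print(m1.ssr, m2.ssr)             # sum of squared residuals shrinks toward 0
```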

10

Adding more independent variables to a model tends to

Improve prediction

11

Residuals are closer to 0 because we minimize

epsilon (ε), the random error term.
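A small illustration of what "we minimize epsilon" means, with made-up data: the ordinary least squares fit gives a smaller sum of squared residuals than any other choice of coefficients.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=50)
y = 4 + 3 * x + rng.normal(scale=2, size=50)

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)      # least-squares estimate

sse_ols = np.sum((y - X @ beta) ** 2)
sse_other = np.sum((y - X @ (beta + [0.5, 0.1])) ** 2)  # any other coefficients
print(sse_ols < sse_other)                        # True: OLS minimizes the squared error
```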

12

Categorical variable

Type (a variable whose values are category labels rather than numbers)

13

Impact of carat weight on diamond price (regression results)

We expect the price to go up by 5333.86 on average for each additional carat, holding the other variables constant

14

Multiple regression analysis additional statement (IMPORTANT)

Controlling for the effect of other variables; keeping other variables constant; ceteris paribus (all of these mean the incremental impact of that specific variable)

15

Estimation power of ANOVA test (on slides)

Reject the null hypothesis and conclude that at least one of the model's variables is significant (significance is less than .05) (for the price of diamonds)

16

Is it appropriate to interpret? (analyzing each coefficient)

It is not appropriate to interpret a coefficient if its p-value is greater than .05 and 0 is within its confidence interval
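A hedged sketch of both checks (the overall F-test from card 15 and the per-coefficient check from card 16) using statsmodels on made-up data; the diamond results on the slides are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 150
carat = rng.uniform(0.3, 2.0, size=n)
useless = rng.normal(size=n)                     # a predictor unrelated to price
price = -2000 + 5300 * carat + rng.normal(scale=800, size=n)

model = sm.OLS(price, sm.add_constant(np.column_stack([carat, useless]))).fit()

# Overall (ANOVA-style) F-test: p < .05 means at least one variable is significant.
print(model.f_pvalue)

# Per-coefficient check: interpret a coefficient only if its p-value is below .05
# and its confidence interval does not contain 0.
print(model.pvalues)
print(model.conf_int(alpha=0.05))
```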

17

Adjusted R-Squared is a modified version of R-Squared that

has been adjusted for the number of predictors in the model
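For reference, the standard adjusted R-Squared formula (not stated on the cards), where n is the number of observations and p the number of predictors:

```python
def adjusted_r2(r2: float, n: int, p: int) -> float:
    """Adjusted R-Squared: 1 - (1 - R2) * (n - 1) / (n - p - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

print(adjusted_r2(r2=0.80, n=100, p=3))   # ~0.794: small penalty for 3 predictors
print(adjusted_r2(r2=0.80, n=100, p=30))  # ~0.713: larger penalty for 30 predictors
```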

18

For multiple regression analysis we use

Adjusted R-Squared

19

R-Squared indicates how well

actual data points fit a line; adjusted R-Squared additionally adjusts for the number of terms in the model

20

The adjusted R-Squared value increases only when the new term

improves the model fit more than would be expected by chance

21

The adjusted R-Squared value decreases when a

predictor improves the model by less than would be expected by chance

22

The R-Squared value always increases (or at least never decreases) when

the number of x variables increases

23

R-Squared values will still increase if you add a useless

variable to the model

24

Adjusted R-Squared value will

never increase if you add a useless X-Variable to the model
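A sketch of the last three cards with made-up data: a pure-noise variable nudges R-Squared up but pulls adjusted R-Squared down.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 80
x = rng.normal(size=n)
noise_var = rng.normal(size=n)                   # useless: unrelated to y
y = 1 + 2 * x + rng.normal(size=n)

base = sm.OLS(y, sm.add_constant(x)).fit()
bigger = sm.OLS(y, sm.add_constant(np.column_stack([x, noise_var]))).fit()

print(base.rsquared, bigger.rsquared)            # R-Squared never goes down
print(base.rsquared_adj, bigger.rsquared_adj)    # adjusted R-Squared typically drops
```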

25

Adjusted R-Squared is mainly used for

model selection; the R-Squared value is not used for this purpose

26

27

Multicollinearity is a statistical phenomenon that occurs when

Two or more predictor variables in a regression model are highly correlated

28

Consequences of Multicollinearity

makes it challenging to determine the individual effect of each variable on the dependent variable

29

Multicollinearity may make it appear that some variables are

not significant when they might be individually significant, and vice versa

30

If multicollinearity is present in a regression model, it does not necessarily invalidate the model, but it can

affect the precision of coefficient estimates and may lead to unstable and unpredictable coefficient values
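One common diagnostic for this (not mentioned on the cards) is the variance inflation factor (VIF); values above roughly 5-10 are often treated as a warning sign. The sketch below uses statsmodels and made-up data where x2 is deliberately an almost exact copy of x1.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(5)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)         # highly correlated with x1
x3 = rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2, x3]))
vifs = [variance_inflation_factor(X, i) for i in range(1, X.shape[1])]
print(vifs)   # x1 and x2 show very large VIFs; x3 stays near 1
```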