List the benefits of omitting a RHS variable when trying to address multicollinearity
By omitting a RHS variable, we may reduce multicollinearity and lower the standard errors of the remaining estimates
List a cost of omitting a RHS variable when trying to address multicollinearity
It can bias the remaining coefficients (omitted-variable bias) and lead to a misspecified model
Signs of multicollinearity prior to running an auxiliary regression
High R² (and a significant overall F-statistic)
Yet individual variables are statistically insignificant, or some are significant while others are not
ê_i formula?
ê_i = y_i − ŷ_i
ŷ formula
ŷ_i = b₁ + b₂x_2i + … + b_k x_ki
(if there are more than two RHS variables, extend it with one term per variable)
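A minimal sketch of the two formulas above, using hypothetical data and plain NumPy: fit OLS to get the b's, form ŷ_i = b₁ + b₂x_2i + b₃x_3i, then the residuals ê_i = y_i − ŷ_i.

```python
import numpy as np

# Hypothetical data: y explained by two regressors x2, x3 plus a constant.
rng = np.random.default_rng(0)
n = 50
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
y = 1.0 + 2.0 * x2 - 0.5 * x3 + rng.normal(scale=0.3, size=n)

# Design matrix with a column of ones for the intercept b1.
X = np.column_stack([np.ones(n), x2, x3])

# OLS coefficient estimates b = (X'X)^(-1) X'y
b, *_ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ b       # fitted values: y_hat_i = b1 + b2*x2_i + b3*x3_i
e_hat = y - y_hat   # residuals:     e_hat_i = y_i - y_hat_i

print(b)  # estimates should be near the true (1.0, 2.0, -0.5)
```

A useful sanity check on the residuals: by construction of OLS, `X.T @ e_hat` is (numerically) zero.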
After an auxiliary regression is run (one RHS variable regressed on the others), what do we look at to determine multicollinearity?
If R² is high
The F-statistic is large (significant)
The t-statistics are significant
Then the regressors are highly correlated with each other and could be a source of multicollinearity
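A sketch of the auxiliary-regression check on hypothetical data (NumPy only): regress one RHS variable on the others and inspect the resulting R². The `aux_r2` helper and the data here are illustrative, not from the cards; the VIF line is a common companion statistic (VIF = 1/(1 − R²)).

```python
import numpy as np

# Hypothetical data where x3 is nearly a linear function of x2,
# so the auxiliary regression of x3 on x2 should show a high R².
rng = np.random.default_rng(1)
n = 100
x2 = rng.normal(size=n)
x3 = 0.5 + 2.0 * x2 + rng.normal(scale=0.05, size=n)  # almost collinear with x2

def aux_r2(target, others):
    """R² from regressing one RHS variable on the remaining RHS variables."""
    X = np.column_stack([np.ones(len(target))] + others)
    b, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ b
    ss_res = resid @ resid
    ss_tot = ((target - target.mean()) ** 2).sum()
    return 1.0 - ss_res / ss_tot

r2 = aux_r2(x3, [x2])
vif = 1.0 / (1.0 - r2)  # variance inflation factor
print(round(r2, 3), round(vif, 1))  # R² near 1 flags multicollinearity
```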
Possible signs of multicollinearity (not confirmed until an auxiliary regression is run)
High R²
Some individual variables are statistically significant while others are insignificant
Benefit of acquiring additional data and/or a new sample to address MC
Getting more data can reduce MC by increasing variation among the explanatory variables, which makes the estimates more precise
Cost of acquiring additional data and/or new sample to address MC
Data can be costly, and it may not fix MC if patterns don’t change
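The benefit card above can be illustrated with a small simulation (hypothetical data, NumPy only): with correlated regressors, a larger sample shrinks the standard errors of the slope estimates, using the usual OLS covariance Var(b) = s²(X'X)⁻¹.

```python
import numpy as np

rng = np.random.default_rng(2)

def ols_se(n):
    """Standard errors of OLS coefficients from a simulated sample of size n."""
    x2 = rng.normal(size=n)
    x3 = 0.8 * x2 + rng.normal(scale=0.6, size=n)   # correlated regressors
    y = 1.0 + 2.0 * x2 - 0.5 * x3 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x2, x3])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    s2 = (e @ e) / (n - X.shape[1])        # residual variance estimate
    cov = s2 * np.linalg.inv(X.T @ X)      # Var(b) = s^2 (X'X)^(-1)
    return np.sqrt(np.diag(cov))

se_small, se_large = ols_se(30), ols_se(3000)
print(se_small[1], se_large[1])  # SE of b2 drops as the sample grows
```

The caveat from the cost card still applies: if the new data show the same correlation pattern among the regressors, the collinearity itself does not go away, even though the standard errors shrink.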