Omitted Variables
Leads to omitted variable bias, a form of specification bias. It can be detected when an estimated coefficient has an unexpected sign or lacks the statistical significance that theory predicts
Irrelevant Variable
Increases the variance (standard errors) of the estimated coefficients and decreases adjusted R² (the share of the variance in y that the model explains, adjusted for degrees of freedom)
| Effect on Coefficient Estimates | Omitted Variable | Irrelevant Variable |
|---|---|---|
| Bias | Yes | No |
| Variance | Decreases | Increases |
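A minimal simulation sketch of both columns of the table (all data and coefficient values are made up for illustration): omitting a regressor correlated with x1 biases the x1 estimate, while adding an irrelevant regressor x3 leaves it unbiased but inflates its standard error.

```python
# Sketch with hypothetical data: true model is y = 2*x1 + 3*x2 + e, with x1 and x2 correlated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)   # correlated with x1, so omitting it causes bias
x3 = 0.7 * x1 + rng.normal(size=n)   # irrelevant for y (true coefficient zero) but correlated with x1
y = 2 * x1 + 3 * x2 + rng.normal(size=n)

correct = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
omit_x2 = sm.OLS(y, sm.add_constant(x1)).fit()
add_x3  = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2, x3]))).fit()

print("x1 coefficient:", correct.params[1], "vs", omit_x2.params[1])  # omitted variable: biased upward
print("x1 std. error: ", correct.bse[1], "vs", add_x3.bse[1])         # irrelevant variable: larger variance
```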
1. Theory: Is the variable’s place in the equation unambiguous and theoretically sound?
2. t-Test: Is the variable’s estimated coefficient significant in the expected direction?
3. Adjusted R²: Does the overall fit of the equation (adjusted for degrees of freedom) improve when the variable is added to the equation?
4. Bias: Do other variables’ coefficients change significantly when the variable is added to the equation?
- If all four conditions hold, the variable belongs in the equation
- If none of them hold, it does not belong
- The tricky part is the intermediate cases: use sound judgment (a sketch of checking criteria 2-4 follows below)
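A minimal sketch (the arrays y, X, and candidate variable z are hypothetical names) of how criteria 2-4 can be checked by estimating the equation with and without the candidate variable; criterion 1 is a matter of theory, not computation.

```python
# Sketch: compare the equation with and without a candidate variable `z`.
import numpy as np
import statsmodels.api as sm

def compare_with_and_without(y, X, z):
    """y: dependent variable, X: 2-D array of included regressors, z: 1-D candidate variable."""
    base = sm.OLS(y, sm.add_constant(X)).fit()
    full = sm.OLS(y, sm.add_constant(np.column_stack([X, z]))).fit()

    print("t-stat on z (criterion 2):         ", full.tvalues[-1])
    print("adj. R^2 without -> with (crit. 3):", base.rsquared_adj, "->", full.rsquared_adj)
    print("other coefficients (criterion 4):  ", base.params[1:], "->", full.params[1:-1])
    return base, full
```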
Imperfect Multicollinearity
1) Variances/Standard errors of estimates will increase
2) Computed t-scores will fall
3) Estimated coefficients of the multicollinear variables become sensitive to small changes in specification or sample
Together these make it hard to estimate the individual effect of any one multicollinear variable (illustrated by the simulation sketch below)
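A small simulation sketch of these consequences (made-up data): the same true coefficients are estimated once with nearly uncorrelated regressors and once with highly collinear ones, so the jump in standard errors and the drop in t-scores is visible directly.

```python
# Sketch with hypothetical data: collinearity inflates standard errors and shrinks t-scores.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200

def fit_with_correlation(corr):
    x1 = rng.normal(size=n)
    x2 = corr * x1 + np.sqrt(1 - corr**2) * rng.normal(size=n)  # corr(x1, x2) ~ corr
    y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)                # same true coefficients each time
    return sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()

low, high = fit_with_correlation(0.1), fit_with_correlation(0.98)
print("std. errors:", low.bse[1:], "->", high.bse[1:])          # variances/standard errors increase
print("t-scores:   ", low.tvalues[1:], "->", high.tvalues[1:])  # computed t-scores fall
```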
Solutions to multicollinearity
1) Do nothing
2) Drop the redundant variable
3) Increase the sample size to reduce the variance of the estimated coefficients
4) Redefine the variable
Consequences of pure serial correlation
1) Pure serial correlation does not cause bias in the coefficient estimates
2) Serial correlation causes OLS to no longer be the minimum variance estimator
3) Serial correlation causes the OLS estimates of the standard error to be biased, leading to unreliable hypothesis testing.
2 ways to detect serial correlation
1) Observing a pattern in the residuals
2) Testing formally, e.g., with the Durbin-Watson d-test or a regression-based test on the residuals
Durbin-Watson d-test assumptions
1) The regression includes an intercept term (constant)
2) The serial correlation is first-order in nature: εₜ = ρεₜ₋₁ + uₜ, where ρ is the first-order autocorrelation coefficient and uₜ is a classical (not serially correlated) error term
3) The regression model does not include a lagged dependent variable as an independent variable
Durbin-Watson: d-statistic
Extreme positive serial correlation: d ≈ 0
Extreme negative serial correlation: d ≈ 4
No serial correlation: d ≈ 2
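A minimal sketch of computing the d-statistic from OLS residuals, using the standard Durbin-Watson formula d = Σ(eₜ − eₜ₋₁)² / Σeₜ²; the residuals are assumed to come from an already-fitted regression.

```python
# Sketch: Durbin-Watson d-statistic from a vector of OLS residuals.
import numpy as np

def durbin_watson_d(residuals):
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)  # ~0 positive, ~2 none, ~4 negative

# statsmodels ships the same statistic, e.g.:
#   from statsmodels.stats.stattools import durbin_watson
#   durbin_watson(results.resid)   # for a fitted OLS `results` object
```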
Hypothesis Testing
β̂ (estimated coefficient), β under the null hypothesis, standard error, t-statistic, t critical value, draw the rejection region, decide whether to reject or fail to reject H₀ (a worked sketch follows below)
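A worked sketch of that sequence with made-up numbers (β̂, the hypothesized β, the standard error, and the degrees of freedom are all assumptions chosen for illustration).

```python
# Sketch: the t-test sequence — beta-hat, hypothesized beta, standard error,
# t-statistic, critical value, draw the rejection region, decide.
from scipy import stats

beta_hat = 0.45    # estimated coefficient (hypothetical)
beta_h0 = 0.0      # value of beta under the null hypothesis
se = 0.18          # standard error of the estimate (hypothetical)
df = 30            # degrees of freedom (hypothetical)
alpha = 0.05       # significance level

t_stat = (beta_hat - beta_h0) / se
t_crit = stats.t.ppf(1 - alpha / 2, df)   # two-sided critical value

# decide: reject H0 if the t-statistic falls in the rejection region
decision = "reject H0" if abs(t_stat) > t_crit else "fail to reject H0"
print(t_stat, t_crit, decision)
```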