Flashcards created from lecture notes on Multiple Regression Analysis, covering the further issues discussed in econometric methods.
What is the main advantage of using logarithmic functional forms in regression analysis?
They provide a convenient percentage/elasticity interpretation.
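A sketch of the two standard interpretations, using generic variables y and x and textbook notation (not taken from the lecture itself):

```latex
% Log-log (constant elasticity): a 1% change in x is associated with roughly a beta_1 % change in y
\log(y) = \beta_0 + \beta_1 \log(x) + u, \qquad \beta_1 \approx \frac{\%\Delta y}{\%\Delta x}

% Log-level (semi-elasticity): a one-unit change in x changes y by roughly 100*beta_1 percent
\log(y) = \beta_0 + \beta_1 x + u, \qquad \%\Delta y \approx (100\,\beta_1)\,\Delta x
```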
What happens to the slope coefficients of logged variables with respect to rescalings?
They are invariant to rescalings.
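A one-line derivation of the invariance, assuming x is rescaled by an arbitrary constant c > 0:

```latex
\log(cx) = \log(c) + \log(x)
\;\Rightarrow\;
\beta_0 + \beta_1 \log(cx) = \underbrace{\big(\beta_0 + \beta_1 \log(c)\big)}_{\text{new intercept}} + \beta_1 \log(x)
```

Only the intercept absorbs the rescaling; the slope \beta_1 is unchanged.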
What problems can taking logs help to eliminate or mitigate in regression analysis?
It can help eliminate or mitigate problems with outliers and assist in securing normality and homoskedasticity.
What type of variables should not be logged?
Variables measured in units such as years or those measured in percentage points should not be logged.
What is required when constructing predictions after logging variables?
The log operation cannot simply be reversed by exponentiating the fitted value; an adjustment factor is needed to predict y rather than log(y).
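Why naive exponentiation falls short, sketched with textbook notation (u is the regression error):

```latex
y = \exp(\beta_0 + \beta_1 x_1 + \dots + \beta_k x_k)\exp(u)
\;\Rightarrow\;
E(y \mid \mathbf{x}) = \exp(\beta_0 + \beta_1 x_1 + \dots + \beta_k x_k)\, E\!\left[\exp(u) \mid \mathbf{x}\right]
```

By Jensen's inequality, E[exp(u)] is at least exp(E[u]) = 1, so exponentiating the fitted log(y) without a correction systematically underpredicts y.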
What does the marginal effect of experience refer to in regression analysis?
It is the change in the dependent variable (e.g. wage) associated with one additional year of experience, holding other factors fixed; when experience enters as a quadratic, this effect depends on the level of experience.
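A sketch of the usual quadratic specification from which this marginal effect is read off (the names wage and exper are illustrative):

```latex
wage = \beta_0 + \beta_1\, exper + \beta_2\, exper^2 + u,
\qquad
\frac{\partial\, wage}{\partial\, exper} = \beta_1 + 2\beta_2\, exper
```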
Can the return to experience become negative after a certain number of years?
Not necessarily; whether the implied negative return is relevant depends on how many observations in the sample lie beyond the estimated turnaround point.
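The turnaround point implied by the quadratic above, assuming \beta_1 > 0 and \beta_2 < 0:

```latex
exper^{*} = -\frac{\beta_1}{2\beta_2} = \frac{\beta_1}{2\,|\beta_2|}
```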
What complicates the interpretation of parameters in regression models?
Interaction effects: when explanatory variables are interacted, the partial effect of one variable depends on the value of the other, so individual coefficients cannot be interpreted in isolation.
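A sketch of a model with an interaction term, showing why a single coefficient no longer gives the full effect:

```latex
y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_3 x_1 x_2 + u,
\qquad
\frac{\partial y}{\partial x_1} = \beta_1 + \beta_3 x_2
```

Here \beta_1 by itself is the effect of x_1 only when x_2 = 0, which is often not an interesting value; centring x_2 at its mean before interacting is a common remedy.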
What do average partial effects summarize in nonlinear functional forms?
They summarize, in a single number for each explanatory variable, its partial effect on the dependent variable averaged over the sample, which is useful when the effect varies across observations.
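A sketch of how an average partial effect (APE) is computed: evaluate the partial effect at each observation, then average over the sample.

```latex
\widehat{APE}_j = \frac{1}{n}\sum_{i=1}^{n} \frac{\partial \hat{y}_i}{\partial x_{ij}},
\qquad
\text{e.g. for the interaction model above: }
\frac{1}{n}\sum_{i=1}^{n}\big(\hat\beta_1 + \hat\beta_3 x_{i2}\big) = \hat\beta_1 + \hat\beta_3 \bar{x}_2
```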
Why is a high R-squared not indicative of causality in regression analysis?
Because R-squared only measures how much of the sample variation in the dependent variable the regressors explain; a causal interpretation additionally requires that the zero conditional mean assumption holds.
What does adjusted R-squared account for when adding new regressors?
It imposes a penalty for adding new regressors.
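The formula behind the penalty, with n observations and k slope coefficients (standard textbook notation):

```latex
\bar{R}^2 = 1 - \frac{SSR/(n-k-1)}{SST/(n-1)}
```

Adding a regressor always (weakly) lowers SSR, but it also lowers n - k - 1; when a single variable is added, \bar{R}^2 rises only if its t statistic exceeds 1 in absolute value.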
Why might one variable not be included in a regression model?
Because controlling for too many factors (for example, for variables that are themselves outcomes of the variable of interest) can distort the estimated effect and lead to incorrect conclusions.
What is a potential consequence of adding regressors to reduce error variance?
It may exacerbate multicollinearity problems.
How does adding uncorrelated variables benefit regression analysis?
Adding variables that affect y but are uncorrelated with the other regressors reduces the error variance without aggravating multicollinearity, so the estimators become more precise.
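The sampling-variance formula that captures both of the previous two cards, under the standard assumptions (SST_j is the total variation in x_j and R_j^2 is the R-squared from regressing x_j on the other regressors):

```latex
Var(\hat\beta_j) = \frac{\sigma^2}{SST_j\,(1 - R_j^2)}
```

An added regressor that belongs in the model lowers the error variance \sigma^2, which helps; if it is correlated with x_j, it also raises R_j^2, which hurts. A regressor uncorrelated with the others lowers \sigma^2 while leaving R_j^2 essentially unchanged.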
What is needed to predict y when log(y) is the dependent variable?
An adjustment factor for reversing the log (an estimate of E[exp(u)]); the standard corrections rely on the assumption that the error term is independent of the regressors.
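A sketch of the standard adjustment; independence of the error from the regressors is what allows E[exp(u)] to be treated as a constant \alpha_0 and estimated from the residuals:

```latex
\hat{y} = \hat\alpha_0 \exp\!\big(\widehat{\log y}\big),
\qquad
\hat\alpha_0 = \frac{1}{n}\sum_{i=1}^{n}\exp(\hat{u}_i)
\quad\text{or}\quad
\hat\alpha_0 = \exp(\hat\sigma^2/2)\ \text{if } u \text{ is normally distributed.}
```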