Flashcards created for key concepts from the lecture on Multiple Regression Analysis: Estimation.
Ordinary Least Squares (OLS)
A method for estimating the parameters in a linear regression model by minimizing the sum of squared residuals.
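A minimal numeric sketch of OLS (hypothetical data, not from the lecture): build a design matrix with an intercept column and solve for the coefficients that minimize the sum of squared residuals.

```python
import numpy as np

# Hypothetical data: fit y = b0 + b1*x by least squares.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

X = np.column_stack([np.ones_like(x), x])    # design matrix with intercept
beta = np.linalg.lstsq(X, y, rcond=None)[0]  # minimizes sum of squared residuals
residuals = y - X @ beta                     # residuals at the OLS solution
# beta holds [intercept, slope]; the slope here comes out near 2.
```

The same solution can be written in closed form as (X'X)⁻¹X'y, but `lstsq` is the numerically stable way to compute it.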
Homoscedasticity
The assumption that the variance of the error terms is constant across all levels of the independent variables.
Unbiased Estimator
An estimator that, on average, returns the true value of the parameter being estimated.
Multiple Regression Model
A statistical technique that models the relationship between a dependent variable and multiple independent variables.
Gauss-Markov Theorem
A theorem stating that, under the Gauss-Markov assumptions, the OLS estimator is the best linear unbiased estimator (BLUE): it has the smallest variance among all linear unbiased estimators.
Collinearity
A situation in which two or more independent variables in a regression model are highly correlated, making it difficult to estimate their individual effects.
Log-Transformation
A technique used to transform a variable into its logarithmic form, useful for modeling relationships that exhibit exponential growth.
Omitted Variable Bias
The bias that occurs in regression analysis when a relevant variable is left out of the model.
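A simulated illustration of omitted variable bias (all numbers are made up for the sketch): when the true model contains x2, and x2 is correlated with x1, dropping x2 makes the coefficient on x1 absorb part of x2's effect.

```python
import numpy as np

# True model: y = 1 + 2*x1 + 3*x2 + u, with x2 correlated with x1.
rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)   # positively correlated with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# Full model: coefficient on x1 estimates its true value, 2.
X_full = np.column_stack([np.ones(n), x1, x2])
b_full = np.linalg.lstsq(X_full, y, rcond=None)[0]

# Short model omitting x2: x1's coefficient is biased upward,
# toward 2 + 3*0.8 = 4.4 (true effect plus the omitted effect it picks up).
X_short = np.column_stack([np.ones(n), x1])
b_short = np.linalg.lstsq(X_short, y, rcond=None)[0]
```

The direction of the bias follows the usual rule: positive coefficient on the omitted variable times positive correlation with the included one gives upward bias.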
Semi-Elasticity
A measure of the percentage change in the dependent variable resulting from a one-unit change in the independent variable; in a log-level model, the slope coefficient (multiplied by 100) is a semi-elasticity.
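A quick numeric sketch of a semi-elasticity, using hypothetical wage/education numbers: if wages grow by a constant percentage per year of education, regressing log(wage) on education recovers that percentage as the slope.

```python
import numpy as np

# Hypothetical: wage = 5 * exp(0.08 * educ), i.e. ~8% higher wage per year of education.
educ = np.array([8.0, 10.0, 12.0, 14.0, 16.0])
wage = 5.0 * np.exp(0.08 * educ)

X = np.column_stack([np.ones_like(educ), educ])
beta = np.linalg.lstsq(X, np.log(wage), rcond=None)[0]
# beta[1] recovers 0.08: one more year of education -> about an 8% higher wage.
```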
R-squared (R²)
A statistical measure that represents the proportion of the variance for a dependent variable that's explained by independent variables in a regression model.
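R² can be computed directly from its definition, 1 minus the ratio of the residual sum of squares to the total sum of squares (hypothetical fitted values below):

```python
import numpy as np

# Hypothetical observed values and regression fitted values.
y = np.array([3.0, 5.0, 7.0, 9.0])
y_hat = np.array([3.2, 4.8, 7.1, 8.9])

ss_res = np.sum((y - y_hat) ** 2)       # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)    # total sum of squares
r_squared = 1 - ss_res / ss_tot         # share of variance explained
```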