What are the 5 assumptions in a Regression
True Model
Variables
Specification
Measurement
Error Term
True Model
This is an assumption that is solved by theory and focuses on including all relevant variables while excluding all irrelevant variables
How do you solve True Model
Theory
What can the dependent variable be in a regression
Interval, Ratio
What can the independent variables be in a regression
Interval, Ratio, Dummy, Dichotomous, Binary; this allows nominal and ordinal variables to be added as long as they are coded 0 and 1
What are the problems with variables in Regression
Perfect Multicollinearity and Truncation
Perfect Multicollinearity
Occurs when independent variables are perfectly correlated, meaning one variable is a linear combination of the others - fixed by theory
Truncation
When observations are dropped from your dataset because they fall above or below a certain threshold, meaning that portion of the population is entirely invisible to your analysis - fixed by theory
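Perfect multicollinearity can be seen numerically: a design matrix with one column that is an exact linear combination of another is rank-deficient, so OLS cannot separate the two effects. A minimal sketch with hypothetical data, assuming Python with NumPy (the deck names no software):

```python
import numpy as np

# Hypothetical predictor and an exact linear combination of it.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = 2 * x1 + 3  # perfectly collinear with x1 (and the constant)

# Design matrix: constant, x1, x2 -- three columns but only two
# independent directions, so the matrix is rank-deficient.
X = np.column_stack([np.ones(5), x1, x2])
rank = np.linalg.matrix_rank(X)  # 2, not 3: one column is redundant
```

The fix is theoretical, not mechanical: drop whichever redundant variable your theory says does not belong.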
Specification
The relationship between X and Y in a regression is assumed to be linear. However, a line is not always the best representation of a relationship, so we need to adjust the regression to produce a non-straight line. Omitting relevant variables changes the beta and t-value, which changes the line, while including irrelevant variables creates noise and error. You fix this by squaring a term to fit the curve, and you know to do this from your theory
How do you fix Specification
Square a term
How do you know you have Specification
Theory
Measurement
Make sure that the way variables are being interpreted and understood is representative of the theory
How do you know your measurement is correct
Theory
What violates the Expected Value of Error Term
When relevant independent variables are omitted
How does a violation of the expected value of error term impact the regression
It biases the intercept, which leads to a wrong interpretation of the hypothesis. If the expected value is greater than zero, the intercept has a positive bias; if it is less than zero, the intercept has a negative bias
What is the solution to the violation of the expected value of error term
Including all relevant independent variables, which relies on theory
What violates the Covariance Between Independent Variables and the Error Term
This violation occurs when independent variables are correlated with the error term
How does a violation of the Covariance between Independent Variables and the Error terms impact the regression
The consequences are biased coefficients and unreliable hypothesis tests; omitting a correlated variable affects the beta, while omitting a slightly correlated variable affects both the alpha and the beta
How do you fix the covariance between the independent variables and the error term
You must rely on theory to include all relevant variables
What violates the Variance of the Error Term (Homoscedasticity)
when the variance of the error term is not constant across observations
What impact does a violation of the Variance of the Error Term (Homoscedasticity) assumption have on the regression
Residuals are not randomly scattered, which can bias the estimation of the standard error
How can you fix a violation of Variance of the Error Term (Homoscedasticity)
You first run a het test to see if heteroscedasticity is present; if it is, you then use robust standard errors to fix the distribution of the standard errors
When does a violation in Covariance of Error Terms (Autocorrelation) happen
Occurs when error terms are correlated with their own past values. This specifically happens in time series models.
How does covariance among error terms impact a regression model
Impacts the hypothesis test, t-statistics, and standard error, making it unreliable
How do you test for Covariance of error terms
Use the Durbin-Watson Test, which ranges from 0 to 4. A value near 2 indicates no autocorrelation, values near 0 indicate positive autocorrelation, and values near 4 indicate negative autocorrelation.
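The Durbin-Watson ranges can be checked on simulated residuals. A minimal sketch assuming Python's statsmodels (an illustrative choice):

```python
import numpy as np
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(2)

# White-noise residuals: Durbin-Watson should land near 2.
resid = rng.normal(0, 1, 500)
dw = durbin_watson(resid)

# Positively autocorrelated residuals: Durbin-Watson drops toward 0.
ar_resid = np.zeros(500)
for t in range(1, 500):
    ar_resid[t] = 0.9 * ar_resid[t - 1] + rng.normal()
dw_ar = durbin_watson(ar_resid)
```

Negative autocorrelation (alternating residual signs) would push the statistic toward 4 instead.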
How do you fix a Covariance of Error Terms
Include a lagged dependent variable to adjust the model so that the error terms become uncorrelated
Multiple Regression
How the dependent variable changes based on multiple independent variables along with a random error term.
How do you reduce error in a multiple regression
Reduce the error term by using theory to ensure that all relevant variables are included and irrelevant ones are excluded
What Variables for Chi-Square
Nominal, Ordinal
What do you report as significant in the Chi-Square
The Chi-Square value
How do you interpret a relationship in Chi-Square
Significant- Dependent
Not Significant - Independent
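The significant-means-dependent reading can be sketched on a hypothetical crosstab of two nominal variables, assuming Python's SciPy (an illustrative choice):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 crosstab: rows and columns are nominal categories.
table = np.array([[30, 10],
                  [10, 30]])

chi2, p, dof, expected = chi2_contingency(table)

# p < 0.05 -> significant -> reject independence: the variables
# are dependent. A large p would mean they are independent.
```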
What Variables are used in ANOVA
Dependent: Interval, Ratio
Independent: Nominal, Ordinal
What do you report as significant in ANOVA
F value
What do you report in an ANOVA (before interpreting)
If the variables are different from each other
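An ANOVA on an interval-ratio outcome across nominal groups can be sketched with hypothetical data, assuming Python's SciPy:

```python
from scipy.stats import f_oneway

# Hypothetical interval-ratio outcome measured in three nominal groups.
group_a = [2, 3, 4, 3, 2]
group_b = [6, 7, 8, 7, 6]
group_c = [11, 12, 13, 12, 11]

# Report the F value; a significant F means at least one group
# mean is different from the others.
f_value, p_value = f_oneway(group_a, group_b, group_c)
```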
What variables can be used in a T-Test
Dependent: Interval Ratio
Independent: Nominal Ordinal
What do you report before analyzing the t-value
You choose which test you will be analyzing: less than, different, or more than
What are the two main aspects of a t-test interpretation
The t-value and the difference
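Choosing the test direction before looking at the t-value maps directly onto the `alternative` argument of SciPy's independent-samples t-test (an illustrative choice; `alternative` requires SciPy 1.6+). Hypothetical data:

```python
from scipy.stats import ttest_ind

# Hypothetical interval-ratio outcome in two nominal groups.
control = [5.1, 4.9, 5.0, 5.2, 4.8, 5.0]
treated = [6.0, 6.2, 5.9, 6.1, 6.3, 6.0]

# "Different": two-sided test (the default).
t_two, p_two = ttest_ind(control, treated)

# "Less than": one-sided test that the control mean is lower.
t_less, p_less = ttest_ind(control, treated, alternative="less")
```

The t-value and its significance are then read against the direction you chose in advance.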
What variables can be used in a regression
Dependent: Interval Ratio
Independent: Interval, Ratio, Dummy, Dichotomous, Binary
What do you have to do before analyzing an F value in a regression
Perform a het test to check for heteroscedasticity, and if present, use robust standard errors to correct the distribution of the standard errors
What are the three main parts of a regression interpretation
R2 or Adjusted R2, F value, constant comparison