These flashcards cover key concepts and methods in multiple linear regression, reviewing the main topics expected on the exam.
How does regression analysis handle differences in means between groups?
Mean differences between groups can be represented with dichotomous predictor variables (dummy variables; coded 0/1).
What is a treatment contrast in regression analysis?
A treatment contrast compares two or more groups against one of them, which serves as the reference group.
In a simple contrast for two groups (as in a two-group ANOVA), how is the dummy variable defined?
One dummy variable is used, where 0 represents the reference group and 1 represents the other group.
What does the intercept in a regression model represent?
In a dummy-coded model with treatment contrasts, the intercept represents the mean of the reference group.
How are mean differences represented between a reference group and another group in a dummy variable regression model?
The mean difference is represented by the slope of the dummy variable.
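A minimal sketch of the two-group case, assuming statsmodels and pandas are available (data and variable names here are invented for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "d": [0] * 50 + [1] * 50,  # dummy: 0 = reference group, 1 = other group
    "y": np.concatenate([rng.normal(10, 2, 50),    # reference group, mean ~10
                         rng.normal(13, 2, 50)]),  # other group, mean ~13
})

fit = smf.ols("y ~ d", data=df).fit()
# Intercept ≈ mean of the reference group:
print(fit.params["Intercept"], df.loc[df.d == 0, "y"].mean())
# Slope of the dummy ≈ mean difference between the groups:
print(fit.params["d"], df.loc[df.d == 1, "y"].mean() - df.loc[df.d == 0, "y"].mean())
```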
When there are m > 2 groups, how are dummy variables used in regression analysis?
m - 1 dummy variables are created; the reference group is coded 0 on all of them.
What do slope parameters represent when using dummy variables for groups?
Slope parameters represent the mean differences of the remaining groups compared to the reference group.
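A minimal sketch of the coding scheme for m = 3 groups, assuming pandas (the group labels are invented):

```python
import pandas as pd

groups = pd.Series(["a", "a", "b", "b", "c", "c"])
# drop_first=True yields m - 1 = 2 dummy columns; the dropped level "a" is the
# reference group, coded 0 on both dummies.
print(pd.get_dummies(groups, drop_first=True).astype(int))
#    b  c
# 0  0  0
# 1  0  0
# 2  1  0
# 3  1  0
# 4  0  1
# 5  0  1
```

In the regression, the slope of dummy b estimates mean(b) - mean(a), and the slope of dummy c estimates mean(c) - mean(a).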
What is the role of the 'Sum contrasts' method in regression?
Sum (deviation) contrasts put the unweighted mean of the group means into the intercept; each slope is one group's deviation from that grand mean, and the omitted group is coded -1 on all contrast variables.
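A minimal sketch of sum contrasts, assuming statsmodels with patsy-style formulas (data invented):

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({"group": ["a", "a", "b", "b", "c", "c"],
                   "y":     [1.0, 3.0, 4.0, 6.0, 8.0, 10.0]})
# Group means: a = 2, b = 5, c = 9; grand mean = (2 + 5 + 9) / 3 = 16/3.
fit = smf.ols("y ~ C(group, Sum)", data=df).fit()
print(fit.params)
# Intercept = 16/3; the two Sum coefficients are the deviations of "a" and "b"
# from the grand mean; "c" (coded -1 on both columns) has deviation
# -(sum of the other two).
```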
How can regression analysis model any mean comparison?
By choosing appropriate contrast coefficients, regression can combine dichotomous and polytomous predictors and test any desired mean comparison.
What are the key assumptions of multiple linear regression?
Linearity of the predictor-outcome relationship, independence of the errors, homoskedasticity (constant error variance), normally distributed errors, and no perfect multicollinearity among the predictors.
What is the importance of residual analysis in model diagnostics?
Residual analysis checks the distributional and independence assumptions of the regression, revealing violations such as non-linearity and heteroskedasticity.
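A minimal diagnostic sketch, assuming statsmodels and matplotlib (simulated data):

```python
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = 2 + 0.5 * x + rng.normal(0, 1, 200)  # simulated, assumptions satisfied
fit = sm.OLS(y, sm.add_constant(x)).fit()

# Residuals vs. fitted values: curvature suggests non-linearity,
# a funnel shape suggests heteroskedasticity.
plt.scatter(fit.fittedvalues, fit.resid)
plt.axhline(0, linestyle="--")
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.show()
```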
What method can be used to handle violations of homoskedasticity?
Use weighted least squares or bootstrap methods for parameter estimation.
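A minimal sketch of weighted least squares, assuming statsmodels (here the error variance is simulated as known; in practice the weights must be estimated):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.uniform(1, 10, 200)
y = 1 + 2 * x + rng.normal(0, x)  # error SD grows with x: heteroskedastic
# WLS with weights proportional to 1 / error variance:
fit = sm.WLS(y, sm.add_constant(x), weights=1 / x**2).fit()
print(fit.params)
```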
What are forward selection and backward elimination methods?
Forward selection starts with no predictors and adds the most significant candidate at each step; backward elimination starts with all predictors and removes the least significant one at each step.
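A minimal sketch of forward selection, assuming statsmodels and a pandas DataFrame df (the function and column names are illustrative, not a standard API):

```python
import statsmodels.formula.api as smf

def forward_select(df, response, candidates, alpha=0.05):
    """Greedily add the candidate with the smallest p-value until none is significant."""
    candidates = list(candidates)
    selected = []
    while candidates:
        pvals = {}
        for c in candidates:
            formula = f"{response} ~ " + " + ".join(selected + [c])
            pvals[c] = smf.ols(formula, data=df).fit().pvalues[c]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:  # stop: no remaining candidate is significant
            break
        selected.append(best)
        candidates.remove(best)
    return selected
```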
What can result from using stepwise regression?
It may lead to suboptimal predictor selection and is generally not recommended for confirmatory analysis.
What does the term 'overfitting' refer to in regression analysis?
Overfitting occurs when the model fits sample-specific, chance covariation between predictors and the outcome rather than relationships that generalize to new data.
What is ΔR2 in regression analysis?
ΔR2 is the increase in explained variance when predictors are added to the model; its significance is tested with an incremental F test, sketched below.
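The standard incremental F test, with n observations, k predictors in the full model, and q of them newly added:

```latex
F = \frac{\Delta R^2 / q}{\left(1 - R^2_{\mathrm{full}}\right) / (n - k - 1)},
\qquad df = \left(q,\; n - k - 1\right)
```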
How can you empirically decide which predictors to include in a regression model?
By using entry strategies such as simultaneous inclusion of all predictors, hierarchical (blockwise) entry, or stepwise methods.
What is the purpose of using bootstrap methods in regression analysis?
To generate empirical distributions of regression parameters and compute confidence intervals and significance tests.
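A minimal sketch of a nonparametric (case-resampling) bootstrap for a slope, assuming numpy and statsmodels (simulated data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 100)
y = 1 + 0.8 * x + rng.normal(0, 2, 100)
X = sm.add_constant(x)

slopes = []
for _ in range(2000):
    idx = rng.integers(0, len(y), len(y))  # resample cases with replacement
    slopes.append(sm.OLS(y[idx], X[idx]).fit().params[1])

# 95% percentile confidence interval for the slope:
print(np.percentile(slopes, [2.5, 97.5]))
```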
How does linearity relate to the interpretation of prediction intervals in regression?
Prediction intervals estimate the range within which a future observation should fall; their coverage is only trustworthy if the linearity (and other model) assumptions hold, and they help identify outliers and check model validity.
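A minimal sketch, assuming statsmodels (simulated data; obs_ci_lower / obs_ci_upper are the prediction-interval bounds):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 100)
y = 1 + 0.8 * x + rng.normal(0, 2, 100)
fit = sm.OLS(y, sm.add_constant(x)).fit()

# 95% prediction intervals for new x-values:
new_X = sm.add_constant(np.array([2.0, 5.0, 8.0]))
frame = fit.get_prediction(new_X).summary_frame(alpha=0.05)
print(frame[["mean", "obs_ci_lower", "obs_ci_upper"]])
```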