Multiple Linear Regression Concepts and Methods

These flashcards cover key concepts and methodologies related to multiple linear regression, providing a review of the main topics expected in your exam.

19 Terms

1. How does regression analysis handle differences in means between groups?

Mean differences between groups can be represented with dichotomous predictor variables (dummy variables; coded 0/1).

2. What is a treatment contrast in regression analysis?

A treatment contrast compares 2 or more groups, with one serving as a reference group.

3. In a simple contrast (ANOVA with two groups), how is the dummy variable defined?

One dummy variable is used, where 0 represents the reference group and 1 represents the other group.

4. What does the intercept in a regression model represent?

Under treatment (dummy) coding, the intercept represents the mean of the reference group.

5. How are mean differences between the reference group and another group represented in a dummy-variable regression model?

The mean difference is represented by the slope of the dummy variable.
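
Below is a minimal Python sketch of cards 1 and 3-5 (statsmodels with simulated data; the group means and sample sizes are invented for illustration): with a single 0/1 dummy, the fitted intercept reproduces the reference-group mean and the slope reproduces the mean difference.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
y_ref = rng.normal(10, 2, size=50)     # reference group (dummy = 0)
y_other = rng.normal(13, 2, size=50)   # comparison group (dummy = 1)

y = np.concatenate([y_ref, y_other])
d = np.concatenate([np.zeros(50), np.ones(50)])   # dummy variable

fit = sm.OLS(y, sm.add_constant(d)).fit()

print(fit.params[0], y_ref.mean())                    # intercept ≈ reference-group mean
print(fit.params[1], y_other.mean() - y_ref.mean())   # slope ≈ mean difference
```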

6. When there are m > 2 groups, how are dummy variables used in regression analysis?

m - 1 dummy variables are created; the reference group is coded 0 on all of them.

7. What do slope parameters represent when using dummy variables for groups?

Slope parameters represent the mean differences of the remaining groups compared to the reference group.
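
With m = 3 groups the same idea needs m - 1 = 2 dummies. A sketch of cards 6-7 (group labels "A", "B", "C" and their means are invented; patsy's Treatment coding via the statsmodels formula API does the dummy coding):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "group": np.repeat(["A", "B", "C"], 40),   # "A" will be the reference group
    "y": np.concatenate([rng.normal(m, 1, 40) for m in (5.0, 6.5, 4.0)]),
})

# Treatment contrasts: m - 1 = 2 dummies; "A" is coded 0 on both.
fit = smf.ols("y ~ C(group, Treatment(reference='A'))", data=df).fit()
print(fit.params)
# Intercept ≈ mean(A); the two slopes ≈ mean(B) - mean(A) and mean(C) - mean(A).
```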

8. What is the role of the 'Sum contrasts' method in regression?

With sum (deviation) contrasts, the intercept represents the unweighted grand mean of the group means; one group is coded -1 on all contrast variables, and each slope gives a group's deviation from the grand mean.
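
A sketch of sum contrasts using patsy's Sum coding (same invented three-group data as above; which level receives the -1 codes is patsy's convention, here the last level "C"):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "group": np.repeat(["A", "B", "C"], 40),
    "y": np.concatenate([rng.normal(m, 1, 40) for m in (5.0, 6.5, 4.0)]),
})

# Sum (deviation) contrasts: the last level ("C") is coded -1 on all columns.
fit = smf.ols("y ~ C(group, Sum)", data=df).fit()
print(fit.params)
# Intercept ≈ unweighted grand mean of the three group means; the slopes are
# the deviations of A and B from it (C's deviation is minus their sum).
```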

9. How can regression analysis model any mean comparison?

It can integrate both dichotomous and polytomous predictors and test any contrast by choosing appropriate contrast weights.
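
For instance, a comparison not directly encoded in the model, mean(B) - mean(C), can be tested from a treatment-coded fit with a weight vector (sketch with the same invented data; t_test is the statsmodels method for linear hypotheses on the coefficients):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
df = pd.DataFrame({
    "group": np.repeat(["A", "B", "C"], 40),
    "y": np.concatenate([rng.normal(m, 1, 40) for m in (5.0, 6.5, 4.0)]),
})
fit = smf.ols("y ~ C(group)", data=df).fit()   # default treatment coding, "A" is the reference

# Parameters are [intercept, B - A, C - A], so the weights [0, 1, -1]
# test the contrast mean(B) - mean(C) = 0.
print(fit.t_test([0, 1, -1]))
```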

10. What are the key assumptions of multiple linear regression?

  1. Metric or dichotomous independent variables and a metric dependent variable
  2. Linearity
  3. Homoskedasticity
  4. Normality of residuals
  5. Independence of observations
  6. No high correlation among predictors
  7. Additivity
  8. Correct model specification

11. What is the importance of residual analysis in model diagnostics?

Residual analysis checks the distributional and independence assumptions of the regression and reveals violations such as nonlinearity and heteroskedasticity.
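
A small diagnostics sketch (simulated data; matplotlib and statsmodels assumed available): residuals vs. fitted values for linearity and homoskedasticity, a QQ plot for normality, and a Breusch-Pagan test:

```python
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(5)
x = rng.normal(size=200)
y = 1.0 + 0.5 * x + rng.normal(size=200)

X = sm.add_constant(x)
fit = sm.OLS(y, X).fit()

# A pattern here suggests nonlinearity; a funnel shape suggests heteroskedasticity.
plt.scatter(fit.fittedvalues, fit.resid, s=10)
plt.axhline(0, color="grey")
plt.xlabel("Fitted values")
plt.ylabel("Residuals")

sm.qqplot(fit.resid, line="45", fit=True)   # checks normality of residuals
plt.show()

# Breusch-Pagan: a small p-value indicates heteroskedasticity.
lm_stat, lm_pval, f_stat, f_pval = het_breuschpagan(fit.resid, X)
print(lm_pval)
```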

12. What method can be used to handle violations of homoskedasticity?

Use weighted least squares or bootstrap methods for parameter estimation.
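
A weighted-least-squares sketch (the heteroskedastic variance structure is simulated and treated as known purely for illustration; in practice the weights must be estimated; the bootstrap alternative is sketched under card 18):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
x = rng.uniform(0, 10, 300)
y = 2 + 0.7 * x + rng.normal(scale=0.5 + 0.3 * x)   # error SD grows with x

X = sm.add_constant(x)

# WLS with weights proportional to 1 / variance of the errors.
w = 1.0 / (0.5 + 0.3 * x) ** 2
wls_fit = sm.WLS(y, X, weights=w).fit()
print(wls_fit.params)
```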

13. What are forward selection and backward elimination methods?

Forward selection adds predictors one at a time based on their significance; backward elimination starts with all predictors and removes the least significant ones one at a time.
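
A sketch of forward selection by p-value (forward_select is a hypothetical helper written for this card, not a library function; see card 14 for why such procedures are discouraged in confirmatory work):

```python
import pandas as pd
import statsmodels.api as sm

def forward_select(df, response, alpha=0.05):
    """Greedily add the predictor with the smallest p-value until none is significant."""
    remaining = [c for c in df.columns if c != response]
    selected = []
    while remaining:
        pvals = {}
        for cand in remaining:
            X = sm.add_constant(df[selected + [cand]])
            pvals[cand] = sm.OLS(df[response], X).fit().pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:      # no remaining predictor reaches significance
            break
        selected.append(best)
        remaining.remove(best)
    return selected
```

Backward elimination runs the same loop in reverse: start with all predictors and repeatedly drop the one with the largest non-significant p-value.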

14. What can result from using stepwise regression?

It may lead to suboptimal predictor selection and is generally not recommended for confirmatory analysis.

15. What does the term 'overfitting' refer to in regression analysis?

Overfitting occurs when the model explains variance arising from chance, sample-specific covariation between predictors and the outcome rather than from true relationships, so it generalizes poorly to new data.
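
The effect is easy to demonstrate: regressing pure noise on many unrelated predictors still "explains" substantial variance in the sample (a sketch with simulated data; adjusted R² corrects for the number of predictors):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(15)
n, k = 60, 30                          # few cases, many pure-noise predictors
X = sm.add_constant(rng.normal(size=(n, k)))
y = rng.normal(size=n)                 # outcome unrelated to every predictor

fit = sm.OLS(y, X).fit()
print(fit.rsquared)       # around 0.5 by chance alone
print(fit.rsquared_adj)   # near 0 after the correction
```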

16. What is ΔR² in regression analysis?

ΔR² is the increase in explained variance when predictors are added to the model; it can be tested for significance with an incremental F-test.
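
A hierarchical-regression sketch (simulated data; anova_lm performs the incremental F-test between the nested models):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(8)
n = 200
df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
df["y"] = 1 + 0.5 * df["x1"] + 0.3 * df["x2"] + rng.normal(size=n)

m1 = smf.ols("y ~ x1", data=df).fit()        # baseline block
m2 = smf.ols("y ~ x1 + x2", data=df).fit()   # adds the second block

print(m2.rsquared - m1.rsquared)   # ΔR²
print(anova_lm(m1, m2))            # F-test of the increment
```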

17. How can you empirically decide which predictors to include in a regression model?

By using entry strategies such as simultaneous inclusion, hierarchical (blockwise) entry, or stepwise methods.

18. What is the purpose of using bootstrap methods in regression analysis?

To generate empirical distributions of regression parameters and compute confidence intervals and significance tests.
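
A case-resampling bootstrap sketch (simulated heavy-tailed data; 2000 replications and the percentile method are illustrative choices):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 150
x = rng.normal(size=n)
y = 1 + 0.5 * x + rng.standard_t(df=3, size=n)   # heavy-tailed errors

X = sm.add_constant(x)
boot_slopes = np.empty(2000)
for b in range(2000):
    idx = rng.integers(0, n, size=n)             # resample cases with replacement
    boot_slopes[b] = sm.OLS(y[idx], X[idx]).fit().params[1]

# Percentile 95% confidence interval for the slope.
print(np.percentile(boot_slopes, [2.5, 97.5]))
```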

19. How does linearity relate to the interpretation of prediction intervals in regression?

Prediction intervals estimate the range within which future observations should fall. They help identify outliers, but they are only trustworthy if the linearity assumption (and the other model assumptions) hold.
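
A sketch of prediction intervals with statsmodels (simulated data; obs_ci_lower and obs_ci_upper are the columns statsmodels uses for observation-level intervals):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
x = rng.uniform(0, 10, 100)
y = 2 + 0.8 * x + rng.normal(size=100)

fit = sm.OLS(y, sm.add_constant(x)).fit()

x_new = sm.add_constant(np.array([2.0, 5.0, 8.0]))
pred = fit.get_prediction(x_new).summary_frame(alpha=0.05)

# About 95% of new observations should fall between obs_ci_lower and
# obs_ci_upper; points far outside these bounds are outlier candidates.
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])
```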