MR Lecture Notes: Multiple Regression II (Partitioning variance, assumptions, and more)

Description and Tags

Vocabulary flashcards covering key concepts from the MR lecture notes: partitioning variance, assumptions, diagnostics, and how to interpret MR output.


28 Terms

1

Multiple Regression (MR)

A statistical method for predicting a dependent variable from two or more independent variables; extends simple regression and enables partitioning explained variance into unique and shared components, with interpretation via coefficients and diagnostics.

2

Partitioning variance

In MR, the total variance explained (R^2) is decomposed into unique variance (explained by each predictor after removing overlap with others) and shared variance (explained jointly by two or more predictors).

3

Unique variance

The portion of DV variance uniquely attributed to a given predictor, after accounting for the influence of all other predictors.

4

Shared variance

The portion of DV variance jointly explained by two or more predictors, not separable into individual unique components.

5

Zero-order correlation (rYk)

The Pearson correlation between predictor Xk and the dependent variable (DV) without controlling for other predictors.

6

Beta coefficient (β)

The standardized regression coefficient; the expected change in DV (in standard deviation units) per one standard deviation change in the predictor, holding other predictors constant.

7

Unstandardized regression coefficient (b)

The slope of the predictor in the regression equation; the change in DV (in its original units) per one unit change in the predictor, holding other predictors constant.
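
A quick illustration of the b/β relationship in Python (toy numbers, not from the notes): β rescales b by the ratio of the predictor's and DV's standard deviations, and with a single predictor β equals the zero-order correlation.

```python
import math

# Toy data (made up for illustration): one predictor, one DV.
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)

mx, my = sum(x) / n, sum(y) / n
cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
var_x = sum((xi - mx) ** 2 for xi in x) / n
var_y = sum((yi - my) ** 2 for yi in y) / n

b = cov / var_x                      # unstandardized slope (original units)
beta = b * math.sqrt(var_x / var_y)  # standardized slope (SD units)
r = cov / math.sqrt(var_x * var_y)   # zero-order correlation

# With a single predictor, beta equals the zero-order correlation.
```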

8

Semi-partial (part) correlation

The unique contribution of a predictor to the DV after removing that predictor’s overlap with the other predictors; the squared semi-partial correlation equals the predictor’s unique share of DV variance (the increment in R^2 when it is entered last).

9

Partial correlation

The correlation between DV and a predictor after removing the linear effects of the other predictors from both DV and the predictor.
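
The two coefficients can be computed directly from zero-order correlations in the two-predictor case; the correlation values below are made up for illustration.

```python
import math

# Made-up correlations for a two-predictor model: r_y1 = r(DV, X1),
# r_y2 = r(DV, X2), r_12 = r(X1, X2).
r_y1, r_y2, r_12 = 0.50, 0.30, 0.20

# Semi-partial (part) correlation of X1: X2 is removed from X1 only.
sr1 = (r_y1 - r_y2 * r_12) / math.sqrt(1 - r_12 ** 2)

# Partial correlation of X1: X2 is removed from both the DV and X1.
pr1 = (r_y1 - r_y2 * r_12) / math.sqrt((1 - r_y2 ** 2) * (1 - r_12 ** 2))

# |partial| >= |semi-partial|, because the partial's denominator also
# removes X2's share of the DV's variance; sr1**2 is X1's unique share.
```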

10

Orthogonal IVs

Independent predictors with zero correlation; in MR, orthogonality means predictors do not share variance, so R^2 equals the sum of squared zero-order correlations with DV.

11

Independence (assumption)

Assumes each observation is independent from the others; violation (e.g., clustering) can bias standard errors and inflate Type I error.

12

Linearity (assumption)

The DV is a linear function of the IVs; assessed via residual plots, and addressed with polynomial or other transformed terms if nonlinearity is present.

13

Homoscedasticity (assumption)

Constant variance of residuals across levels of the IVs; violation (heteroscedasticity) can affect standard errors and inference.

14

Normality of residuals

Residuals are approximately normally distributed around zero; checked with histograms or Q-Q plots; affects SEs and inference in some contexts.

15

Residuals

Prediction errors: the differences between observed DV values and those predicted by the regression model.
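
A minimal sketch of residuals in Python (toy data, one predictor): fit the least-squares line, subtract predictions from observations, and note that residuals sum to zero whenever the model includes an intercept.

```python
# Toy data (made up): fit a one-predictor regression and inspect residuals.
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)

mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
a = my - b * mx                        # intercept

y_hat = [a + b * xi for xi in x]       # model predictions
residuals = [yi - yhi for yi, yhi in zip(y, y_hat)]

# With an intercept in the model, least-squares residuals sum to zero.
```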

16

Loess line

Locally estimated scatterplot smoothing line used in residual plots to assess linearity; a flat horizontal line suggests linearity, a visible trend suggests nonlinearity.

17

Confidence interval (CI)

A range around a coefficient estimate expressing uncertainty (most commonly 95%); if zero falls outside the interval, the effect is statistically significant at p < .05.

18

95% CI interpretation

If the study were repeated many times, 95% of the CIs would contain the true population parameter; interval estimates help gauge precision and stability.
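
This repeated-sampling interpretation can be checked by simulation; the sketch below (population mean, SD, and sample size are all made up) builds a normal-approximation 95% CI for a mean many times and counts how often it covers the true value.

```python
import random
import statistics

# Simulation sketch (assumed setup): draw many samples from a normal
# population with a known mean, build a 95% CI for the mean each time,
# and count how often the interval covers the true mean.
random.seed(1)
TRUE_MEAN, SD, N, REPS = 100.0, 15.0, 30, 2000

covered = 0
for _ in range(REPS):
    sample = [random.gauss(TRUE_MEAN, SD) for _ in range(N)]
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / N ** 0.5
    lo, hi = m - 1.96 * se, m + 1.96 * se
    if lo <= TRUE_MEAN <= hi:
        covered += 1

coverage = covered / REPS  # should land close to 0.95
```

(Using 1.96 rather than the t critical value makes coverage run slightly below 95% at this sample size, which is itself instructive.)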

19

ANOVA in regression

Analysis of Variance for regression; partitions total variability into Regression (explained) and Residual (unexplained); tests whether the model explains a significant portion of variance.

20

F-statistic

Ratio used in ANOVA to test the overall significance of the regression model (mean square regression divided by mean square error).
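
The same F can be written purely in terms of R^2: F = (R^2 / k) / ((1 − R^2) / (n − k − 1)) for k predictors and n cases. A sketch with made-up values:

```python
# Overall F from R^2: F = (R^2 / k) / ((1 - R^2) / (n - k - 1)).
# R^2, n, and k below are made-up illustration values.
r2, n, k = 0.50, 23, 2

ms_reg_share = r2 / k                    # explained variance per model df
ms_res_share = (1 - r2) / (n - k - 1)    # unexplained variance per error df
F = ms_reg_share / ms_res_share
# degrees of freedom: (k, n - k - 1) = (2, 20) for this example
```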

21

R-squared (R^2)

Proportion of variance in the DV explained by the regression model (0 to 1).

22

Adjusted R-squared

R^2 adjusted for the number of predictors; penalizes adding predictors that do not improve model fit beyond what would be expected by chance.
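
The standard adjustment formula, with made-up numbers, shows how the penalty works:

```python
# Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1); values made up.
r2, n, k = 0.50, 21, 4

adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
# The penalty grows with k, so adding weak predictors can lower adjusted
# R^2 even though plain R^2 never decreases.
```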

23

Model Summary

Table summarizing R, R^2, adjusted R^2, and the standard error of the estimate; provides an overall view of model fit.

24

Standard error (SE) of a coefficient

The standard error of a regression coefficient; used to compute t-statistics and confidence intervals for that coefficient.

25

Intercept

The constant term in the regression equation; predicted DV when all IVs are zero.

26

Orthogonality in MR example

When predictors are uncorrelated, MR R^2 equals the sum of squared zero-order correlations with the DV (e.g., R^2 = rY1^2 + rY2^2 for two orthogonal predictors).
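
A small numerical check (all values made up): with two orthogonal, mean-zero predictors, the squared zero-order correlations add up to the model R^2.

```python
import math

# Two orthogonal (uncorrelated, mean-zero) predictors and a DV built from
# them plus noise that is orthogonal to both; all values are made up.
x1 = [-1, -1, 1, 1]
x2 = [-1, 1, -1, 1]
y = [-2, -2, 0, 4]           # 2*x1 + 1*x2 + noise [1, -1, -1, 1]
n = len(y)

def corr(u, v):
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    ss_u = sum((a - mu) ** 2 for a in u)
    ss_v = sum((b - mv) ** 2 for b in v)
    return cov / math.sqrt(ss_u * ss_v)

r_y1, r_y2 = corr(y, x1), corr(y, x2)

# Because x1 and x2 are orthogonal, each slope equals its simple-regression
# slope, and R^2 is just the sum of the squared zero-order correlations.
r2 = r_y1 ** 2 + r_y2 ** 2   # equals the two-predictor model's R^2 here
```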

27

Collinearity

High correlation among IVs; can inflate standard errors and complicate interpretation; assessed with diagnostics like VIF.
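
The VIF formula is VIF_j = 1 / (1 − R_j^2), where R_j^2 comes from regressing predictor j on the other predictors. A sketch for the two-predictor case (where R_1^2 is just r_12^2), with a made-up correlation:

```python
# VIF_j = 1 / (1 - R_j^2); with exactly two predictors, R_1^2 = r_12^2.
r_12 = 0.90                      # made-up correlation between X1 and X2

vif = 1 / (1 - r_12 ** 2)        # the SE of b1 is inflated by sqrt(VIF)
# A common rule of thumb flags VIF values in the 5-10 range as worrying.
```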

28

Outliers

Observations with unusually large residuals that can disproportionately affect regression estimates; should be diagnosed and handled appropriately.