Simple linear regression involves one independent variable predicting a dependent variable.
The key components include:
Constant (A): Represents the predicted value of the dependent variable (Y) when the independent variable (X) is 0.
Beta (β): The slope of the regression line; indicates the change in Y for a one-unit increase in X.
In the TV-exposure example (predicting attention scores from hours of TV watched):
A = 52.15 means that with 0 hours of TV, the attention score is predicted to be 52.15.
β = -3.76 suggests for each additional hour of TV, the attention score decreases by 3.76 points.
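A minimal sketch of how these numbers translate into predictions is below; it hard-codes the intercept and slope reported in the TV example, and the variable and function names (tv_hours, predict_attention) are illustrative, not from the original study.

```python
# Minimal sketch: predicted attention score from hours of TV,
# using the reported intercept (A = 52.15) and slope (beta = -3.76).
A = 52.15       # predicted attention score at 0 hours of TV
beta = -3.76    # change in attention score per additional hour of TV

def predict_attention(tv_hours: float) -> float:
    """Return the predicted attention score for a given number of TV hours."""
    return A + beta * tv_hours

print(predict_attention(0))   # 52.15
print(predict_attention(3))   # 52.15 - 3.76 * 3 = 40.87
```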
Multiple regression extends simple linear regression by including multiple independent variables.
It helps in understanding both each independent variable's unique contribution and the predictors' collective ability to predict the dependent variable.
Statistical Control: The regression model's ability to isolate the effect of each independent variable while holding the other, possibly correlated, independent variables constant.
This identifies each independent variable's unique contribution to the dependent variable despite potentially overlapping effects.
Predicting final grades using:
Independent Variable 1: Number of quizzes completed.
Independent Variable 2: Hours spent studying.
The model accounts for the correlation between the two predictors (quizzes completed and study hours); a minimal fitting sketch follows below.
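The sketch below fits this two-predictor model with statsmodels. The small dataset is made up purely for illustration; only the structure (two predictors, one outcome, an intercept) reflects the example above.

```python
# Sketch: multiple regression predicting final grade from quizzes completed
# and hours studied. The data below are invented for illustration only.
import numpy as np
import statsmodels.api as sm

quizzes = np.array([2, 4, 5, 7, 8, 10, 3, 6])
study_hours = np.array([1.0, 2.5, 3.0, 4.0, 5.5, 6.0, 2.0, 3.5])
final_grade = np.array([55, 62, 68, 74, 80, 88, 58, 70])

X = sm.add_constant(np.column_stack([quizzes, study_hours]))  # adds the intercept column
model = sm.OLS(final_grade, X).fit()

print(model.params)    # intercept (A), beta for quizzes, beta for study hours
print(model.rsquared)  # proportion of grade variance the predictors explain jointly
```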
Venn diagrams illustrate unique and shared effects:
Unique Effect: How much of the variability in the dependent variable can be attributed to one independent variable after controlling for others.
Shared Effect: Represents overlapping contributions from multiple independent variables to predict the dependent variable.
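One common way to quantify a predictor's unique effect is to compare R² with and without that predictor (a hierarchical, change-in-R² approach). The sketch below assumes the invented quizzes/study-hours data from the previous example; it is one illustrative method, not the only way to partition shared and unique variance.

```python
# Sketch: unique contribution of study_hours = R^2(full model) - R^2(model without it).
# Assumes the quizzes, study_hours, and final_grade arrays defined above.
import numpy as np
import statsmodels.api as sm

X_full = sm.add_constant(np.column_stack([quizzes, study_hours]))
X_reduced = sm.add_constant(quizzes)  # same model with study_hours dropped

r2_full = sm.OLS(final_grade, X_full).fit().rsquared
r2_reduced = sm.OLS(final_grade, X_reduced).fit().rsquared

unique_study_hours = r2_full - r2_reduced  # variance only study_hours explains
print(f"Unique R^2 for study hours: {unique_study_hours:.3f}")
```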
Typical research questions multiple regression can answer:
Do all independent variables collectively predict the dependent variable?
Does IV1 predict DV after controlling for IV2?
Which predictor has the strongest effect?
Example studies using multiple regression:
Examining how emotional intelligence and leadership skills predict job performance.
Analyzing the impact of religious fundamentalism on prejudice.
Assumptions include independence of observations, normality of residuals, homoscedasticity, linearity, and the absence of severe collinearity among independent variables.
Collinearity: High correlation between independent variables, which complicates interpreting each variable's individual effect.
Use statistical tests and visualizations (like residual plots) to confirm assumptions are satisfied before interpreting regression outcomes.
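The sketch below shows two of these checks, assuming the illustrative grades model from earlier: a residuals-versus-fitted plot for homoscedasticity and linearity, and variance inflation factors (VIF) for collinearity.

```python
# Sketch: two common assumption checks, reusing final_grade and X_full from above.
import matplotlib.pyplot as plt
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

fitted = sm.OLS(final_grade, X_full).fit()

# Homoscedasticity / linearity: residuals vs. fitted values should show no pattern.
plt.scatter(fitted.fittedvalues, fitted.resid)
plt.axhline(0, linestyle="--")
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.show()

# Collinearity: VIF values well above roughly 5-10 signal problematic overlap.
for i, name in enumerate(["const", "quizzes", "study_hours"]):
    print(name, variance_inflation_factor(X_full, i))
```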
Multiple regression outputs include:
Model as a Whole: Reflects how well the set of independent variables collectively predicts the dependent variable.
Predictor Effect: Focuses on individual predictors' contributions to the dependent variable after accounting for other variables.
Analysis includes determining whether each predictor is statistically significant, typically assessed via p-values in the regression output.
An overall model significance is typically checked using an F-test.
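The sketch below shows where the overall F-test and the individual predictor tests appear in a statsmodels results object, again reusing the illustrative grades model.

```python
# Sketch: overall model test (F) and individual predictor tests (t / p-values),
# using the fitted grades model from above.
results = sm.OLS(final_grade, X_full).fit()

print(results.fvalue, results.f_pvalue)  # F statistic and p-value: the model as a whole
print(results.tvalues)                   # t statistic for each coefficient
print(results.pvalues)                   # p-value for each predictor's unique effect
print(results.summary())                 # full table combining both levels of output
```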
For predicting Y:
Y = A + (β1 * IV1) + (β2 * IV2)
Substitute specific values of IV1 and IV2 to compute a predicted outcome (see the worked sketch below).
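As a worked example of substituting values, suppose (hypothetically) A = 40, β1 = 2 points per quiz, and β2 = 3 points per study hour; these coefficients are invented solely to show the arithmetic.

```python
# Sketch: substituting values into Y = A + (beta1 * IV1) + (beta2 * IV2).
# The coefficients are hypothetical, chosen only to illustrate the calculation.
A, beta1, beta2 = 40.0, 2.0, 3.0

def predict_grade(quizzes_completed: float, hours_studied: float) -> float:
    """Return the predicted final grade for given quiz and study-hour values."""
    return A + beta1 * quizzes_completed + beta2 * hours_studied

print(predict_grade(5, 4))  # 40 + (2 * 5) + (3 * 4) = 62
```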
Multiple linear regression allows for a richer analysis than simple linear regression by accounting for multiple factors simultaneously (and, when interaction terms are included, their interactions).
It aids researchers in making more nuanced conclusions based on the interplay of various independent variables.