Regression Analysis
A statistical technique for finding the best-fitting straight line for a set of data, allowing predictions based on correlations.
Linear Equation
Y = bX + a; describes the linear relationship between two variables, where Y is the dependent variable and X is the independent variable.
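A minimal sketch of the linear equation Y = bX + a in code, using hypothetical values for the slope b and the Y-intercept a:

```python
# Predict Y from X using the linear equation Y = bX + a.
# The slope b = 2.0 and intercept a = 1.0 are illustrative values only.
def predict(x, b=2.0, a=1.0):
    """Return the predicted Y for a given X."""
    return b * x + a

print(predict(0))  # when X = 0, the prediction equals the Y-intercept a
print(predict(3))
```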
Correlation
A measure of the relationship between two variables, describing how they change in relation to each other.
Dependent Variable (DV)
The outcome variable that is being predicted or explained in a regression analysis.
Independent Variable (IV)
The predictor variable that is manipulated or used to explain changes in the dependent variable.
Y-intercept (a)
The value of Y when X equals 0 in the linear equation Y = bX + a.
Slope Constant (b)
Indicates how much Y changes when X increases by one unit in the equation Y = bX + a.
Best Fit Line
The line through the data points that minimizes the sum of the squared vertical distances from each point to the line, providing the best prediction of Y.
Standard Error of Estimate (SEE)
A measure of the accuracy of predictions in regression analysis, indicating the standard distance between the regression line and actual data points.
Coefficient of Determination (r²)
Proportion of variability in Y predicted by its relationship with X, indicating the strength of the relationship.
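The best-fit line, standard error of estimate, and r² above can all be computed from their definitions. A self-contained sketch using a small hypothetical dataset:

```python
import math

# Tiny illustrative dataset (hypothetical values).
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]

n = len(X)
mean_x = sum(X) / n
mean_y = sum(Y) / n

# Least-squares slope b and intercept a for the best-fit line Y = bX + a.
ss_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(X, Y))
ss_xx = sum((x - mean_x) ** 2 for x in X)
b = ss_xy / ss_xx
a = mean_y - b * mean_x

# Predicted values, residual sum of squares, and total sum of squares.
Y_hat = [b * x + a for x in X]
ss_res = sum((y, yh)[0] ** 0 * (y - yh) ** 2 for y, yh in zip(Y, Y_hat))
ss_tot = sum((y - mean_y) ** 2 for y in Y)

# r^2: proportion of variability in Y explained by X.
r_squared = 1 - ss_res / ss_tot
# SEE: standard distance between the regression line and the data points.
see = math.sqrt(ss_res / (n - 2))

print(round(b, 2), round(a, 2), round(r_squared, 2), round(see, 2))
```

With this data the line is Y = 0.6X + 2.2, r² = 0.6, and SEE ≈ 0.89.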
Simple Linear Regression
Predicts the dependent variable from one independent variable.
Multiple Linear Regression
Predicts the dependent variable from two or more independent variables.
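A multiple-regression sketch with two predictors, solving the least-squares normal equations directly with Cramer's rule. The data are hypothetical, generated from Y = 2·X1 + X2 − 1 so the fitted coefficients can be checked by eye:

```python
# Hypothetical data generated exactly from Y = 2*X1 + 1*X2 - 1.
X1 = [1, 2, 3, 4, 5]
X2 = [2, 1, 4, 3, 5]
Y = [3, 4, 9, 10, 14]
n = len(Y)

# Normal equations for least squares: M @ [b1, b2, a] = v.
M = [
    [sum(x * x for x in X1), sum(p * q for p, q in zip(X1, X2)), sum(X1)],
    [sum(p * q for p, q in zip(X1, X2)), sum(x * x for x in X2), sum(X2)],
    [sum(X1), sum(X2), n],
]
v = [
    sum(x * y for x, y in zip(X1, Y)),
    sum(x * y for x, y in zip(X2, Y)),
    sum(Y),
]

def det3(m):
    """Determinant of a 3x3 matrix."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# Cramer's rule: replace each column of M with v in turn.
d = det3(M)
coeffs = []
for col in range(3):
    Mc = [row[:] for row in M]
    for row in range(3):
        Mc[row][col] = v[row]
    coeffs.append(det3(Mc) / d)

b1, b2, a = coeffs
print(b1, b2, a)  # recovers the generating coefficients 2, 1, -1
```

In practice a library routine (e.g. `numpy.linalg.lstsq` or statsmodels' OLS) would be used instead of solving the normal equations by hand; the point here is only to show what "predicting Y from two or more IVs" computes.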
Homoscedasticity
The assumption that the variance of the residuals remains constant across all levels of the independent variable(s) (equivalently, across the predicted values of the dependent variable).
No Multicollinearity
The assumption that independent variables are not too highly correlated with each other.
Unstandardized Coefficients (B)
Expressed in the original units of the variables and used in the regression equation to predict Y scores; predictors must share the same scale for their B values to be directly comparable.
Standardized Coefficients (β)
Expressed in standard deviation units, allowing the relative strength of predictors to be compared in regression analysis.
Statistical Significance (p-value)
Indicates the probability of obtaining results like those observed if there were no real effect; regression results are typically deemed significant at threshold levels such as p < .05.
Residuals
The differences between the actual values and the predicted values in a regression model.
Durbin-Watson Test
A statistical test used to check the independence of residuals in regression analysis.
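The Durbin-Watson statistic has a simple closed form: the sum of squared differences between successive residuals, divided by the sum of squared residuals. Values near 2 suggest independent residuals; values near 0 or 4 suggest autocorrelation. A sketch with hypothetical residuals:

```python
# Durbin-Watson statistic computed from a list of residuals.
# DW = sum((e_t - e_{t-1})^2) / sum(e_t^2); values near 2 suggest independence.
def durbin_watson(residuals):
    num = sum((residuals[i] - residuals[i - 1]) ** 2
              for i in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den

dw = durbin_watson([0.5, -0.3, 0.2, -0.4, 0.1])  # hypothetical residuals
print(round(dw, 2))
```

In practice this is available as `durbin_watson` in statsmodels rather than hand-rolled.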
Shapiro-Wilk Test
A statistical test used to check the normality of residuals in regression analysis.