These flashcards cover key terms and concepts related to regression analysis and bivariate analysis, providing definitions and explanations crucial for understanding the material.
Regression Analysis
A statistical method used to describe the strength and direction of the association between an independent variable and a dependent variable.
Ordinary Least Squares (OLS)
A common form of regression that estimates the relationship between variables by minimizing the sum of squared differences between observed and predicted values.
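The OLS estimates for a single predictor can be sketched directly from this definition; the following is a minimal illustration (the data values are hypothetical):

```python
# Minimal OLS sketch for one predictor: find the intercept and slope
# that minimize the sum of squared differences between observed and
# predicted values. Toy data below is hypothetical.
def ols_fit(x, y):
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Slope = sum of cross-deviations / sum of squared x-deviations
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

x = [1, 2, 3, 4]
y = [3, 5, 7, 9]          # exactly y = 1 + 2x, so the fit is perfect
b0, b1 = ols_fit(x, y)     # → (1.0, 2.0)
```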
Dependent Variable
The outcome variable in a regression analysis that is being predicted or explained.
Independent Variable
The predictor variable in regression analysis that is used to explain changes in the dependent variable.
Intercept (β0)
The expected value of the dependent variable when the independent variable is zero.
Slope Coefficient (βYX)
Indicates how much the dependent variable is expected to change when the independent variable increases by one unit.
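The intercept and slope definitions above can be read off a fitted equation; a small sketch (coefficient values are hypothetical, as if already estimated):

```python
# Interpreting fitted coefficients: the intercept is the predicted y
# at x = 0, and each one-unit increase in x changes the prediction by
# the slope. The coefficient values here are hypothetical.
b0, b1 = 1.0, 2.0                  # intercept and slope, assumed estimated

def predict(x):
    return b0 + b1 * x

at_zero = predict(0)               # → 1.0 (the intercept)
per_unit = predict(5) - predict(4) # → 2.0 (the slope)
```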
Sum of Squared Errors (SSE)
The sum of the squared differences between observed values and the values predicted by the model.
R-squared (R²)
A statistic that indicates the proportion of the total variation in the dependent variable that can be explained by the independent variable.
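SSE and R² fit together as R² = 1 − SSE/SST; a minimal sketch, with hypothetical observed values and predictions:

```python
# R² as 1 - SSE/SST: the share of total variation in y explained by
# the model. The data and predictions below are hypothetical.
def r_squared(y, y_hat):
    mean_y = sum(y) / len(y)
    sse = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))  # unexplained
    sst = sum((yi - mean_y) ** 2 for yi in y)              # total
    return 1 - sse / sst

y     = [3.0, 5.0, 7.0, 10.0]   # observed values
y_hat = [3.1, 5.2, 6.8, 9.9]    # predictions from some fitted model
r2 = r_squared(y, y_hat)         # close to 1: most variation explained
```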
Standard Error of the Estimate (s.e.)
A measure of the typical distance that observed values fall from the regression line.
p-value
The probability of observing a t-statistic at least as extreme as the one computed, assuming the true coefficient is zero; a small p-value suggests a statistically significant relationship.
Confidence Interval (CI)
A range of plausible values for the population slope coefficient that gives an estimate of uncertainty around the coefficient.
Covariance
A measure that describes the direction of the relationship between two variables but not its strength, because its magnitude depends on the units in which the variables are measured.
Correlation Coefficient (r)
A standardized measure that describes both the direction and magnitude of the linear relationship between two variables.
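The contrast between covariance and the correlation coefficient can be sketched with sample formulas; the data below is a hypothetical toy set:

```python
# Covariance is unit-dependent; dividing by both standard deviations
# standardizes it into r, which always lies in [-1, 1]. Toy data is
# hypothetical.
def cov(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)

def corr(x, y):
    sx = cov(x, x) ** 0.5   # sample standard deviation of x
    sy = cov(y, y) ** 0.5   # sample standard deviation of y
    return cov(x, y) / (sx * sy)

x = [1, 2, 3, 4]
y = [3, 5, 7, 9]                      # perfectly linear in x
r1 = corr(x, y)                       # ≈ 1.0
r2 = corr(x, [10 * v for v in y])     # still ≈ 1.0: r is scale-free
# cov(x, y) vs cov(x, 10*y) differ tenfold, but r is unchanged
```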
Best Fit Line
The line that minimizes the sum of squared differences between the observed data points and the line itself.
Statistical Inference
The process of using sample data to make generalizations or predictions about a population.
Pearson's r
A correlation coefficient that assesses the strength and direction of a linear relationship between two continuous variables.