Regression Analysis
Statistical method to examine the strength and direction of relationships between variables.
simple (bivariate) regression
Regression with one independent variable and one dependent variable.
multiple regression
Regression with two or more independent variables predicting a single dependent variable.
Beta coefficient
The standardized regression coefficient: how strong each predictor is relative to the others, on a common scale.
Bigger absolute beta = stronger predictor.
Coefficient of Determination (r²)
Proportion of variance in the dependent variable explained by the model; the percentage of your outcome that your model actually explains. "Oh, so my model explains 30% of what's going on. The other 70% is chaos."
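A tiny NumPy sketch of how r² falls out of a fitted line. The data (hours studied vs. exam score) is made up for illustration:

```python
import numpy as np

# Hypothetical data: hours studied (x) vs. exam score (y)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([52.0, 55.0, 61.0, 60.0, 68.0, 71.0])

# Fit a simple regression line y = b0 + b1*x
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

# r² = 1 - (unexplained variance / total variance)
ss_res = np.sum((y - y_hat) ** 2)        # sum of squared residuals
ss_tot = np.sum((y - np.mean(y)) ** 2)   # total sum of squares
r_squared = 1 - ss_res / ss_tot
print(round(r_squared, 3))
```

Whatever r² doesn't cover is the unexplained variance from further down this deck.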
Covariation
Two variables change together; needed for correlation but not necessarily causation.
composite variable
a single variable created by combining two or more individual variables, often by adding, averaging, or multiplying them
Confirmatory composite analysis (CCA)
A statistical check to confirm whether the items in your composite variable actually fit together the way you thought they did.
Curvilinear relationship
a type of non-linear association between two variables where the relationship changes direction. For example, as one variable increases, the other may first increase and then decrease (an inverted-U shape)
Linear relationship
A straight-line relationship between variables. As X increases, Y increases or decreases at a consistent rate.
Scatter diagram
Visual display of dots, showing the relationship between two variables.
Pearson correlation coefficient
Measures the strength and direction of a straight-line relationship between two continuous, normally distributed variables.
Spearman rank order correlation coefficient
Correlation measure for ranked or non-normal data. Good when your data is messy or the relationship isn't linear.
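A quick NumPy sketch of the difference, using made-up data that is monotonic but curved, so Pearson understates the relationship while Spearman (Pearson on the ranks) catches it:

```python
import numpy as np

# Hypothetical data: y rises with x, but along a curve, not a line
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = x ** 3  # curved but strictly increasing

pearson = np.corrcoef(x, y)[0, 1]

# Spearman = Pearson correlation computed on the ranks
rx = np.argsort(np.argsort(x))
ry = np.argsort(np.argsort(y))
spearman = np.corrcoef(rx, ry)[0, 1]

print(round(pearson, 3), round(spearman, 3))
```

Because both variables rise in lockstep, Spearman comes out at (numerically) 1 while Pearson stays below it.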
Unexplained variance
The part of the outcome that your model does not account for. The remaining randomness.
covariation vs correlation
covariation: Two variables change together in any pattern.
Correlation: A specific type of covariation that measures the strength and direction of a linear relationship.
Least squares procedure
The method regression uses to draw the line that minimizes the sum of squared errors. Basically: "Pick the line that misses the least."
Ordinary least squares (OLS)
The standard, default method for estimating regression coefficients using the least squares approach.
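A minimal OLS sketch with simulated data (the true coefficients 2.0, 1.5, and -0.5 are invented so we can check the fit recovers them):

```python
import numpy as np

# Simulated multiple regression: predict y from two predictors
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
y = 2.0 + 1.5 * x1 - 0.5 * x2 + rng.normal(scale=0.1, size=100)

# Design matrix with an intercept column; OLS picks the b
# that minimizes the sum of squared errors ||y - Xb||^2
X = np.column_stack([np.ones(100), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(b, 2))  # close to the true [2.0, 1.5, -0.5]
```

The recovered coefficients are the regression coefficients from the previous card: how much y moves per one-unit change in each x.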
Regression coefficient
"How much does Y change when X changes by one unit?" The number that tells you how powerful X is.
Model F statistic
A big "Is this whole model even doing anything??" test.
Homoskedasticity
When the spread of errors in your regression model is roughly the same across all levels of X. Nice, even scatter.
Heteroskedasticity
When the spread of errors changes depending on X. The scatter gets bigger or smaller — not even.
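A small simulation of heteroskedasticity (the data-generating rule is made up): the noise is deliberately built to grow with x, and the residuals show it.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(1, 10, 200)

# Heteroskedastic by construction: the error scale grows with x
y = 3.0 * x + rng.normal(scale=0.5 * x)

# Fit the line, then compare residual spread in the low-x vs high-x halves
b1, b0 = np.polyfit(x, y, 1)
resid = y - (b0 + b1 * x)
low_spread = resid[:100].std()
high_spread = resid[100:].std()
print(low_spread < high_spread)  # True: the scatter widens as x grows
```

With homoskedastic data the two spreads would come out roughly equal instead.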
Normal curve
The famous bell-shaped distribution where most scores are in the middle and fewer are at the extremes.
Multicollinearity
When predictor variables in a regression are highly correlated with each other, making it hard to tell which one is doing the predicting.
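One common way to spot this is the variance inflation factor (VIF): regress each predictor on the others and see how much of it they explain. A sketch with simulated predictors, where x2 is built to nearly duplicate x1:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)  # almost a copy of x1
x3 = rng.normal(size=n)                  # independent predictor

def vif(target, others):
    """VIF = 1 / (1 - R²) from regressing one predictor
    on the remaining predictors."""
    X = np.column_stack([np.ones(len(target))] + others)
    b, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ b
    r2 = 1 - resid.var() / target.var()
    return 1 / (1 - r2)

print(round(vif(x1, [x2, x3]), 1))  # large: x1 is nearly determined by x2
print(round(vif(x3, [x1, x2]), 1))  # near 1: x3 is independent
```

A common rule of thumb treats VIF values above about 5-10 as a multicollinearity warning sign.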
Partial least squares (PLS)
A type of structural modeling good for small samples, messy data, or when variables don't meet strong statistical assumptions.
Structural Equation Modeling
SEM is a statistical method that lets you test a bunch of relationships between variables all at the same time. It combines factor analysis (measuring constructs) and regression (predicting stuff) into one big model.
Structural Model
This is one half of SEM.
The structural model shows: how the big concepts (constructs) relate to each other.
Basically the "cause-and-effect map" of the theory you're testing. Example: Satisfaction → Loyalty → Word-of-mouth
Measurement Model
The other half of SEM. The measurement model shows how well your survey items measure the constructs they're supposed to measure.
In other words: Do your questions actually represent the idea (construct) you claim they measure?