Independent samples t-test
Use this when comparing the means of two different groups to see if they significantly differ (e.g., test scores of Group A vs. Group B).
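A minimal sketch of this test in Python with SciPy (the scores below are made-up example data):

```python
from scipy import stats

group_a = [78, 85, 90, 72, 88]  # hypothetical test scores, Group A
group_b = [70, 75, 80, 68, 74]  # hypothetical test scores, Group B

t, p = stats.ttest_ind(group_a, group_b)  # independent samples t-test
print(f"t = {t:.2f}, p = {p:.3f}")        # compare p to .05
```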
Paired samples (related samples) t-test
Use this when comparing the means from the same group measured at two different times or under two conditions (e.g., pre-test vs. post-test scores of the same participants).
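A matching sketch for the paired case (same made-up participants measured twice):

```python
from scipy import stats

pre  = [60, 65, 70, 55, 62]  # hypothetical pre-test scores
post = [68, 70, 74, 61, 66]  # same participants, post-test

t, p = stats.ttest_rel(pre, post)  # paired samples t-test
print(f"t = {t:.2f}, p = {p:.3f}")
```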
One-way ANOVA
Use this when comparing the means of three or more independent groups to determine if at least one group differs significantly from the others.
Post-hoc tests (after ANOVA)
Use these when ANOVA results are significant, to determine exactly which group means are different from each other.
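A minimal sketch covering both cards: run the one-way ANOVA, then probe pairs only if the omnibus test is significant. Group data are made-up; `tukey_hsd` assumes SciPy ≥ 1.8.

```python
from scipy import stats

g1 = [4, 5, 6, 5]
g2 = [7, 8, 9, 8]
g3 = [4, 5, 5, 6]

f, p = stats.f_oneway(g1, g2, g3)  # one-way ANOVA across three groups
print(f"F = {f:.2f}, p = {p:.3f}")

if p < .05:                             # only run post-hoc when ANOVA is significant
    print(stats.tukey_hsd(g1, g2, g3))  # Tukey HSD pairwise comparisons
```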
Correlation
Use this when examining the strength and direction of the relationship between two continuous variables (e.g., hours studied and test score).
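A minimal sketch with SciPy (hours and scores are made-up example data):

```python
from scipy import stats

hours  = [1, 2, 3, 4, 5, 6]        # hypothetical hours studied
scores = [55, 60, 66, 70, 75, 82]  # hypothetical test scores

r, p = stats.pearsonr(hours, scores)  # strength and direction of the linear relationship
print(f"r = {r:.2f}, p = {p:.3f}")
```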
Regression
Use this when predicting the value of one variable based on the value of another (or others), especially to assess the effect of predictors on an outcome.
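A minimal simple-regression sketch on the same made-up data:

```python
from scipy import stats

hours  = [1, 2, 3, 4, 5, 6]        # hypothetical predictor
scores = [55, 60, 66, 70, 75, 82]  # hypothetical outcome

res = stats.linregress(hours, scores)  # simple linear regression
print(f"score = {res.intercept:.1f} + {res.slope:.1f} * hours")
print(f"R^2 = {res.rvalue**2:.2f}, p = {res.pvalue:.3f}")
```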
Chi-square goodness-of-fit
Use this to test whether the observed distribution of a categorical variable matches an expected distribution.
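A minimal sketch with made-up counts (expected counts assume all three options are equally popular):

```python
from scipy import stats

observed = [18, 22, 20]  # e.g., counts choosing options A, B, C
expected = [20, 20, 20]  # counts expected if all options were equally popular

chi2, p = stats.chisquare(observed, f_exp=expected)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```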
Chi-square test of independence
Use this to determine whether there is an association between two categorical variables in a contingency table.
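A minimal sketch for a made-up 2x2 contingency table:

```python
from scipy import stats

# Hypothetical contingency table: rows = group, columns = pass/fail
table = [[30, 10],
         [20, 20]]

chi2, p, df, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {df}, p = {p:.3f}")
```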
p-value
The p-value represents the probability of observing your results (or more extreme) if the null hypothesis were true.
Statistically significant result
If p < .05, the result is statistically significant, meaning there is likely a real effect or difference.
Not significant result
If p ≥ .05, the result is not significant, meaning any observed difference may be due to chance.
Test statistic value
SPSS reports a different test statistic depending on the test: t for t-tests, F for ANOVA, r for correlation, χ² (chi-square) for chi-square tests, and B (unstandardized) or beta (standardized) for regression.
Raw means
Check the Descriptives table in SPSS. Look under the 'Mean' column for each group or condition to understand average performance/scores.
Degrees of freedom (df)
Degrees of freedom (df) help define the shape of the statistical distribution and are reported with the test statistic (e.g., t(28), F(2, 45)).
Effect size
Include effect size if available (Cohen's d, eta-squared, R²) to interpret the magnitude of the effect.
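A minimal sketch of one common effect-size formula, Cohen's d with a pooled standard deviation (group scores are made-up):

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d using a pooled standard deviation (one common formula)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

print(cohens_d([78, 85, 90, 72, 88], [70, 75, 80, 68, 74]))
```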
Illusory correlation
Thinking two things go together when they don't. Example: Believing people act strange during full moons.
Third-variable problem
A hidden factor affects both things. Example: Ice cream sales and drowning rise in summer because it's hot.
Independent variable
What you change.
Dependent variable
What you measure or observe.
Significance in context
State whether the result was significant or not and what that means in context.
SPSS output interpretation
Explain what the SPSS output shows: report the test statistic, df, and p-value, and say what the result means for the research question.
True experiment
An experiment where participants are randomly assigned to groups. Example: give one group a real pill and another a sugar pill, then compare results.
Quasi-experiment
Like an experiment but without random assignment to groups. Example: comparing two already-formed classrooms.
Mean
Use for roughly normal data with no extreme outliers.
Median
Use when there are outliers.
Mode
Use for categories or most common number.
Positive skew
Tail goes right. Mean is pulled highest (mean > median > mode).
Negative skew
Tail goes left. Mean is pulled lowest (mean < median < mode).
Type I error
Saying there is a difference when there isn't (a false positive).
Type II error
Missing a real difference (a false negative).
p-value
If p is less than 0.05, the result is unlikely to be chance alone. If p is 0.05 or more, it might just be chance.
Post-hoc tests
ANOVA tells if groups are different. Post-hoc shows which groups differ.
Covariance
Shows if two things change together.
Pearson's r
Use when both variables are continuous and the relationship is linear. Don't use if the data are not normal.
Correlation strength
Closer to 1 or -1 means stronger.
r²
Shows how much of the variation in one variable is explained by the other. With r = 0.80, r² = 0.64, meaning 64% of the variance is explained.
Positive correlation
Dots on the scatterplot trend upward: as one variable increases, so does the other.
Negative correlation
Dots trend downward: as one variable increases, the other decreases.
Null hypothesis for correlation
No link (r = 0).
Restriction of range
Not enough spread in the data; makes the correlation look weaker than it really is.
Outliers
One extreme point can drastically change the correlation.
Linear regression
Use when you want to predict one thing from another.
Standard error of estimate
Tells how far, on average, predictions fall from the actual values.
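A minimal sketch under one common formula, sqrt(SS_residual / (n − 2)); the data are made-up:

```python
import numpy as np
from scipy import stats

x = np.array([1, 2, 3, 4, 5, 6])
y = np.array([55, 60, 66, 70, 75, 82])

res = stats.linregress(x, y)
predicted = res.intercept + res.slope * x
residuals = y - predicted
see = np.sqrt(np.sum(residuals**2) / (len(x) - 2))  # typical size of a prediction error
print(f"standard error of estimate = {see:.2f}")
```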
Multiple regression
Use when you predict with two or more things at once.
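A minimal multiple-regression sketch using statsmodels (assumes it is installed; all data are made-up):

```python
import numpy as np
import statsmodels.api as sm

hours  = [1, 2, 3, 4, 5, 6]        # hypothetical predictor 1
sleep  = [8, 7, 7, 6, 8, 7]        # hypothetical predictor 2
scores = [55, 60, 66, 70, 75, 82]  # hypothetical outcome

X = sm.add_constant(np.column_stack([hours, sleep]))  # add intercept term
model = sm.OLS(scores, X).fit()
print(model.params)    # intercept and B weight for each predictor
print(model.rsquared)  # proportion of variance explained
```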
Nonparametric tests
Tests for data that isn't normally distributed or that is categorical (counts or groups).
Chi-square goodness-of-fit
Use when checking whether the observed counts of one categorical variable match what you expected.
Chi-square test of independence
Use when checking whether two categorical variables are related.
Cramer's V
Shows how strong the association is in a chi-square test, from 0 (no association) to 1 (perfect association).
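A minimal sketch computing Cramer's V from a chi-square result (the contingency table is made-up):

```python
import numpy as np
from scipy import stats

table = np.array([[30, 10],
                  [20, 20]])

chi2, p, df, _ = stats.chi2_contingency(table)
n = table.sum()                      # total observations
k = min(table.shape) - 1             # smaller of (rows - 1, columns - 1)
v = np.sqrt(chi2 / (n * k))          # Cramer's V: 0 = no link, 1 = perfect link
print(f"V = {v:.2f}")
```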