Vocabulary flashcards for Psych 10 Final Exam Study Guide.
ANOVA (Analysis of Variance)
Statistical method used to compare means of three or more groups to determine if at least one differs significantly.
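A minimal sketch of a one-way ANOVA using SciPy; the three diet-group score lists are made-up example data, not from the study guide:

```python
from scipy import stats

# Made-up scores for three diet groups (three levels of one factor)
diet_a = [5.1, 4.8, 6.2, 5.5, 5.0]
diet_b = [6.9, 7.1, 6.4, 7.3, 6.8]
diet_c = [4.2, 4.5, 3.9, 4.8, 4.1]

# One-way ANOVA: does at least one group mean differ significantly?
f_stat, p_value = stats.f_oneway(diet_a, diet_b, diet_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```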
Factor
Independent variable used to sort data into groups for comparison in ANOVA.
Level
Category or condition within a factor (e.g., 3 types of diets = 3 levels).
One-way vs. Two-way ANOVA
One-way ANOVA analyzes the effect of one factor; two-way ANOVA examines the effects of two factors and their interaction.
Between-subjects vs. Within-subjects Factor
A between-subjects factor assigns different participants to each level; a within-subjects factor has the same participants experience all levels.
Assumptions of a One-Way ANOVA
Independence of observations, Normal distribution of residuals, Homogeneity of variances (equal variances across groups).
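A hedged sketch of how the normality and equal-variance assumptions might be checked with SciPy (Shapiro-Wilk and Levene's tests); the group data are the same made-up scores as above:

```python
from scipy import stats

diet_a = [5.1, 4.8, 6.2, 5.5, 5.0]
diet_b = [6.9, 7.1, 6.4, 7.3, 6.8]
diet_c = [4.2, 4.5, 3.9, 4.8, 4.1]

# Shapiro-Wilk test of normality for each group
for name, group in [("A", diet_a), ("B", diet_b), ("C", diet_c)]:
    w, p = stats.shapiro(group)
    print(f"Diet {name}: Shapiro-Wilk p = {p:.3f}")

# Levene's test of homogeneity of variances across groups
stat, p = stats.levene(diet_a, diet_b, diet_c)
print(f"Levene p = {p:.3f}")  # p > .05 is consistent with roughly equal variances
```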
Post-hoc Comparisons
Tests that identify which group means differ after a significant ANOVA result.
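One common post-hoc test is Tukey's HSD; a minimal sketch using statsmodels, again with made-up scores and group labels:

```python
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Made-up scores and their group labels (5 participants per group)
scores = np.array([5.1, 4.8, 6.2, 5.5, 5.0,
                   6.9, 7.1, 6.4, 7.3, 6.8,
                   4.2, 4.5, 3.9, 4.8, 4.1])
groups = np.repeat(["A", "B", "C"], 5)

# Tukey's HSD: pairwise comparisons of group means after a significant ANOVA
result = pairwise_tukeyhsd(endog=scores, groups=groups, alpha=0.05)
print(result.summary())
```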
Univariate Statistics
Analysis involving only one dependent variable.
Multivariate Statistics
Analysis involving multiple dependent variables simultaneously.
Main Effect
The effect of one independent variable on the dependent variable, averaged across levels of other variables.
Interaction Effect
Occurs when the effect of one factor depends on the level of another factor.
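A sketch of a two-way (factorial) ANOVA that estimates both main effects and their interaction, using the statsmodels formula interface; the 2x2 "diet" by "exercise" data are invented for illustration:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Made-up 2x2 factorial data: factors "diet" and "exercise"
df = pd.DataFrame({
    "diet":     ["A", "A", "A", "A", "B", "B", "B", "B"] * 2,
    "exercise": ["yes", "yes", "no", "no"] * 4,
    "score":    [7.1, 6.8, 5.2, 5.0, 6.0, 6.3, 5.9, 6.1,
                 7.4, 6.9, 4.8, 5.3, 6.2, 6.5, 5.7, 6.0],
})

# Two-way ANOVA: main effect of diet, main effect of exercise, and their interaction
model = smf.ols("score ~ C(diet) * C(exercise)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```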
Collapse Over a Variable
Averaging across the levels of a variable to analyze the main effect of another variable.
Main Effect Mean
The average score for one level of a factor, collapsed across all levels of the other factor(s).
Cell Mean
The average score in a specific combination of two or more factor levels in a factorial design.
Complete Factorial Design
A design where all possible combinations of factor levels are tested.
Correlation Coefficient
A value (r) from -1 to +1 that shows the strength and direction of a linear relationship between two variables.
Positive Relationship
As one variable increases, so does the other.
Negative Relationship
As one variable increases, the other decreases.
Regression Line
A straight line that best fits data points, showing the predicted values of Y from X.
Linear Relationship
Straight-line relationship.
Curvilinear Relationship
Curved (e.g., U-shaped) relationship.
Pearson Correlation
Measures linear relationship (interval/ratio data).
Spearman Correlation
Measures rank-order (ordinal) relationships.
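A small sketch computing both coefficients with SciPy; the paired "hours studied" and "exam score" values are made-up example data:

```python
from scipy import stats

# Made-up paired scores: hours studied (X) and exam score (Y)
hours = [1, 2, 3, 4, 5, 6, 7, 8]
exam  = [52, 55, 61, 60, 68, 70, 75, 80]

# Pearson r: strength and direction of the linear relationship
r, p = stats.pearsonr(hours, exam)
print(f"Pearson r = {r:.2f} (p = {p:.4f})")  # positive, close to +1

# Spearman rho: rank-order (monotonic) relationship, appropriate for ordinal data
rho, p_rho = stats.spearmanr(hours, exam)
print(f"Spearman rho = {rho:.2f}")
```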
Restriction of Range Problem
Occurs when scores on a variable are sampled from only a narrow portion of their possible range; this reduces variability and can underestimate the true correlation.
Purpose of Regression
To predict the value of a dependent variable based on the value of one or more independent variables.
Y’ in Regression
The predicted value of the dependent variable (Y).
Predictor Variable
Independent variable (X).
Criterion Variable
Dependent variable (Y).
Linear Regression Equation
Y’ = a + bX, where a = intercept and b = slope.
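A minimal sketch of fitting Y' = a + bX with SciPy's linregress, using the same made-up hours/exam data as above; the 5-hour prediction is just an illustration:

```python
from scipy import stats

# Predictor X (hours studied) and criterion Y (exam score), made-up data
hours = [1, 2, 3, 4, 5, 6, 7, 8]
exam  = [52, 55, 61, 60, 68, 70, 75, 80]

# Fit the regression line; linregress returns the slope (b) and intercept (a)
fit = stats.linregress(hours, exam)
print(f"intercept a = {fit.intercept:.2f}, slope b = {fit.slope:.2f}")

# Predicted value Y' for a student who studies 5 hours
y_pred = fit.intercept + fit.slope * 5
print(f"Y' at X = 5: {y_pred:.1f}")
```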
Slope
Change in Y for a one-unit increase in X.
Y Intercept
The predicted value of Y when X = 0.
Variance of the Y Scores Around Y’
The spread (error) of actual Y values around the predicted Y’ values.
Standard Error of Estimate
The average distance between actual Y values and predicted Y’ values; measures accuracy of predictions.
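A sketch computing the standard error of estimate from the regression residuals, assuming the common textbook definition with N - 2 in the denominator for simple regression (some texts divide by N instead); data are the same made-up hours/exam scores:

```python
import numpy as np
from scipy import stats

hours = np.array([1, 2, 3, 4, 5, 6, 7, 8])
exam  = np.array([52, 55, 61, 60, 68, 70, 75, 80])

fit = stats.linregress(hours, exam)
y_pred = fit.intercept + fit.slope * hours   # predicted Y' values
residuals = exam - y_pred                    # actual Y minus predicted Y'

# Standard error of estimate: sqrt of the summed squared error divided by N - 2
see = np.sqrt(np.sum(residuals**2) / (len(exam) - 2))
print(f"Standard error of estimate = {see:.2f}")
```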
Homoscedasticity
Equal spread of residuals across X.
Heteroscedasticity
Unequal spread of residuals across X.