Pearson r Correlation Test
Statistical test used to measure the strength and direction of the linear relationship between two continuous variables.
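This definition can be illustrated with a short Python sketch of the sample correlation coefficient itself (the function name is illustrative; libraries such as SciPy provide a tested version):

```python
import math

def pearson_r(x, y):
    # Sample Pearson correlation: covariance of x and y divided by
    # the product of their standard deviations (computed from sums here)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# A perfectly linear increasing relationship gives r = 1,
# a perfectly linear decreasing one gives r = -1
x = [1, 2, 3, 4, 5]
```

The sign of r gives the direction of the relationship and its magnitude (between 0 and 1) gives the strength.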
Regression Analysis
A statistical method for modeling the relationship between a dependent variable and one or more independent variables.
Regression Line Equation
The equation that represents the best fitting line through a set of data points: Y = β₀ + β₁X + ε.
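For simple regression, β₀ and β₁ have closed-form least-squares estimates, sketched below in Python (function name is illustrative):

```python
def ols_fit(x, y):
    # Closed-form ordinary least squares estimates for Y = b0 + b1*X
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Slope: covariance of x and y over variance of x
    b1 = (sum((a - mx) * (b - my) for a, b in zip(x, y))
          / sum((a - mx) ** 2 for a in x))
    # Intercept: the line passes through the point of means
    b0 = my - b1 * mx
    return b0, b1
```

Fitting data that lies exactly on Y = 1 + 2X recovers β₀ = 1 and β₁ = 2; ε captures the scatter of real data around the line.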
Joint Variation
When one variable varies directly or inversely with two or more other variables at once (for example, z = kxy).
Partial Correlation
A measure of the relationship between two variables while controlling for the effects of one or more additional variables.
Listwise Deletion
A data handling method where an entire row is removed if any value in that row is missing.
Pairwise Deletion
A method that excludes cases with missing values only for the specific variables being analyzed.
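The difference between listwise and pairwise deletion can be shown with a small Python sketch using `None` for missing values (variable names are illustrative):

```python
rows = [
    (1.0, 2.0, 3.0),
    (4.0, None, 6.0),   # missing in column 1
    (7.0, 8.0, None),   # missing in column 2
    (2.0, 3.0, 4.0),
]

# Listwise deletion: drop any row containing a missing value anywhere
listwise = [r for r in rows if None not in r]

# Pairwise deletion: for an analysis using only columns 0 and 1,
# keep rows that are complete on just those two columns
pairwise_01 = [(r[0], r[1]) for r in rows
               if r[0] is not None and r[1] is not None]
```

Pairwise deletion retains more cases for each analysis (three rows here versus two), at the cost of different analyses being based on different subsets of the data.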
Standardization
The process of scaling individual data points so they can be compared across different variables.
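The most common form of standardization is the z-score, sketched here in Python (using the population standard deviation; the function name is illustrative):

```python
import math

def standardize(values):
    # z = (x - mean) / standard deviation, so the result
    # has mean 0 and standard deviation 1
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    return [(v - mean) / sd for v in values]
```

After standardization, values from variables measured in different units (e.g., dollars and years) are on the same scale and can be compared directly.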
Covariance
A measure of how much two random variables change together; its magnitude depends on the units of the variables, so it does not provide a meaningful, comparable scale.
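A short Python sketch of the sample covariance (with the n − 1 denominator) also shows why its scale is not meaningful: rescaling one variable rescales the covariance by the same factor.

```python
def covariance(x, y):
    # Sample covariance: average product of deviations, n-1 denominator
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
```

Multiplying y by 10 multiplies the covariance by 10, which is why correlation (covariance rescaled by the standard deviations) is used when a unit-free measure is needed.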
Non-Parametric Tests
Statistical tests that do not assume a specific distribution for the underlying population; often used for hypothesis testing with ordinal or categorical data, or when the assumptions of parametric tests are not met.
Odds Ratio
A measure of association between an exposure and an outcome; it is the odds of the event occurring in the exposed group compared to the non-exposed group.
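From a standard 2×2 table, the odds ratio is the odds of the event in the exposed group divided by the odds in the unexposed group, as in this Python sketch (cell labels follow the usual a/b/c/d convention):

```python
def odds_ratio(a, b, c, d):
    # 2x2 table cells:
    #   a = exposed with event,   b = exposed without event
    #   c = unexposed with event, d = unexposed without event
    # OR = (a/b) / (c/d), equivalently (a*d) / (b*c)
    return (a / b) / (c / d)
```

An odds ratio of 1 means no association; values above 1 mean the event is more likely in the exposed group.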
Standardized Beta Coefficients
Indicate the change in standard deviation units of the dependent variable (Y) for each standard deviation increase in the independent variable (X).
Intercept (β₀)
The predicted value of Y when all predictors are equal to zero.
Dummy Coding
A method to convert categorical variables into a format suitable for regression analysis by creating binary (0/1) variables.
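A minimal Python sketch of dummy coding (function name illustrative; pandas offers `get_dummies` for the same task): one category is dropped as the reference level, and each remaining category gets its own 0/1 column.

```python
def dummy_code(values, reference):
    # One binary 0/1 indicator column per non-reference category;
    # the reference level is represented by all indicators being 0
    levels = sorted(set(values) - {reference})
    return {lvl: [1 if v == lvl else 0 for v in values] for lvl in levels}
```

A categorical variable with k levels therefore becomes k − 1 dummy variables, and each coefficient is interpreted relative to the reference category.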
Power of a Test
The probability of correctly rejecting a false null hypothesis; influenced by sample size, effect size, and alpha level.
Sum of Squared Differences (SS)
A measure of the total variability present in a set of data.
Model Sum of Squares
The portion of variability in the dependent variable that is explained by the model.
R² (R-squared)
The proportion of variance in the dependent variable (Y) that can be explained by the independent predictor variables.
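R² ties the sums of squares above together: it is one minus the residual sum of squares over the total sum of squares, as in this Python sketch (function name illustrative):

```python
def r_squared(y, y_hat):
    # R^2 = 1 - SS_residual / SS_total
    my = sum(y) / len(y)
    ss_tot = sum((v - my) ** 2 for v in y)          # total variability
    ss_res = sum((v - p) ** 2 for v, p in zip(y, y_hat))  # unexplained
    return 1 - ss_res / ss_tot
```

Perfect predictions give R² = 1, while a model that only ever predicts the mean of Y gives R² = 0.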
Standard Deviation
A statistic that shows the dispersion or spread of a set of data; the square root of the variance.
Y
The outcome variable in regression analysis.
β₀ (Beta-0)
The intercept; the baseline value of Y when all predictors equal zero.