Correlation
a relationship between two variables
Positive relationships
an increase in one variable predicts an increase in the other
Negative relationships
an increase in one variable predicts a decrease in the other
A correlation alone cannot be used to make a definitive statement about
causation
Sign in a correlation
Direction
Number in a correlation
Strength
Which graph is the most effective way of presenting relationship data?
Scatterplots
Negative relationship scatterplot
slanting downward
Positive relationship scatterplot
Slanting upward
Curvilinear relationships scatterplot
Curve in the graph
Assumptions of the Pearson Correlation
- Uses two variables
- Both quantitative*
- Linear relationship
- Minimal skew/no large outliers
- Must observe the whole range for each variable
Pearson r correlation coefficient
a way of numerically expressing correlation
Pearson r range
-1 to +1
(Nonparametric Analysis) Spearman's Rank data
for ordinal and skewed data
Kendall's tau-b
for ordinal and skewed data, less affected by error
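As a sketch of how these rank-based coefficients can be computed (assuming SciPy is available; the data below are invented for illustration):

```python
from scipy import stats

# Made-up ordinal-style data for illustration.
x = [1, 2, 3, 4, 5, 6]
y = [2, 1, 4, 3, 6, 5]

rho, p_rho = stats.spearmanr(x, y)    # Spearman's rank correlation
tau, p_tau = stats.kendalltau(x, y)   # Kendall's tau (tau-b when ties are present)
```

Both coefficients stay between -1 and +1, like Pearson's r, but operate on ranks rather than raw values.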
ETA
a special coefficient used for curvilinear relationships
Interpreting Correlation Values calculation
r^2 * 100
r^2 * 100 meaning
% of variance in one variable accounted for by the other variable
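A minimal sketch of computing Pearson's r and the r² × 100 interpretation, assuming SciPy is available (the hours/score numbers are made up for illustration):

```python
from scipy import stats

# Hypothetical data: hours studied vs. exam score.
hours = [1, 2, 3, 4, 5, 6]
score = [55, 60, 58, 70, 72, 80]

r, p = stats.pearsonr(hours, score)   # r in [-1, +1], plus its p value
shared = r**2 * 100                   # % of variance shared between the variables
```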
Regression
Predict an output
How does Regression differ from Correlation? (correlation)
quantifies the strength of the linear relationship
How does Regression differ from Correlation? (regression)
Expresses the relationship in the form of an equation.
Finding a linear regression line equation
y = mx + b
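The line's slope (m) and intercept (b) can be fitted with, e.g., SciPy's `linregress`; a sketch with invented data:

```python
from scipy import stats

# Made-up data with an approximately linear trend.
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

res = stats.linregress(x, y)          # slope = m, intercept = b
pred = res.slope * 6 + res.intercept  # predict y for a new x value
```

This is what "expresses the relationship in the form of an equation" means in practice: once m and b are estimated, any new x can be plugged in to predict y.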
What is a requirement regarding the number of variables in linear regression?
Requires 2 or more scalar variables
What types of variables are involved in linear regression?
Dependent variable and one or more Independent variables
What type of relationship does linear regression assume between variables?
Linear relationship
What is the homoscedasticity assumption in linear regression?
The data must be homoscedastic
Homoscedasticity
the property of a dataset having variability that is similar across its whole range
Heteroscedastic
variability changes across the dataset's range, e.g., a scatterplot that fans out wider as values increase
Linear Regression Output (R)
the correlation between the model's predicted values and the observed values of the dependent variable
R^2
coefficient of determination
Adjusted R2
R² adjusted for the number of independent variables and the sample size
Std. Err. of the Estimate
A measure of how accurately the model predicts the dependent variable
ANOVA tells us
the independent variables overall predict the dependent
Unstandardized B
the unit change in the dependent per unit change in the independent
Beta
the standardized coefficient; tells you how strongly this variable predicts the dependent
t & sig
if the variable was a significant predictor of the dependent
t-test
evaluate the size and significance of the difference between two means
One-sample t-Test
When you want to compare a sample mean to some known or hypothesized value
Independent samples t-Test
compare two groups to one another
Repeated Measures t-Test
How a group changes over time
One sample t-test data
interval or ratio data
What is unknown in a One sample t-test
The true standard error of the mean
Degrees of freedom in a one-sample t-test.
n -1
one-sample t-test statistics (t)
test statistic
one-sample t-test statistics (df)
degrees of freedom
one-sample t-test statistics (sig)
p value for the test statistic
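A minimal one-sample t-test sketch, assuming SciPy is available (the temperature-style readings below are invented):

```python
from scipy import stats

# Hypothetical sample compared against a known/hypothesized mean of 98.6.
sample = [98.2, 98.6, 98.4, 99.0, 98.8, 98.5]

t, p = stats.ttest_1samp(sample, popmean=98.6)
df = len(sample) - 1  # degrees of freedom: n - 1
```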
Independent Samples t-test data
interval or ratio data
What makes the samples in an Independent Samples t-test different from one another
The samples are independent from one another
Independent Samples t-test Degrees of freedom
n-2
Test Variables
the variables you want to investigate
Grouping Variable
the variable that will split your data into two groups
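An independent-samples sketch, assuming SciPy is available; the two hypothetical groups below stand in for data already split by a grouping variable:

```python
from scipy import stats

# Made-up scores for two independent groups.
group_a = [12, 14, 11, 15, 13]
group_b = [18, 17, 19, 16, 20]

t, p = stats.ttest_ind(group_a, group_b)
df = len(group_a) + len(group_b) - 2  # degrees of freedom: n - 2
```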
Paired Samples t-Test data
interval or ratio data
Paired Samples t-Test measures
two variables with values paired by subject
Paired Samples t-Test Degrees of freedom
(number of pairs) - 1, i.e., (n / 2) - 1 when n counts all individual scores
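A paired-samples sketch, assuming SciPy is available (the before/after values are invented, with scores paired by subject):

```python
from scipy import stats

# Hypothetical before/after measurements for the same five subjects.
before = [10, 12, 9, 11, 14]
after_ = [12, 16, 10, 14, 20]

t, p = stats.ttest_rel(before, after_)
df = len(before) - 1  # degrees of freedom: number of pairs - 1
```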
Reporting Pearson correlation
r(df) = [r value], p = [p value], where df = N - 2
Nonparametric tests
No need for normality
Nonparametric tests data
ranked/ordinal data
The 1-Sample Kolmogorov-Smirnov test
used to test whether data are normally distributed
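A sketch of a one-sample K-S normality check with SciPy (data simulated here for illustration; the sample is standardized before comparing it to the standard normal):

```python
import numpy as np
from scipy import stats

# Simulated, roughly normal data for illustration.
rng = np.random.default_rng(0)
data = rng.normal(loc=50, scale=5, size=100)

# Standardize, then compare against the standard normal distribution.
z = (data - data.mean()) / data.std(ddof=1)
stat, p = stats.kstest(z, 'norm')  # large p -> no evidence against normality
```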
Independent Samples will produce
Mann-Whitney U
Related Samples will let you calculate
Wilcoxon Signed-Rank
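Both nonparametric tests are available in SciPy; a sketch with invented data (independent groups for Mann-Whitney U, subject-paired values for the Wilcoxon test):

```python
from scipy import stats

# Made-up scores for two independent groups.
g1 = [3, 5, 2, 6, 4]
g2 = [8, 9, 7, 10, 11]
u, p_u = stats.mannwhitneyu(g1, g2)      # independent samples

# Made-up before/after values paired by subject.
before = [10, 12, 9, 11, 14]
after_ = [12, 16, 10, 14, 20]
w, p_w = stats.wilcoxon(before, after_)  # related samples
```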
Exact significance
the exact significance is calculated from all potential distributions
Asymptotic significance
calculated using an estimated curve
Monte-Carlo significance
uses a random process to estimate the significance using areas under the curve
One Sample T
comparing a sample to a hypothesized mean
Independent Samples T
comparing two groups on some value
Repeated Measures T
comparing two variables within a set of subjects
Parametric or Nonparametric?
Parametric because they are more statistically powerful
ANOVA stand for
ANalysis Of VAriance
ANOVA concerns
sources of variance in a dataset
One-Way ANOVA is more similar to
t-Tests than it is to the other ANOVA tests
F-ratio
The variability between groups divided by the variability within groups
What kind of test is ANOVA
Omnibus test
Omnibus test
an overall difference exists, without going into the specifics of any differences
One Factor ANOVA type of data
interval or ratio data
Samples in One Factor ANOVA
independent from one another
Degrees of freedom in One Factor ANOVA
(number of scores) - 1
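A one-factor ANOVA sketch with SciPy, using three invented groups; the F-ratio is the between-groups variability divided by the within-groups variability:

```python
from scipy import stats

# Made-up scores for three independent groups.
g1 = [4, 5, 6, 5]
g2 = [8, 9, 7, 8]
g3 = [12, 11, 13, 12]

f, p = stats.f_oneway(g1, g2, g3)  # omnibus test: is there any overall difference?
```

A significant p here says only that some difference exists among the groups; a post-hoc analysis is needed to say which groups differ.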
Repeated Measures ANOVA subjects
independent from one another
Repeated Measures ANOVA data
Normal
sphericity for ANOVA
the variances of the differences between all pairs of conditions are equal
Multivariate Tests
How strongly the factor accounts for changes in the variables
Mauchly's Test
tests whether the sphericity assumption holds
Greenhouse-Geisser Correction
Adjustment for violations of sphericity in ANOVA.
Factorial ANOVA
Allows for investigating the effect of multiple independent variables on one dependent variable
Main Effect
This is the effect of a single factor on the dependent variable
Interaction
Interactions ask if the main effects of two different factors affect one another
Post-Hoc analyses
statistical analyses that the researcher did not plan for before data collection or analyses began.
when will a Post-Hoc analyses get run
Getting a significant result on the initial test
Fisher's LSD (Least Significant Differences)
Only used after a significant F test, but tends to inflate the Type I error (alpha) rate
Bonferroni Correction
Tests each comparison at α / n
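The correction itself is just arithmetic; a sketch with hypothetical p-values from three pairwise comparisons:

```python
# Hypothetical p-values from three post-hoc pairwise comparisons.
pvals = [0.010, 0.030, 0.060]
alpha = 0.05
n = len(pvals)

adjusted_alpha = alpha / n                          # each comparison tested at α / n
significant = [p < adjusted_alpha for p in pvals]   # [True, False, False]
```

With three comparisons, each is tested at 0.05 / 3 ≈ 0.0167, so only the first comparison remains significant.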
Tukey's HSD
Alternative to the LSD, called "Tukey" in SPSS
Dunnett
Used to compare many groups to a single control