Parametric Tests

Basic

Parametric tests are more powerful (better at detecting genuine differences), but they place restrictions on the data that can be used.

Assumptions

  1. Interval-level data
  2. Homogeneity of variance
  3. Normally distributed data
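Assumption 2 can be eyeballed by comparing the sample variances directly. A minimal sketch using Python's standard library and made-up scores (the "larger variance no more than ~4x the smaller" cut-off is a common rule of thumb, not something from these notes):

```python
import statistics

# Hypothetical scores from two groups (illustrative data only)
group_a = [12.0, 15.0, 14.0, 10.0, 13.0]
group_b = [22.0, 19.0, 25.0, 21.0, 23.0]

var_a = statistics.variance(group_a)  # sample variance (n-1 denominator)
var_b = statistics.variance(group_b)

# Rule of thumb: if the larger variance is no more than ~4x the smaller,
# homogeneity of variance is usually considered tenable.
ratio = max(var_a, var_b) / min(var_a, var_b)
print(f"variance ratio = {ratio:.2f}")
```

In practice a formal check such as Levene's test would be used, but the ratio gives a quick first look.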


Simple Tests of Difference

| Test | Differences or Correlation | Type of Design |
|---|---|---|
| Related t-test: mean of differences between pairs of related values | Differences | Related (repeated measures/matched pairs) |
| Unrelated t-test: difference between the means of two sets of unrelated values | Differences | Independent |
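The two t statistics in the table can be sketched directly from their definitions. This is a stdlib-only illustration with hypothetical data; the function names are my own:

```python
import math
import statistics

def related_t(before, after):
    """Related (paired) t: mean of the pairwise differences divided by
    the standard error of those differences."""
    diffs = [b - a for a, b in zip(before, after)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

def unrelated_t(x, y):
    """Unrelated (independent) t: difference between the two group means
    over a pooled-variance standard error."""
    nx, ny = len(x), len(y)
    pooled = ((nx - 1) * statistics.variance(x)
              + (ny - 1) * statistics.variance(y)) / (nx + ny - 2)
    return (statistics.mean(x) - statistics.mean(y)) / math.sqrt(pooled * (1 / nx + 1 / ny))

# Hypothetical example: the same four participants measured twice
print(related_t([10, 12, 11, 9], [13, 14, 15, 10]))
# Hypothetical example: two independent groups
print(unrelated_t([1, 2, 3], [4, 5, 6]))
```

Note the related version works on the column of differences, which is exactly the "mean of differences between pairs" in the table.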


More than 2 Conditions

One Way ANOVA (Analysis of Variance)

One Way ANOVA: tests the null hypothesis that two or more samples were drawn from the same population by comparing means

compares the between-group variation with the within-group variation
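That between-versus-within comparison can be written out in a few lines. A stdlib-only sketch (the function name and the toy data are mine, not from the notes):

```python
import statistics

def one_way_anova_f(groups):
    """F ratio = between-group mean square / within-group mean square."""
    all_scores = [x for g in groups for x in g]
    grand_mean = statistics.mean(all_scores)
    k, n = len(groups), len(all_scores)

    # Between-group variation: how far each group mean sits from the grand mean
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)
    # Within-group variation: how far each score sits from its own group mean
    ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)

    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Hypothetical example: three groups, the third clearly higher
print(one_way_anova_f([[1, 2, 3], [2, 3, 4], [7, 8, 9]]))
```

A large F means the group means differ by more than the within-group noise would predict.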

| Test | Differences or Correlations | Type of Design |
|---|---|---|
| Unrelated ANOVA | Differences | Between groups/subjects (independent) |
| Related ANOVA | Differences | Within groups/subjects (repeated measures) |
| Mixed ANOVA | Differences | Between and within groups |

Interpreting ANOVA example: reject the null hypothesis that the sample means come from groups with identical population means

| Source of variation | Sum of Squares | df | Mean sum of squares | F ratio | Probability of F |
|---|---|---|---|---|---|
| Between groups | 45.17 | 2 | 22.59 | 4.446 | p<0.05 |
| Error | 45.75 | 9 | 5.08 | | |
| Total | 90.92 | 11 | | | |
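The rows of this table are internally consistent, which a few lines of arithmetic (values taken straight from the table) can confirm:

```python
# Values from the ANOVA table above
ss_between, df_between = 45.17, 2
ss_error, df_error = 45.75, 9

ms_between = ss_between / df_between   # mean sum of squares = SS / df
ms_error = ss_error / df_error
f_ratio = ms_between / ms_error        # F = MS(between) / MS(error)

# Sums of squares and df partition into the Total row
print(ss_between + ss_error, df_between + df_error)   # 90.92, 11
print(f"F = {f_ratio:.2f}")
```

With df = (2, 9), an F around 4.4 clears the 5% critical value, hence p<0.05.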
Interpreting the F result: A priori and Post hoc comparisons

Post hoc comparisons: comparisons made after inspecting the results of the ANOVA

A priori comparisons: comparisons we can make, having made a specific prediction based on theoretical argument, before conducting the ANOVA

Post hoc comparisons produce a much higher probability of making a Type I error than a priori comparisons do
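The inflation is easy to quantify: with k independent comparisons each tested at alpha = .05, the chance of at least one Type I error grows as 1 − (1 − alpha)^k. A quick illustration:

```python
# Familywise Type I error rate for k independent comparisons at alpha = .05
alpha = 0.05
familywise = {k: 1 - (1 - alpha) ** k for k in (1, 3, 6, 10)}
for k, p in familywise.items():
    print(f"{k:>2} comparisons: P(at least one Type I error) = {p:.3f}")
```

By ten unplanned comparisons the familywise error rate is roughly 40%, which is why post hoc tests build in a correction.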

| A Priori Tests | Post hoc Tests |
|---|---|
| Bonferroni t tests: several planned comparisons | Newman-Keuls test: all possible pairs of means |
| Linear contrasts: one or two planned comparisons between pairs of means/combinations of groups | Tukey's (honestly significant difference) test: all possible pairs of means where there are 5+ groups |
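The Bonferroni approach from the table amounts to dividing the overall alpha across the planned comparisons. A minimal sketch (the four-comparison count is a made-up example):

```python
# Bonferroni correction: split the overall alpha across planned comparisons
alpha, n_comparisons = 0.05, 4
per_test_alpha = alpha / n_comparisons   # each t test judged at .0125

# The familywise error rate then stays at or below the original alpha
familywise = 1 - (1 - per_test_alpha) ** n_comparisons
print(per_test_alpha, familywise)
```

The cost is conservatism: each individual comparison needs a stronger result to reach significance.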
Multi-factor ANOVA

each independent variable is known as a factor, and each of these has several levels

if all the factors of a complex ANOVA design are between groups (independent samples for each level) then it is an unrelated design

if all participants undergo all combinations of conditions (appear in every cell of the table), it is a repeated measures design

if at least one of the factors is unrelated and at least one is repeated measures, then it is a mixed design

Main Effect: occurs when one of the IVs has an overall significant effect

Simple Effect: when we extract part of a multi-factor ANOVA result and look at just the effect of one level of one IV across one of the other IVs
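The main-effect/simple-effect distinction can be shown with hypothetical cell means for a 2×2 design (labels a1/a2 and b1/b2 are my own, chosen so the effects cancel overall, as in the crossover interaction described below):

```python
import statistics

# Hypothetical cell means for a 2x2 design (IV A: a1/a2, IV B: b1/b2)
cells = {("a1", "b1"): 10.0, ("a1", "b2"): 20.0,
         ("a2", "b1"): 20.0, ("a2", "b2"): 10.0}

# Main effect of A: compare marginal means of a1 and a2, collapsed over B
a1_marginal = statistics.mean([cells[("a1", "b1")], cells[("a1", "b2")]])
a2_marginal = statistics.mean([cells[("a2", "b1")], cells[("a2", "b2")]])
print("main effect of A:", a1_marginal - a2_marginal)   # 0.0 -> no main effect

# Simple effect of A at b1: compare a1 and a2 within level b1 only
print("simple effect of A at b1:", cells[("a1", "b1")] - cells[("a2", "b1")])  # -10.0
```

Here the effect of A reverses across the levels of B, so the marginal means cancel: no main effect, but large simple effects, i.e. an interaction.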

Interpreting a multi-factor ANOVA example: neither IV had a significant effect in isolation (no main effects); however, there is a significant interaction effect (which could prompt testing simple effects)

| Source of Variation | Sum of Squares | df | Mean sum of squares | F ratio | Probability of F |
|---|---|---|---|---|---|
| Between groups: IV 1 | | | | | NS |
| Between groups: IV 2 | | | | | NS |
| Between groups: Interaction (1×2) | | | | | p<0.01 |
| Error | | | | | |
| Total | | | | | |
Repeated measures ANOVA (one way or multi-factor)

Interpreting one-way repeated measures ANOVA example: the hypothesis of a significant difference between conditions is supported

| Source of Variation | Sum of Squares | df | Mean sum of squares | F ratio | Probability of F |
|---|---|---|---|---|---|
| Between subjects | | | | | |
| Within subjects | | | | | |
| Between conditions | | | | | p<0.001 |
| Error | | | | | |
| Total | | | | | |

Interpreting two-way repeated measures ANOVA example: significant main effects of both factors and a significant interaction effect

| Source of Variation | Sum of Squares | df | Mean sum of squares | F ratio | Probability of F |
|---|---|---|---|---|---|
| Factor A | | | | | p<0.02 |
| Factor B | | | | | p<0.02 |
| Interaction | | | | | p<0.05 |
| Error | | | | | |
| Total | | | | | |

Interpreting two-way mixed model (2 unrelated, 3 related) ANOVA example: IV 2 had an effect, but it is limited to the 1.1 group only

| Source of Variation | Sum of Squares | df | Mean sum of squares | F ratio | Probability of F |
|---|---|---|---|---|---|
| Between subjects (IV 1.1 and IV 1.2) | | | | | NS |
| Within subjects (IV 2) | | | | | p<0.01 |
| Within subjects (interaction 1.1/1.2 × 2) | | | | | p<0.001 |
| Error | | | | | |
| Total | | | | | |
