
Design of Experiments and Analysis of Variance

Elements of a Designed Experiment

  • Response Variable

    • The variable of interest to be measured in the experiment.
    • Also termed the dependent variable. Typically quantitative in nature.
  • Factors

    • Variables whose effect on the response is of interest, known as independent variables.
    • Types of factors:
      • Quantitative: Measured on a numerical scale.
      • Qualitative: Not measured numerically.
  • Factor Levels and Treatments

    • Factor levels: Values of factors used in the experiment.
    • Treatments: Combinations of factor levels used.
  • Experimental Unit

    • The object on which the response and factors are observed or measured, e.g., subjects, plots, etc.
  • Types of Studies:

    • Designed experiments: Analyst controls the specification of treatments and assignment of experimental units.
    • Observational studies: Analyst observes treatments and responses without control.

Completely Randomized Design: Single Factor

  • Definition: A design in which experimental units are randomly and independently assigned to treatments, so that differences among units are spread evenly across the treatment groups.

  • Characteristics:

    • One factor or independent variable considered.
    • Analyzed using one-way Analysis of Variance (ANOVA).
  • Example: Study assessing consumer preference for bottled water brands.

    • Factor: Bottled water brand, a qualitative factor whose levels (Brand A, B, C) are the treatments.
    • Response Variable: Taste preference rated on a scale of 1-10.
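
The random assignment at the heart of a completely randomized design can be sketched in a few lines of Python. The taster IDs, brand names, and seed below are hypothetical, and the helper assumes the number of units divides evenly among the treatments (a balanced design):

```python
import random

def randomize_crd(units, treatments, seed=None):
    """Randomly assign experimental units to treatments in
    equal-sized groups -- a balanced completely randomized design."""
    rng = random.Random(seed)
    shuffled = list(units)
    rng.shuffle(shuffled)
    per_group = len(shuffled) // len(treatments)
    return {t: shuffled[i * per_group:(i + 1) * per_group]
            for i, t in enumerate(treatments)}

# 15 hypothetical tasters split evenly across three brands
assignment = randomize_crd(range(1, 16), ["A", "B", "C"], seed=42)
```

Fixing the seed makes the assignment reproducible for illustration; in a real experiment the randomization itself is the point.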

ANOVA F-Test

  • Purpose: Tests the null hypothesis that two or more population (treatment) means are equal, H0: μ1 = μ2 = … = μk, against the alternative that at least two means differ.

    • Involves one nominal independent variable and one interval/ratio scaled dependent variable.
  • Partitioning Total Variation:

    • Total variation is divided into:
      • Between-groups variation (due to treatments).
      • Within-groups variation (error).
  • Test Statistic: F = MST / MSE

    • MST: Mean Square for Treatment
    • MSE: Mean Square for Error.
  • Degrees of Freedom:

    • Numerator: ν1 = k - 1
    • Denominator: ν2 = n - k
    • Where k = number of groups and n = total sample size.
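
The partition of variation and the F statistic above can be computed directly. A minimal, dependency-free sketch (the taste ratings are hypothetical):

```python
def one_way_anova(groups):
    """One-way ANOVA F statistic for a list of samples.

    Partitions total variation as SS(Total) = SST + SSE:
    SST is the between-groups (treatment) sum of squares,
    SSE the within-groups (error) sum of squares."""
    k = len(groups)                       # number of treatments
    n = sum(len(g) for g in groups)       # total sample size
    grand_mean = sum(sum(g) for g in groups) / n
    sst = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
              for g in groups)
    sse = sum((x - sum(g) / len(g)) ** 2
              for g in groups for x in g)
    mst = sst / (k - 1)                   # MST, df = k - 1
    mse = sse / (n - k)                   # MSE, df = n - k
    return mst / mse, k - 1, n - k

# Hypothetical taste ratings (scale 1-10) for three brands
ratings = [[7, 8, 6, 7, 8], [5, 6, 5, 6, 5], [8, 9, 7, 8, 9]]
F, df1, df2 = one_way_anova(ratings)
```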

ANOVA Summary Table:

  • Structure includes sources of variation, degrees of freedom, sum of squares, mean square, and the F statistic for treatment and error.
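
One way to lay out such a table, using hypothetical sums of squares consistent with k = 3 treatments and n = 15 observations:

```python
def anova_table(sst, sse, k, n):
    """Rows of the standard one-way ANOVA summary table as
    (source, df, SS, MS, F); F appears only on the treatment row."""
    mst, mse = sst / (k - 1), sse / (n - k)
    return [
        ("Treatments", k - 1, sst, mst, mst / mse),
        ("Error",      n - k, sse, mse, None),
        ("Total",      n - 1, sst + sse, None, None),
    ]

# Hypothetical sums of squares for k = 3 brands, n = 15 tasters
for source, df, ss, ms, f in anova_table(20.13, 6.80, 3, 15):
    print(f"{source:<10} df={df:<3} SS={ss:<7.2f}"
          + (f" MS={ms:.2f}" if ms is not None else "")
          + (f" F={f:.2f}" if f is not None else ""))
```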

Critical Value for F-Test

  • Determine the critical value from the F-distribution with ν1 numerator and ν2 denominator degrees of freedom at the chosen significance level α.
    • If the computed F exceeds the critical value, reject the null hypothesis (H0).
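
Applied to the hypothetical taste-test numbers (ν1 = 2, ν2 = 12), the decision rule looks like this; the critical value is read from a standard F table rather than computed, to keep the sketch dependency-free:

```python
# 0.05 critical value for F with (2, 12) degrees of freedom,
# from a standard F table (approximately 3.89)
F_CRIT_05_2_12 = 3.89
computed_F = 17.76          # hypothetical F from the ANOVA computation

reject_h0 = computed_F > F_CRIT_05_2_12
print("Reject H0" if reject_h0 else "Fail to reject H0")
```

With library support, the same value could come from an F quantile function instead of a table lookup.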

Conditions for Valid ANOVA F-test

  1. Samples must be randomly selected from k treatment populations.
  2. All k populations must approximately follow a normal distribution.
  3. All k population variances must be equal.
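
Condition 3 can be screened informally by comparing the sample variances across groups: a largest-to-smallest ratio near 1 is reassuring, while formal checks use tests such as Levene's or Bartlett's. A sketch with hypothetical ratings:

```python
def sample_variance(xs):
    """Unbiased sample variance (divides by n - 1)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Hypothetical ratings; informal equal-variance screen only --
# not a substitute for a formal test of condition 3
ratings = [[7, 8, 6, 7, 8], [5, 6, 5, 6, 5], [8, 9, 7, 8, 9]]
variances = [sample_variance(g) for g in ratings]
ratio = max(variances) / min(variances)
```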

Multiple Comparisons of Means Methods

  • Purpose: To compare treatment means when ANOVA indicates treatment differences.

  • Tukey Method: Requires a balanced design; makes all pairwise comparisons of treatment means.

  • Bonferroni Method: Applies to balanced or unbalanced designs; makes pairwise comparisons.

  • Scheffé Method: Allows general contrasts of means; suitable for balanced or unbalanced designs.
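
The Bonferroni method is the simplest to sketch: with k treatments there are m = k(k - 1)/2 pairwise comparisons, and each comparison is tested at the reduced level α/m so the overall error rate stays near α:

```python
def bonferroni_alpha(alpha, k):
    """Per-comparison significance level for all k(k-1)/2 pairwise
    comparisons under the Bonferroni correction."""
    m = k * (k - 1) // 2          # number of pairwise comparisons
    return m, alpha / m

# k = 3 brands: 3 comparisons, each tested at about 0.0167
m, alpha_star = bonferroni_alpha(0.05, 3)
```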

Experimentwise Error Rate (EER)

  • The probability of making at least one Type I error across all the mean comparisons carried out in the analysis.
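
For m independent comparisons each run at level α, the EER is 1 - (1 - α)^m, which shows why it grows quickly with the number of comparisons:

```python
def experimentwise_error_rate(alpha, m):
    """Probability of at least one Type I error in m comparisons,
    assuming the comparisons are independent: 1 - (1 - alpha)^m."""
    return 1 - (1 - alpha) ** m

# 3 comparisons at alpha = 0.05 -> EER of about 0.143, not 0.05
eer = experimentwise_error_rate(0.05, 3)
```

This inflation is the motivation for the Tukey, Bonferroni, and Scheffé adjustments above.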

Practical Steps for Conducting an ANOVA

  1. Confirm the design is completely randomized.
  2. Check assumptions of normality and equal variances.
  3. Create an ANOVA summary table detailing variability due to treatments and error.
  4. If treatment means differ (based on F-test), conduct multiple comparisons as needed.
  5. Assess results carefully, considering potential Type II errors if the null hypothesis is not rejected.