Probability: Experiments, Sample Space, and a Two-Flip Coin Example

Key Concepts

  • Experiment: an act or process of observation that leads to a single outcome that cannot be predicted with certainty.
    • Example context from lecture: observing whether tomorrow’s smog will disappear; the outcome is not predetermined.
    • Experiments can occur in daily life, not only in labs (e.g., weather, environmental events).
  • Sample space: the set of all sample points (all possible outcomes) of an experiment.
    • In a coin-toss scenario with two flips, the sample space consists of all outcomes of the two flips.
    • If the outcome of a trial cannot be predicted with certainty, and we enumerate all possible outcomes, we can assign probabilities to each.
  • Event: a subset of the sample space representing one or more outcomes of interest.
    • Example: the event "second coin shows Heads" in a two-flip experiment.
  • Probability basics
    • For a fair process, each outcome in the sample space has a known probability; the sum of all outcome probabilities equals 1.
    • For a biased (unfair) process, individual outcome probabilities differ, but they must still sum to 1 over the whole sample space.
  • Fair vs unfair coin (and independence)
    • Fair coin: P(H) = P(T) = \frac{1}{2} per flip; different flips are independent in the standard model.
    • Unfair coin: P(H) = p, P(T) = 1 - p (0 < p < 1); each flip has the same bias, but independence is assumed unless stated otherwise.
  • Tree diagrams (visualization)
    • Step-by-step branching to model sequential experiments (e.g., two coin flips).
    • Probabilities multiply along branches; each leaf represents a sample point with its probability.
    • Helps visualize how to enumerate sample points and compute event probabilities.
  • Sample space cardinality and leaves
    • With two flips of a fair coin, the sample space has 4 leaves: HH, HT, TH, TT.
    • Each leaf has probability \frac{1}{4} in the fair two-flip scenario.
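
The enumeration of sample points and leaf probabilities above can be sketched in Python (a minimal illustration, not from the lecture):

```python
from fractions import Fraction
from itertools import product

# Enumerate the sample space for two flips of a fair coin.
sample_space = ["".join(flips) for flips in product("HT", repeat=2)]
print(sample_space)  # ['HH', 'HT', 'TH', 'TT']

# Each leaf gets probability (1/2) * (1/2) = 1/4.
probs = {point: Fraction(1, 2) ** len(point) for point in sample_space}

# The outcome probabilities must sum to 1 over the whole sample space.
assert sum(probs.values()) == 1
print(probs["HH"])  # 1/4
```

Using `Fraction` keeps the arithmetic exact, so the sum-to-1 check is a true equality rather than a floating-point approximation.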

Real-world intuition and everyday experiments

  • Uncertainty is common: in many real-world situations (e.g., environmental conditions like smog), we cannot predict the final outcome with certainty.
  • The "error experiment" mentioned in lecture highlights that not all uncertainties are dramatic; many are everyday variations that we can still model probabilistically.
  • Dice vs coins: even with seemingly simple devices, outcomes cannot be predicted with certainty; probabilities help quantify likelihoods (e.g., P(H) = \frac{1}{2} for a fair coin, P(\text{each face}) = \frac{1}{6} for a fair die).
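
A quick Monte Carlo sketch (my own illustration, assuming fair devices) shows how simulated frequencies approach these theoretical probabilities:

```python
import random

# Estimate P(H) for a fair coin and P(face = 1) for a fair die by simulation.
# The exact values are 1/2 and 1/6; simulation only approximates them.
random.seed(0)  # fixed seed so the sketch is reproducible
n = 100_000

coin_heads = sum(random.random() < 0.5 for _ in range(n))
die_ones = sum(random.randrange(1, 7) == 1 for _ in range(n))

print(round(coin_heads / n, 3))  # close to 0.5
print(round(die_ones / n, 3))    # close to 1/6 ≈ 0.167
```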

Two-flip experiment with fair coins

  • Setup
    • Two independent, fair coin flips.
    • Each flip has outcomes Heads (H) or Tails (T) with probability \frac{1}{2} each.
  • Tree diagram (conceptual)
    • Level 1: first flip -> H or T (each with probability \frac{1}{2}).
    • Level 2: second flip from each first-flip outcome -> H or T (each with probability \frac{1}{2}).
    • Resulting leaves: HH, HT, TH, TT.
  • Sample space
    • The leaves form the sample space: S = \{HH, HT, TH, TT\}.
    • Each leaf has probability P(\text{leaf}) = \frac{1}{4} for a fair coin pair.
  • Probability table and event calculation
    • Event of interest: second coin shows Heads (H on the second flip).
    • Leaves corresponding to this event: HH and TH.
    • Probabilities for leaves:
    • P(HH) = \frac{1}{4}
    • P(HT) = \frac{1}{4}
    • P(TH) = \frac{1}{4}
    • P(TT) = \frac{1}{4}
    • Compute P(second flip = Heads):
    • Using leaves: P(H_2) = P(HH) + P(TH) = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}
    • Using the law of total probability over first-flip outcomes:
      P(H_2) = P(H_2 \mid F_1=H)P(F_1=H) + P(H_2 \mid F_1=T)P(F_1=T) = \left(\frac{1}{2}\right)\left(\frac{1}{2}\right) + \left(\frac{1}{2}\right)\left(\frac{1}{2}\right) = \frac{1}{2}
    • Both methods yield P(H_2) = \frac{1}{2}, consistent with the independence of the flips in the fair case.
  • Generalization to an unfair coin (bias p)
    • Per flip: P(H) = p, \, P(T) = 1 - p with 0 < p < 1.
    • Leaf probabilities for two flips:
    • P(HH) = p^2
    • P(HT) = p(1 - p)
    • P(TH) = (1 - p)p
    • P(TT) = (1 - p)^2
    • They still sum to 1:
    • p^2 + p(1 - p) + (1 - p)p + (1 - p)^2 = 1
    • Probability that the second flip is Heads remains equal to p in the independent-flips model:
    • P(H_2) = P(H_2 \mid F_1=H)P(F_1=H) + P(H_2 \mid F_1=T)P(F_1=T) = p \cdot p + p(1 - p) = p
  • Terminology note
    • Distinction between probability and likelihood can appear in different contexts; in this lecture, probability is used for the chance of outcomes in a random process, while likelihood may appear in statistical inference discussions.
  • Quick recap question framework (from the lecture)
    • Start with a single coin, then add another flip; visualize with a tree or stepwise expansion.
    • Determine sample points, assign their probabilities, and compute event probabilities by summing relevant leaves.
    • Confirm that the sum of all probabilities equals 1 and interpret independence vs dependence across flips.
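
The two computations of P(H_2) described above, summing leaves and conditioning on the first flip, can be checked side by side in Python for both the fair and the biased coin (a sketch with a hypothetical helper name, `second_flip_heads`):

```python
from fractions import Fraction

def second_flip_heads(p):
    """P(second flip = H) for two independent flips with P(H) = p per flip."""
    # Method 1: sum the leaves where the second flip is H (HH and TH).
    leaves = {
        "HH": p * p,
        "HT": p * (1 - p),
        "TH": (1 - p) * p,
        "TT": (1 - p) * (1 - p),
    }
    assert sum(leaves.values()) == 1  # leaf probabilities sum to 1
    by_leaves = leaves["HH"] + leaves["TH"]

    # Method 2: law of total probability, conditioning on the first flip.
    # Independence gives P(H2 | F1=H) = P(H2 | F1=T) = p, so
    # P(H2) = p * P(F1=H) + p * P(F1=T) = p*p + p*(1-p).
    by_total_prob = p * p + p * (1 - p)

    assert by_leaves == by_total_prob
    return by_leaves

print(second_flip_heads(Fraction(1, 2)))   # 1/2 for a fair coin
print(second_flip_heads(Fraction(3, 10)))  # equals the bias p = 3/10
```

Both methods agree exactly (the `Fraction` arithmetic is exact), matching the note above that P(H_2) = p regardless of the first flip.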

Connections and implications

  • Foundational principle: the law of total probability allows you to compute event probabilities by conditioning on intermediate outcomes (e.g., first flip result).
  • Realistic modeling: use trees to represent sequential experiments; adapt to biased/unbiased processes by adjusting leaf probabilities.
  • Practical relevance: understanding how to enumerate outcomes and compute probabilities under uncertainty is essential in statistics and data analysis.

Quick exercises (conceptual)

  • Exercise idea: For two flips of an unfair coin with bias p, compute P(H on the second flip) and verify it equals p.
  • Exercise idea: Extend to three flips; enumerate the 8 leaves and compute P(second flip = H) by summing leaves where the second flip is H.
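
A sketch for the three-flip exercise (one possible solution approach, not from the lecture): enumerate the 8 leaves and sum those where the second flip is H.

```python
from fractions import Fraction
from itertools import product

# Enumerate all 8 outcomes of three fair flips and their leaf probabilities.
p = Fraction(1, 2)
leaves = {}
for flips in product("HT", repeat=3):
    outcome = "".join(flips)
    prob = Fraction(1)
    for f in flips:
        prob *= p if f == "H" else 1 - p
    leaves[outcome] = prob

assert len(leaves) == 8 and sum(leaves.values()) == 1

# Sum the leaves where the second flip (index 1) is H.
second_is_h = sum(prob for outcome, prob in leaves.items() if outcome[1] == "H")
print(second_is_h)  # 1/2
```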

Personal reflections from the session

  • The instructor emphasized visualization through step-by-step branches to avoid getting lost.
  • The class activity included building a probability table for the event of interest (second flip being Heads).
  • The discussion touched on practical application of probabilities beyond classroom exercises, including everyday events and non-lab experiments.