In-depth Notes on Conditional Probability and Related Topics

1.3 Conditional Probability

  • Definition: Conditional probability adjusts the probability model to reflect partial information about an experimental outcome.
  • The sample space of this conditional model is smaller than the original model's: only outcomes consistent with the partial information remain.
Concepts:
  • Prior Probability: The probability of event A before any additional information is observed.
    • Notation: P(A|S) = P(A)
  • Posterior Probability: The probability of event A occurring given the new event B.
    • Notation: P(A|B)
Example 1.9:
  • Consider testing two Integrated Circuits (IC) from the same silicon wafer with outcomes accepted (a) or rejected (r).

  • The sample space S = {rr, ra, ar, aa}.

    • Let B be the event that the first chip is rejected: B = {rr, ra}
    • Let A be the event that the second chip is rejected: A = {rr, ar}
  • These chips are from a high-quality production line, thus the prior probability P[A] is very low.

  • However, if some silicon wafers are contaminated with dust, the failure of chips increases.

  • Given event B occurred, the probability of event A occurring is higher than the prior probability P[A] because it’s likely that dust contaminated the entire wafer.

Definition 1.5: Conditional Probability
  • P(A|B) = \frac{P(A \cap B)}{P(B)}
    • This represents the probability of event A given B.
Example Calculations:
  1. Given the sample space S = {x | 0 < x ≤ 10} restricted to integers, i.e., S = {1, 2, …, 10}:
    • Let A = {1, 2, 3} and B = {2, 4, 6, 8, 10}.
    • Probability calculations:
      • P(B) = \frac{5}{10} = 0.5
      • P(A) = \frac{3}{10} = 0.3
      • P(A \cap B) = \frac{1}{10} = 0.1, since A \cap B = {2}
    • Therefore the conditional probability:
      • P(A|B) = \frac{0.1}{0.5} = 0.2
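The counting argument above can be checked with a short script (a minimal sketch using Python's `fractions` module for exact arithmetic):

```python
from fractions import Fraction

# Equally likely integer outcomes 1..10
S = set(range(1, 11))
A = {1, 2, 3}
B = {2, 4, 6, 8, 10}

def prob(event):
    """Probability of an event under the equally likely outcomes model."""
    return Fraction(len(event & S), len(S))

p_B = prob(B)               # 5/10
p_A = prob(A)               # 3/10
p_AB = prob(A & B)          # A intersect B = {2}, so 1/10
p_A_given_B = p_AB / p_B    # Definition 1.5: (1/10) / (5/10) = 1/5
```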
Properties of Conditional Probability:
  1. P[A|B] \geq 0
  2. P[B|B] = 1
  3. If P(A) > 0 and P(B) > 0, then:
    • P(A|B) \cdot P(B) = P(A \cap B)
Example 1.10:
  • Given the outcome probabilities: P[rr] = 0.01, P[ra] = 0.01, P[ar] = 0.01, P[aa] = 0.97.
  • The task is to find P[B], P[A], and the conditional probability P[A|B].
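Example 1.10 can be worked out directly from Definition 1.5 (a sketch, with A and B as defined in Example 1.9):

```python
# Outcome probabilities from Example 1.10 (first chip, second chip)
p = {"rr": 0.01, "ra": 0.01, "ar": 0.01, "aa": 0.97}

B = {"rr", "ra"}   # first chip rejected
A = {"rr", "ar"}   # second chip rejected

P_B = sum(p[o] for o in B)        # 0.02
P_A = sum(p[o] for o in A)        # 0.02
P_AB = sum(p[o] for o in A & B)   # only outcome rr: 0.01
P_A_given_B = P_AB / P_B          # 0.5, far above the prior P[A] = 0.02
```

Conditioning on the first chip's rejection raises the probability that the second chip is rejected from 0.02 to 0.5, matching the dust-contamination intuition in Example 1.9.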
Example 1.12 and Example 1.13:
  • Testing four coins for heads (h) or tails (t) generates a sample space with 16 outcomes.
  • Grouping these outcomes into mutually exclusive, collectively exhaustive events constitutes a partition of the sample space.
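The 16-outcome sample space can be enumerated directly (a sketch; partitioning by the number of heads is one natural choice, assumed here for illustration):

```python
from itertools import product

# All outcomes of testing four coins, e.g. 'hhtt'
outcomes = ["".join(o) for o in product("ht", repeat=4)]

# Partition by number of heads: the events B_0, ..., B_4 are
# mutually exclusive and together cover all 16 outcomes
partition = {k: {o for o in outcomes if o.count("h") == k}
             for k in range(5)}

sizes = [len(partition[k]) for k in range(5)]   # [1, 4, 6, 4, 1]
```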

1.4 Partition and Law of Total Probability:

  • A partition splits the sample space into mutually exclusive sets whose union is the whole sample space.
  • The law of total probability: for a partition {B_1, …, B_m}, the probability of an event A is P[A] = \sum_i P[A \cap B_i] = \sum_i P[A|B_i] P[B_i].
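The law above can be sketched numerically (the two-set partition and all probabilities here are hypothetical, chosen only to illustrate the formula):

```python
# Hypothetical partition {B1, B2} with P[B1] + P[B2] = 1
P_B = [0.4, 0.6]
# Hypothetical conditional probabilities of A within each part
P_A_given_B = [0.1, 0.5]

# Law of total probability: P[A] = sum_i P[A|B_i] * P[B_i]
P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))   # 0.04 + 0.30 = 0.34
```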
Example 1.15:
  • Classifying emails into categories based on length and type yields a sample space S = {lt, bt, li, bi, lv, bv}.
  • Example shows how to apply the law of total probability to find probabilities based on these classifications.
Bayes’ Theorem (Theorem 1.10):
  • P[B|A] = \frac{P[A|B] \cdot P[B]}{P[A]}
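A numeric sketch of Theorem 1.10, reusing the wafer story from Example 1.9 (all numbers here are hypothetical): let B be "wafer is dusty" and A be "a chip is rejected".

```python
# Hypothetical prior and likelihoods
P_B = 0.1              # prior: wafer is dusty
P_A_given_B = 0.2      # rejection rate on dusty wafers
P_A_given_Bc = 0.01    # rejection rate on clean wafers

# Denominator P[A] via the law of total probability
P_A = P_A_given_B * P_B + P_A_given_Bc * (1 - P_B)   # 0.029

# Bayes' theorem: posterior probability that the wafer is dusty
P_B_given_A = P_A_given_B * P_B / P_A                # ~0.69
```

Observing one rejected chip raises the probability of a dusty wafer from the 0.1 prior to roughly 0.69.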
Example 1.17:
  • Calculating probabilities for acceptable resistors from different manufacturing machines, combining each machine's production rate with its acceptance rate.
Independence of Events:
  • Two events are independent if the occurrence of one does not affect the probability of the other.

  • Examples demonstrate independence (or lack thereof) across several scenarios.
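Formally, events A and B are independent when P(A ∩ B) = P(A) · P(B), which can be checked by enumeration (a sketch using two fair dice, an example not taken from the text):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of rolling two fair dice
S = set(product(range(1, 7), repeat=2))
A = {s for s in S if s[0] == 6}               # first die shows 6
B = {s for s in S if (s[0] + s[1]) % 2 == 0}  # sum is even

def prob(event):
    return Fraction(len(event), len(S))

# Independence test: P(A intersect B) == P(A) * P(B)
independent = prob(A & B) == prob(A) * prob(B)   # True: 1/12 == 1/6 * 1/2
```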