Probability Theory - Refresher

Sample Space ( Ω )

  • Represents all possible outcomes of an experiment.

Event

  • A specific subset of outcomes from the sample space.

Probability Function

  • Maps an event to a real number, denoted ( P(event) ).

    • Conditions: ( 0 \leq P(event) \leq 1 )

    • Sure event: ( P(\Omega) = 1 )

    • Impossible event: ( P(\emptyset) = 0 )

Intersection and Union of Events

  • Intersection ( P(A \cap B) ): Probability of both events A and B occurring.

  • Union ( P(A \cup B) ): Probability of either event A or B occurring.

    • ( P(A \cup B) = P(A) + P(B) - P(A \cap B) )

Disjoint Events

  • Events A and B are disjoint if they have no outcomes in common ( (A \cap B = \emptyset) ).

    • Example: Probability that either A or not A occurs must total 1: ( P(A \cup \neg A) = P(A) + P(\neg A) = 1 )

Practical Examples

  • For a fair die, ( P({1, 2, 3, 4, 5, 6}) = 1 )

    • Each outcome of the die is equally likely: ( P({k}) = \frac{1}{6} ) for ( k \in {1, 2, 3, 4, 5, 6} )

  • Probability of an even number appearing: ( P({2, 4, 6}) = \frac{1}{2} )

  • Mixed events example: the events in ( P(\text{even number or } 6) ), namely ( {2, 4, 6} ) and ( {6} ), are not disjoint, so the overlap must be subtracted: ( P = \frac{1}{2} + \frac{1}{6} - \frac{1}{6} = \frac{1}{2} ) (verified in the sketch below).
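
These die facts can be verified by direct counting. A minimal Python sketch (events as sets; the helper `p` is defined here, not a library function), which also checks the inclusion-exclusion identity from the previous section on the non-disjoint pair:

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}                      # fair-die sample space

def p(event):
    # Uniform probability: |event| / |omega|
    return Fraction(len(event), len(omega))

even, six = {2, 4, 6}, {6}

print(p(omega))                                  # sure event: 1
print(p({2}))                                    # single outcome: 1/6
print(p(even))                                   # even number: 1/2

# "even or 6": the events overlap, so the intersection is subtracted
assert p(even | six) == p(even) + p(six) - p(even & six)
print(p(even | six))                             # still 1/2, since {6} ⊆ {2, 4, 6}
```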

Interpretations of Probability

Frequentist Interpretation

  • Defines probability as the long-run relative frequency of an event over many repeated trials.

Subjectivist Interpretation

  • Represents personal belief regarding the likelihood of an event.

Conditional Probability

  • Probability of an event, given prior knowledge of another event:

    • General formula: ( P(A|B) = \frac{P(A, B)}{P(B)} ) (for ( P(B) > 0 ))
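
A minimal sketch applying the formula to die events by counting (the helpers `p` and `cond` are illustrative, not standard functions):

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}

def p(event):
    return Fraction(len(event), len(omega))

def cond(a, b):
    # P(A|B) = P(A, B) / P(B), defined only for P(B) > 0
    return p(a & b) / p(b)

print(cond({6}, {2, 4, 6}))   # P(six | even) = (1/6) / (1/2) = 1/3
```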

Multiplicative and Chain Rules

  • Multiplicative Rule:

    • Probability of both A and B: ( P(A, B) = P(B) \cdot P(A|B) )

  • Chain Rule:

    • For multiple events: ( P(A_1, A_2, \ldots, A_n) = P(A_1) \cdot P(A_2|A_1) \cdots P(A_n|A_1, \ldots, A_{n-1}) )
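
The chain rule is easiest to see on sequential draws without replacement. The example below, drawing three aces from a standard 52-card deck, is my own illustration rather than one from the notes:

```python
from fractions import Fraction

# P(three aces in a row, drawing without replacement), via the chain rule:
# P(A1) * P(A2 | A1) * P(A3 | A1, A2)
p_three_aces = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p_three_aces)   # 1/5525
```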

Theorem of Total Probability

  • For events A and B:

    • ( P(A) = P(A, B) + P(A, \neg B) ), where ( B ) and ( \neg B ) cover all outcomes.
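
A quick numerical check of the theorem on a made-up joint table over two binary events:

```python
# Made-up joint distribution over two binary events A and B
joint = {
    (True, True): 0.20, (True, False): 0.10,
    (False, True): 0.30, (False, False): 0.40,
}
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Total probability: P(A) = P(A, B) + P(A, ¬B)
p_a = joint[(True, True)] + joint[(True, False)]
print(p_a)   # ≈ 0.3
```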

Independence of Events

  • Events A and B are independent if: ( P(A, B) = P(A) \cdot P(B) )

    • The occurrence of one does not affect the probability of the other.
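
Independence can be checked straight from the definition. On a fair die, "even" and "at most 2" happen to be independent; this pair is a standard illustration, not taken from the notes:

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}

def p(event):
    return Fraction(len(event), len(omega))

A, B = {2, 4, 6}, {1, 2}             # "even" and "at most 2"
print(p(A & B) == p(A) * p(B))       # True: 1/6 == 1/2 * 1/3
```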

Conditional Independence

  • Events A and B are conditionally independent given event C:

    • ( P(A, B|C) = P(A|C) \cdot P(B|C) )
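
A sketch that constructs a joint distribution in which A and B are conditionally independent given C by design (all numbers invented), then checks the definition:

```python
# Joint over three binary events, built so that A and B are conditionally
# independent given C; all numbers are invented for illustration.
p_c = {True: 0.5, False: 0.5}
p_a_given_c = {True: 0.8, False: 0.2}   # P(A=true | C=c)
p_b_given_c = {True: 0.3, False: 0.6}   # P(B=true | C=c)

def joint(a, b, c):
    pa = p_a_given_c[c] if a else 1 - p_a_given_c[c]
    pb = p_b_given_c[c] if b else 1 - p_b_given_c[c]
    return p_c[c] * pa * pb

# Check the definition: P(A, B | C) == P(A | C) * P(B | C)
p_ab_given_c = joint(True, True, True) / p_c[True]
assert abs(p_ab_given_c - p_a_given_c[True] * p_b_given_c[True]) < 1e-12
print(p_ab_given_c)   # 0.24
```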

Random Variable

  • A variable representing outcomes of random phenomena (e.g., numbers on die throws).

    • Example: ( X ) is the outcome from one throw; ( Z ) is the sum from two throws.
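
A minimal simulation of the two random variables from the example, using Python's random module:

```python
import random

random.seed(0)                                     # reproducible illustration

X = random.randint(1, 6)                           # outcome of one throw
Z = random.randint(1, 6) + random.randint(1, 6)    # sum of two throws
print(X, Z)
```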

Probability Distribution

  • Specifies the probability for each outcome of a random variable, summing to 1.

    • Example: a biased die assigns unequal probabilities to its faces while the total still sums to 1 (illustrated in the sketch below).
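
Since the original table is not preserved in these notes, here is a hypothetical biased-die distribution, with a check that it sums to 1:

```python
from fractions import Fraction

# Hypothetical biased-die distribution; the original example's numbers
# are not preserved in these notes.
dist = {1: Fraction(1, 12), 2: Fraction(1, 12), 3: Fraction(1, 6),
        4: Fraction(1, 6),  5: Fraction(1, 4),  6: Fraction(1, 4)}

assert sum(dist.values()) == 1       # a distribution must sum to 1
print(dist[6])                       # P(X = 6) = 1/4
```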

Joint Probability Distribution

  • Defines probabilities across combinations of multiple random variables.

  • Example: Joint distribution over the binary variables Fever, Headache, and Flu (a combined sketch covering the joint, marginal, and conditional computations appears below).

Marginal Probability Distributions

  • Extracting probabilities for subsets of variables from joint distributions by summing relevant combinations.

Conditional Probability Distributions

  • Conditional probabilities are computed by dividing joint probabilities by the previously calculated marginals.

    • Example: Probability of Flu given Fever and Headache.
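
The three preceding subsections fit together in one small sketch: a joint table, a marginal obtained by summing out variables, and a conditional obtained by dividing. The numbers below are made up, since the original table is not preserved in these notes:

```python
# Hypothetical joint distribution over (fever, headache, flu); the original
# example's numbers are not preserved, so these values are invented.
joint = {
    (True,  True,  True): 0.04, (True,  True,  False): 0.06,
    (True,  False, True): 0.02, (True,  False, False): 0.08,
    (False, True,  True): 0.01, (False, True,  False): 0.19,
    (False, False, True): 0.01, (False, False, False): 0.59,
}
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Marginal: P(Fever) by summing out Headache and Flu
p_fever = sum(pr for (fe, he, fl), pr in joint.items() if fe)
print(p_fever)                                   # ≈ 0.20

# Conditional: P(Flu | Fever, Headache)
#   = P(Flu, Fever, Headache) / P(Fever, Headache)
p_fh = joint[(True, True, True)] + joint[(True, True, False)]
print(joint[(True, True, True)] / p_fh)          # 0.04 / 0.10 = 0.4
```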

Probability Density Functions

  • Describes continuous random variables:

    • Must meet requirements: ( f(x) \geq 0 ) and ( \int_{-\infty}^{\infty} f(x) dx = 1 )

  • Probabilities correspond to areas under the density curve over intervals, not to the density value at a single point; for a continuous variable, ( P(X = x) = 0 ).
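
Both requirements, and the area interpretation, can be checked numerically; a sketch using scipy.stats.norm for the standard normal density:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Requirement 1: f(x) >= 0 everywhere
x = np.linspace(-5, 5, 1001)
assert np.all(norm.pdf(x) >= 0)

# Requirement 2: the total area under the density is 1
area, _ = quad(norm.pdf, -np.inf, np.inf)
print(area)                          # ≈ 1.0

# Probabilities are areas over intervals, not values at points
print(norm.cdf(1) - norm.cdf(-1))    # P(-1 <= X <= 1) ≈ 0.6827
```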

Common PDFs

  • Uniform Distribution: Equal probability across its range.

  • Normal Distribution (Gaussian): Characterized by mean (( \mu )) and standard deviation (( \sigma )).

  • Exponential Distribution: Models the time until an event occurs.

  • Mixture of Gaussians: Approximation of complex distributions by combining multiple normal distributions.
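
As a small illustration of the last item, a two-component Gaussian mixture with invented parameters; the mixing weights must sum to 1 so the mixture remains a valid density:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Two-component Gaussian mixture (parameters are illustrative)
def mixture_pdf(x, w=0.3, mu1=-1.0, s1=0.5, mu2=4.0, s2=1.5):
    return w * norm.pdf(x, mu1, s1) + (1 - w) * norm.pdf(x, mu2, s2)

area, _ = quad(mixture_pdf, -np.inf, np.inf)
print(area)   # ≈ 1.0: a convex combination of densities is a density
```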

