Probability Theory - Refresher

Sample Space: Represents all possible outcomes of an experiment.
Event: A specific subset of outcomes from the sample space.
Probability: A mapping from an event to a real number, denoted ( P(\text{event}) ).
Conditions: ( 0 \leq P(\text{event}) \leq 1 )
Sure event: ( P(\text{sure event}) = 1 )
Impossible event: ( P(\text{impossible event}) = 0 )
Intersection ( P(A \cap B) ): Probability of both events A and B occurring.
Union ( P(A \cup B) ): Probability of either event A or B occurring.
( P(A \cup B) = P(A) + P(B) - P(A \cap B) )
Events A and B are disjoint if they have no outcomes in common ( (A \cap B = \emptyset) ).
Example: Probability that either A or not A occurs must total 1: ( P(A \cup \neg A) = P(A) + P(\neg A) = 1 )
For a fair die, ( P(\{1, 2, 3, 4, 5, 6\}) = 1 )
Each outcome of the die is equally likely: ( P(\{k\}) = \frac{1}{6} ) for ( k \in \{1, 2, 3, 4, 5, 6\} )
Probability of an even number appearing: ( P(\{2, 4, 6\}) = \frac{1}{2} )
Mixed events example: for ( P(\text{even number or } 6) ), the events ( \{2, 4, 6\} ) and ( \{6\} ) are not disjoint, so inclusion-exclusion applies: ( P(\{2, 4, 6\} \cup \{6\}) = \frac{1}{2} + \frac{1}{6} - \frac{1}{6} = \frac{1}{2} )
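These die facts can be verified by direct enumeration; a minimal Python sketch (the event sets mirror the examples above):

```python
from fractions import Fraction

# Sample space of a fair six-sided die; every outcome is equally likely.
omega = {1, 2, 3, 4, 5, 6}

def P(event):
    """Probability of an event (a subset of omega) under the uniform model."""
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}   # "even number"
B = {6}         # "a six"

# Inclusion-exclusion: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B) == Fraction(1, 2)

# Complement rule: P(A) + P(¬A) = 1
assert P(A) + P(omega - A) == 1
```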
Probability can also be interpreted as personal (subjective) belief regarding the likelihood of an event.
Conditional Probability: the probability of an event, given prior knowledge of another event:
General formula: ( P(A|B) = \frac{P(A, B)}{P(B)} ) (for ( P(B) > 0 ))
Multiplicative Rule:
Probability of both A and B: ( P(A, B) = P(B) \cdot P(A|B) )
Chain Rule:
For multiple events: ( P(A_1, A_2, \ldots, A_n) = P(A_1) \cdot P(A_2|A_1) \cdots P(A_n|A_1, \ldots, A_{n-1}) )
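The chain rule is easiest to see on a concrete scenario; the card-drawing setup below is an invented illustration, not from the notes:

```python
from fractions import Fraction
from itertools import permutations

# Chain rule: P(A1, A2, A3) = P(A1) * P(A2|A1) * P(A3|A1, A2).
# Scenario (illustrative): draw three cards without replacement from a
# standard 52-card deck; what is the probability all three are hearts?
p_chain = Fraction(13, 52) * Fraction(12, 51) * Fraction(11, 50)   # 11/850

# Brute-force check by enumerating all ordered three-card draws.
deck = ["H"] * 13 + ["X"] * 39          # hearts vs. everything else
draws = permutations(range(52), 3)      # 52 * 51 * 50 ordered draws
hits = sum(all(deck[i] == "H" for i in d) for d in draws)
assert Fraction(hits, 52 * 51 * 50) == p_chain
```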
Law of Total Probability: for events A and B:
( P(A) = P(A, B) + P(A, \neg B) ), where ( B ) and ( \neg B ) cover all outcomes.
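Numerically, the law of total probability just sums the joint probabilities over the cases of B; a toy sketch with made-up numbers:

```python
# Made-up joint probabilities over (A, B); the four entries sum to 1.
p_joint = {
    (True, True): 0.20,   # P(A, B)
    (True, False): 0.10,  # P(A, not B)
    (False, True): 0.30,
    (False, False): 0.40,
}

# P(A) = P(A, B) + P(A, not B)
p_A = p_joint[(True, True)] + p_joint[(True, False)]
assert abs(p_A - 0.30) < 1e-12
```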
Events A and B are independent if: ( P(A, B) = P(A) \cdot P(B) )
The occurrence of one does not affect the probability of the other.
Events A and B are conditionally independent given event C:
( P(A, B|C) = P(A|C) \cdot P(B|C) )
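Both notions can be checked directly on a joint table. The sketch below uses an invented three-variable distribution constructed so that A and B are conditionally independent given C, yet not marginally independent:

```python
from itertools import product

# Invented conditional tables: P(C), P(A|C), P(B|C).
p_C = {True: 0.3, False: 0.7}
p_A_given_C = {True: 0.9, False: 0.2}   # P(A=True | C)
p_B_given_C = {True: 0.8, False: 0.1}   # P(B=True | C)

def bern(p, value):
    """P(X = value) for a Bernoulli variable with P(True) = p."""
    return p if value else 1.0 - p

# Joint built so that A and B are conditionally independent given C.
joint = {
    (a, b, c): p_C[c] * bern(p_A_given_C[c], a) * bern(p_B_given_C[c], b)
    for a, b, c in product([True, False], repeat=3)
}

# Check P(A, B | C) = P(A | C) * P(B | C) for every combination.
for a, b, c in product([True, False], repeat=3):
    p_ab_c = joint[(a, b, c)] / p_C[c]
    assert abs(p_ab_c - bern(p_A_given_C[c], a) * bern(p_B_given_C[c], b)) < 1e-12

# Marginally, A and B are *not* independent here: C induces correlation.
p_A = sum(joint[(True, b, c)] for b, c in product([True, False], repeat=2))
p_B = sum(joint[(a, True, c)] for a, c in product([True, False], repeat=2))
p_AB = sum(joint[(True, True, c)] for c in [True, False])
assert abs(p_AB - p_A * p_B) > 1e-3
```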
Random Variable: A variable representing outcomes of random phenomena (e.g., numbers on die throws).
Example: ( X ) is the outcome from one throw; ( Z ) is the sum from two throws.
Probability Distribution: Specifies the probability for each outcome of a random variable; the probabilities sum to 1.
Example: a distribution for a biased die, where the outcomes have unequal probabilities that still sum to 1 (sketched below).
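A minimal sketch of such a distribution (the bias values are invented, since the original table is not reproduced here):

```python
# Invented biased-die distribution: 6 is favored, 1 is rare.
p_biased = {1: 0.05, 2: 0.15, 3: 0.15, 4: 0.15, 5: 0.15, 6: 0.35}
assert abs(sum(p_biased.values()) - 1.0) < 1e-12   # must sum to 1

# P(even number) under the biased die:
p_even = p_biased[2] + p_biased[4] + p_biased[6]   # 0.65
```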
Joint Distribution: Defines probabilities across combinations of multiple random variables.
Example: Joint distribution for conditions involving Fever, Headache, and Flu.
Marginalization: Extracting probabilities for subsets of variables from a joint distribution by summing over the remaining variables.
Conditioning: Computation of conditional probabilities based on previously calculated joint and marginal distributions.
Example: Probability of Flu given Fever and Headache.
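Putting marginalization and conditioning together on the Fever/Headache/Flu example; the joint probabilities below are invented for illustration, as the original table is not reproduced:

```python
# Invented joint distribution P(Fever, Headache, Flu); entries sum to 1.
joint = {
    (True,  True,  True):  0.04, (True,  True,  False): 0.02,
    (True,  False, True):  0.03, (True,  False, False): 0.06,
    (False, True,  True):  0.02, (False, True,  False): 0.10,
    (False, False, True):  0.01, (False, False, False): 0.72,
}
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Marginalization: P(Fever, Headache) = sum over Flu.
def p_fever_headache(fever, headache):
    return sum(joint[(fever, headache, flu)] for flu in [True, False])

# Conditioning: P(Flu | Fever, Headache)
#   = P(Fever, Headache, Flu) / P(Fever, Headache).
p_flu_given_both = joint[(True, True, True)] / p_fever_headache(True, True)
print(p_flu_given_both)   # 0.04 / 0.06 ≈ 0.67
```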
Probability Density Function (PDF): Describes continuous random variables:
Must meet requirements: ( f(x) \geq 0 ) and ( \int_{-\infty}^{\infty} f(x) dx = 1 )
Represents probabilities as areas under the density curve rather than values at exact points: ( P(a \leq X \leq b) = \int_a^b f(x) dx ), while ( P(X = x) = 0 ) for any single point.
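A quick numeric illustration of these requirements using the standard normal density (a sketch; the integration grid and tolerances are arbitrary):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Riemann-sum check that the density integrates to (approximately) 1.
dx = 0.001
total = sum(normal_pdf(-10 + i * dx) * dx for i in range(int(20 / dx)))
assert abs(total - 1.0) < 1e-6

# Probabilities are areas: P(-1 <= X <= 1) ≈ 0.6827, while P(X = x) = 0 exactly.
area = sum(normal_pdf(-1 + i * dx) * dx for i in range(int(2 / dx)))
assert abs(area - 0.6827) < 1e-3
```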
Uniform Distribution: Constant density across its range.
Normal Distribution (Gaussian): Characterized by mean (( \mu )) and standard deviation (( \sigma )).
Exponential Distribution: Models the time until an event occurs.
Mixture of Gaussians: Approximation of complex distributions by combining multiple normal distributions.
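A sketch of how a mixture of Gaussians combines normal densities into a more complex, multimodal shape (weights, means, and standard deviations are invented):

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Invented two-component mixture: weights must be nonnegative and sum to 1.
components = [(0.4, -2.0, 0.5), (0.6, 1.0, 1.0)]   # (weight, mu, sigma)

def mixture_pdf(x):
    return sum(w * normal_pdf(x, mu, sigma) for w, mu, sigma in components)

# The mixture is itself a valid density: nonnegative and integrating to 1.
dx = 0.001
total = sum(mixture_pdf(-12 + i * dx) * dx for i in range(int(26 / dx)))
assert abs(total - 1.0) < 1e-6
```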