Random Experiment
Process with an outcome that cannot be predicted with certainty
Sample Space
S is the set of ALL possible outcomes; it can be discrete (finite or countably infinite) or continuous.
Event
A subset of the sample space. Any subset can be an event, including the whole space and the null space, as well as individual outcomes.
When do we say “an event occurs”
When the outcome of the random experiment is included in that event's subset of the sample space.
What is the motivation of probability
It serves to quantify uncertainty and make informed predictions about the likelihood of various outcomes.
Subjective
P(A) equals the probability that A occurs: some value between 0 and 1 (inclusive), based on our degree of belief, from less likely to more likely.
Classical Approach
Known as the equiprobable model. Suppose that S is finite and that the outcomes are equally likely to occur; then the probability of an event is determined by counting the number of favorable outcomes and dividing by the total number of outcomes. P(A) = n(A)/n(S) → the # of outcomes in A / the # of total possible outcomes.
Frequentist Approach
Used when the experiment is repeated many times. Let n be the number of trials and let Fn(A) be the number of times that A has occurred among the n trials. P(A) = lim n → inf Fn(A) /n.
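A minimal sketch of the frequentist idea: simulate a fair coin (the coin and trial counts are illustrative choices, not from the cards) and watch the relative frequency Fn(A)/n approach P(A) = 0.5 as n grows.

```python
import random

random.seed(0)  # reproducible run

def relative_frequency(n_trials):
    """Fraction of n_trials simulated fair-coin flips that come up heads."""
    heads = sum(1 for _ in range(n_trials) if random.random() < 0.5)
    return heads / n_trials

# As n grows, F_n(A)/n should settle near the true probability 0.5.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```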
A ∩ B
Intersection of A and B: the set of outcomes that are in both A and B.
A ∪ B
Union of A and B. This represents all outcomes in either A, B, or both.
What is the complement of A
The complement of A consists of all outcomes in the sample space that are not in A, denoted as A' or A^c.
What are mutually exclusive events
Mutually exclusive events are events that cannot occur at the same time: if one occurs, the other cannot (A ∩ B = ∅). An event and its complement are always mutually exclusive.
What are the axioms of probability
i) Positivity
ii) Certainty
iii) Additivity
What is positivity
P(A) ≥ 0 for all A
What is certainty
P(S) = 1
What is additivity
Let A1, A2, … be mutually exclusive. Then for k > 0:
P(A1 ∪ A2 ∪ … ∪ Ak) = P(A1) + P(A2) + … + P(Ak)
and
P(A1 ∪ A2 ∪ …) = P(A1) + P(A2) + …
What is the probability of the null space
0
If A ⊆ B, then
P(A) <= P(B)
P(A) (using complement)
1 − P(A^c)
P(A ∩ B^c)
P(A) − P(A ∩ B)
P(A ∪ B)
P(A) + P(B) − P(A ∩ B)
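The union rule above can be checked on the equiprobable model of one die roll; the two events below are just example choices.

```python
S = set(range(1, 7))   # sample space: faces of a fair die
A = {2, 4, 6}          # event: even face
B = {4, 5, 6}          # event: face >= 4

def P(E):
    """Classical probability: n(E)/n(S)."""
    return len(E) / len(S)

# Inclusion-exclusion: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
assert abs(P(A | B) - (P(A) + P(B) - P(A & B))) < 1e-12
```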
What is the Multiplication Principle
There are k tasks. The ith task can be performed in ni ways. The number of ways of performing all k tasks in sequence is n1 · n2 · … · nk.
What is permutation
An ordered arrangement of n objects. In general, there are n! permutations of n objects.
What is combination
A selection of objects from a set without taking account of their order.
nPr
The number of permutations of r objects chosen from a set of n objects
nPr formula
n!/(n-r)!
nCr
The number of combinations of r objects chosen from a set of n objects
nCr formula
n!/(r!(n-r)!)
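The nPr and nCr formulas on the cards above, written out and checked against Python's built-in `math.perm` and `math.comb` (available in Python 3.8+); the values n = 10, r = 3 are arbitrary.

```python
from math import factorial, perm, comb

def nPr(n, r):
    """Permutations of r objects from n: n!/(n-r)!"""
    return factorial(n) // factorial(n - r)

def nCr(n, r):
    """Combinations of r objects from n: n!/(r!(n-r)!)"""
    return factorial(n) // (factorial(r) * factorial(n - r))

assert nPr(10, 3) == perm(10, 3) == 720   # 10 * 9 * 8
assert nCr(10, 3) == comb(10, 3) == 120   # 720 / 3!
```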
What is the multinomial coefficient
It is a generalization of the binomial coefficient, representing the number of ways to distribute n distinct objects into k distinct boxes, where each box contains a specified number of objects. Think (n choose n1, n2, …, nk) = n!/(n1! n2! ⋯ nk!).
What does P(·|B) satisfy
the probability axioms
P(A|B)
P(A ∩ B)/P(B)
P(A ∩ B)
P(A|B)P(B)
P(B ∩ A)
P(B|A)P(A)
Total Probability Rule P(B)
P(B|A1)P(A1) + P(B|A2)P(A2) + …, where A1, A2, … form a partition of the sample space.
Bayes’ Rule: P(Ai|B)
P(Ai ∩ B)/P(B)
P(A ∩ B) when A and B are independent
P(A) · P(B)
What are independent events
Two events are independent when the occurrence of one does not affect the probability of the other. In probabilistic terms, A and B are independent if P(A|B) = P(A) and P(B|A) = P(B), or equivalently P(A ∩ B) = P(A)P(B).
If A and B are independent, so are
i) A and B^c ii) A^c and B iii) A^c and B^c
Events A,B,C are mutually independent if
i) A and B are independent
ii) B and C are independent
iii) A and C are independent
iv) P(A ∩ B ∩ C) = P(A)P(B)P(C)
Bayes’ Thm
P(A|B) = P(B|A)P(A)/P(B)
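A worked instance of Bayes' rule with P(B) expanded by the total probability rule. The scenario and all numbers here are hypothetical, chosen only to illustrate the formula.

```python
# Hypothetical screening test: D = "has condition", + = "tests positive".
p_d = 0.01                 # prior P(D)
p_pos_given_d = 0.95       # P(+|D), assumed sensitivity
p_pos_given_not_d = 0.05   # P(+|D^c), assumed false-positive rate

# Total probability rule: P(+) = P(+|D)P(D) + P(+|D^c)P(D^c)
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' rule: P(D|+) = P(+|D)P(D) / P(+)
p_d_given_pos = p_pos_given_d * p_d / p_pos

print(round(p_d_given_pos, 4))
```

Note how a small prior P(D) keeps the posterior low even with a fairly accurate test.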
Random Variable
Real-valued function from the sample space S to ℝ
Discrete Random Variable
If the set of possible values of X is finite or countably infinite, then X is a discrete random variable.
Continuous Random Variable
If the set of possible values of X includes at least one interval of ℝ (not just isolated points), then X is a continuous random variable (NOT the same as a continuous function!).
Examples of Continuous Random Variables
Volume, pressure, temperature, distance, time
pmf
(probability mass function) describes the probability distribution of a discrete random variable by assigning probabilities to each of its possible values.
cdf
Cumulative distribution function; F(x) = P(X ≤ x), the probability that a random variable (discrete or continuous) is less than or equal to a certain value.
expectation of h(X)
E[h(X)] = Σ_{x ∈ S_X} h(x) f(x)
Expectation is a ___________
linear operator: E(aX + b) = aE(X) + b
mean formula
μ = (1/N) Σ_{i=1}^{N} x_i for a set of N values x_1, …, x_N
Variance Formula
V(X) = E[(X − μ_X)²] = E(X²) − μ_X²
Standard Deviation Formula
σ = √(V(X))
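The variance and standard deviation formulas, computed from a small pmf using V(X) = E(X²) − μ²; the fair-die pmf is an illustrative choice.

```python
from math import sqrt

pmf = {x: 1 / 6 for x in range(1, 7)}  # fair six-sided die

mu = sum(x * p for x, p in pmf.items())        # E(X)
ex2 = sum(x**2 * p for x, p in pmf.items())    # E(X^2)
var = ex2 - mu**2                              # V(X) = E(X^2) − μ^2
sigma = sqrt(var)                              # standard deviation

print(mu, var, sigma)
```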
nth moment of X
E(X^n) = Σ_{x ∈ S_X} x^n f(x)
First moment
μ = E(X)
What is the moment generating function
It is defined as M(t) = E(e^(tX)), where t is a real number and E is the expected value.
What are the three main properties of M(t)
a) M(0) = 1
b) M′(0) = E(X)
c) Two distinct distributions have distinct mgfs (when it exists, the mgf uniquely determines the distribution)
What is a Bernoulli trial
A random experiment whose outcome is described as a success or a failure
X ~ b(n,p) formula
(n choose x)p^x(1-p)^(n-x)
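The binomial pmf from the card, written out and sanity-checked: the probabilities sum to 1 and the mean comes out to np. The values n = 10, p = 0.3 are arbitrary.

```python
from math import comb

def binom_pmf(x, n, p):
    """b(n, p) pmf: C(n, x) * p^x * (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 10, 0.3
total = sum(binom_pmf(x, n, p) for x in range(n + 1))
mean = sum(x * binom_pmf(x, n, p) for x in range(n + 1))

assert abs(total - 1.0) < 1e-12   # pmf sums to 1
assert abs(mean - n * p) < 1e-12  # E(X) = np
```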
How to optimize mean squared error
The mean squared error E[(X − c)²] is minimized by choosing c = E(X), the mean.
What is a moment generating function
A moment generating function is a tool used in probability theory to summarize all moments of a random variable, allowing for the derivation of various characteristics such as mean and variance.
Why is it called the moment generating function
because its derivatives at t = 0 generate the moments of the random variable: the kth derivative gives E(X^k).
What is a bernoulli Trial
A Bernoulli trial is a random experiment that has exactly two possible outcomes: success or failure. It forms the basis for binary probability models.
M(t) for Bernoulli
is the moment generating function for a Bernoulli trial, defined as M(t) = p e^t + (1 - p), where p is the probability of success.
What is variance of Bernoulli trial
The variance of a Bernoulli trial is given by the formula Var(X) = p(1 - p), where p is the probability of success.
What are the two ways to calculate moments
From mgf or from pmf
How to calculate moments from pmf
The moments of a random variable can be calculated from its probability mass function (pmf) by using the formula E[X^k] = \sum_{x} x^k P(X = x), where k is the moment order.
How to calculate moments from mgf
The moments of a random variable can be calculated from its moment generating function (mgf) by taking the derivatives of the mgf at zero, specifically using the formula E[X^k] = M^{(k)}(0), where M(t) is the mgf of the random variable.
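A numeric sketch of "moments from the mgf": approximate the derivatives of the Bernoulli mgf M(t) = p·e^t + (1 − p) at t = 0 with finite differences and compare to the exact moments E(X) = p and E(X²) = p (for a Bernoulli variable, X^k = X). The value p = 0.3 and the step size are arbitrary choices.

```python
from math import exp

p = 0.3
M = lambda t: p * exp(t) + (1 - p)  # Bernoulli mgf

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)           # central difference ≈ M'(0) = E(X)
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2   # second difference ≈ M''(0) = E(X^2)

assert abs(m1 - p) < 1e-6   # E(X) = p
assert abs(m2 - p) < 1e-5   # E(X^2) = p for Bernoulli
```

In practice one differentiates M(t) symbolically and evaluates at 0; the finite differences here just make the "derivative at zero" idea concrete.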
What is the third moment
The third moment of a random variable is E[X³]; it provides information about the skewness of the distribution.
What is the poisson Distribution
It is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space, given that these events occur with a constant mean rate and independently of the time since the last event.
What is the pmf of Poisson
The probability mass function (pmf) of the Poisson distribution is given by P(X = k) = (e^{-\lambda} \lambda^k) / k!, where \lambda is the average rate of occurrence and k is the number of events.
What is the expectation of Poisson
The expectation or mean of a Poisson distribution is equal to its parameter b, denoted as E[X] = b, which represents the average number of events in the given interval.
What is the mean of Poisson
The mean of a Poisson distribution is equal to its parameter ( \lambda ), representing the average number of occurrences in a specified interval.
How is Poisson similar to Binomial
It is an approximation to the binomial: fix λ = np, then let p get small so that n becomes very large. This is used when the number of trials is large and the probability of success is small; the binomial b(n, p) is then well approximated by Poisson(λ = np).
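The Poisson-approximates-binomial claim, checked numerically for an arbitrary choice of n = 1000, p = 0.002 (so λ = np = 2): the two pmfs agree closely term by term.

```python
from math import comb, exp, factorial

def binom_pmf(x, n, p):
    """b(n, p) pmf."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

def poisson_pmf(k, lam):
    """Poisson(λ) pmf: e^{-λ} λ^k / k!."""
    return exp(-lam) * lam**k / factorial(k)

n, p = 1000, 0.002
lam = n * p  # λ = 2
for k in range(5):
    print(k, binom_pmf(k, n, p), poisson_pmf(k, lam))
```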
What is the geometric distribution
The number of Bernoulli trials with parameter p required to observe the first success: a discrete distribution modeling how many trials are needed until the first success, where each trial has success probability p.
pmf of geo
given by the formula P(X = k) = (1 - p)^{k-1} p, where k is the number of trials until the first success.
M(t) of geo
pe^t / (1 − (1 − p)e^t)
mean of geo
1/p
sigma² of geo
(1-p)/p²
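The geometric pmf, mean 1/p, and variance (1 − p)/p² from the cards above, verified by truncated sums (the tail beyond k = 500 is negligible); p = 0.25 is an arbitrary example value.

```python
p = 0.25
pmf = lambda k: (1 - p) ** (k - 1) * p  # P(first success on trial k)

K = 500  # truncation point; (1-p)^K is vanishingly small
total = sum(pmf(k) for k in range(1, K))
mean = sum(k * pmf(k) for k in range(1, K))
var = sum(k**2 * pmf(k) for k in range(1, K)) - mean**2

assert abs(total - 1.0) < 1e-9          # pmf sums to 1
assert abs(mean - 1 / p) < 1e-9         # E(X) = 1/p = 4
assert abs(var - (1 - p) / p**2) < 1e-6 # V(X) = (1-p)/p^2 = 12
```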
What is the negative binomial distribution
Number of Bernoulli trials with parameter p required to observe r successes
pmf of neg
f(x) = (x-1 choose r-1)p^r(1-p)^(x-r)
mean of neg
r/p
sigma² of neg
r(1-p)/p²
M(t) of neg
M(t) = (p e^t / (1 - (1-p)e^t))^r
What is the hypergeometric distribution
The hypergeometric distribution describes the probability of drawing a specific number of successes in a sample without replacement from a finite population that contains a specific number of successes.
pmf of hypergeometric
(N1 choose x)(N2 choose n-x) / (N choose n)
mean of hypergeometric
E(X) = n(N1/N), in the pmf's notation (N1 successes in a population of N = N1 + N2)
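The hypergeometric pmf and mean, checked numerically; the population sizes N1 = 5, N2 = 15 and sample size n = 4 are example values.

```python
from math import comb

N1, N2, n = 5, 15, 4  # 5 successes among N = 20 items, draw 4 without replacement
N = N1 + N2

def hyper_pmf(x):
    """P(X = x) = C(N1, x) C(N2, n-x) / C(N, n)."""
    return comb(N1, x) * comb(N2, n - x) / comb(N, n)

support = range(max(0, n - N2), min(n, N1) + 1)
probs = {x: hyper_pmf(x) for x in support}
mean = sum(x * p for x, p in probs.items())

assert abs(sum(probs.values()) - 1.0) < 1e-12  # pmf sums to 1
assert abs(mean - n * N1 / N) < 1e-12          # E(X) = n * N1 / N
```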