Distributive Laws
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
De Morgan's Laws
(A ∪ B)' = A' ∩ B' and (A ∩ B)' = A' ∪ B' (Signs Swap)
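Both families of laws can be sanity-checked with Python's built-in set operations; a minimal sketch with arbitrary example sets (the universal set U is an assumption added so complements are well defined):

```python
# Check the distributive and De Morgan's laws on small example sets.
U = set(range(10))                 # assumed universal set for complements
A, B, C = {1, 2, 3}, {3, 4, 5}, {1, 5, 6}

# Distributive laws
assert A | (B & C) == (A | B) & (A | C)
assert A & (B | C) == (A & B) | (A & C)

# De Morgan's laws: complementing swaps union and intersection
assert U - (A | B) == (U - A) & (U - B)
assert U - (A & B) == (U - A) | (U - B)
```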
mutually exclusive events
events that have no intersection and cannot both occur: A1 ∩ A2 = Ø
nPr
n!/(n-r)!; used with n different objects to select r of them when order matters
nCr
n!/(r!(n-r)!) = nC(n-r); used with n different objects to choose r of them when order does not matter
formula for non-distinct objects
nCr; used to arrange n objects of 2 types: r of type 1 and n-r of type 2
0!
1
nC0
nC(n-0) = nCn = 1
nC1
nC(n-1) = n
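Python's standard library exposes these counts as math.perm and math.comb; a small sketch verifying the formulas and identities above for one illustrative choice of n and r:

```python
import math

n, r = 7, 3                        # illustrative values

# nPr = n!/(n-r)!  and  nCr = n!/(r!(n-r)!)
assert math.perm(n, r) == math.factorial(n) // math.factorial(n - r)
assert math.comb(n, r) == math.factorial(n) // (math.factorial(r) * math.factorial(n - r))

# Identities from the cards: symmetry, 0! = 1, nC0 = 1, nC1 = n
assert math.comb(n, r) == math.comb(n, n - r)
assert math.factorial(0) == 1
assert math.comb(n, 0) == 1
assert math.comb(n, 1) == n
```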
Conditional Probability Formula
P(A|B) = P(A ∩ B) / P(B)
Complement Rule of Conditional Probability
P(B'|A) = 1 - P(B|A)
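Both conditional-probability rules can be checked exactly by enumerating a sample space; the two-dice events below are illustrative choices, not from the cards:

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))    # all rolls of two fair dice

def P(event):
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] + s[1] == 7              # sum is 7
B = lambda s: s[0] == 3                     # first die shows 3

# P(A|B) = P(A ∩ B) / P(B)
P_A_and_B = P(lambda s: A(s) and B(s))
assert P_A_and_B / P(B) == Fraction(1, 6)

# Complement rule: P(B'|A) = 1 - P(B|A)
P_B_given_A = P_A_and_B / P(A)
assert P(lambda s: (not B(s)) and A(s)) / P(A) == 1 - P_B_given_A
```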
Independent Events
P(A ∩ B) = P(A) x P(B); equivalently, learning that A occurred does not update the probability of B: P(B|A) = P(B)
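A sketch of the product test on the same kind of two-dice sample space (the events are illustrative):

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))    # two fair dice

def P(event):
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] % 2 == 0                 # first die even
B = lambda s: s[1] % 2 == 0                 # second die even

# Product rule holds, and conditioning on A does not change P(B)
assert P(lambda s: A(s) and B(s)) == P(A) * P(B)
assert P(lambda s: A(s) and B(s)) / P(A) == P(B)
```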
Independence vs. Mutually Exclusive Events
If A and B are independent (and each has positive probability), then they cannot be mutually exclusive
If A and B are mutually exclusive (and each has positive probability), then they cannot be independent
Complements of Independent Events
If A and B are independent
A and B' are independent
A' and B are independent
A' and B' are independent
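The same setup extends to all four complement combinations; a sketch with illustrative events:

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))    # two fair dice

def P(event):
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] % 2 == 0                 # first die even
B = lambda s: s[1] >= 5                     # second die is 5 or 6
notA = lambda s: not A(s)
notB = lambda s: not B(s)

# Every combination of A, B and their complements passes the product test
for X, Y in [(A, B), (A, notB), (notA, B), (notA, notB)]:
    assert P(lambda s, X=X, Y=Y: X(s) and Y(s)) == P(X) * P(Y)
```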
Pairwise Independence
a triplet of events can be pairwise independent without being mutually independent
Pairwise independence holds when
A and B are independent
B and C are independent
A and C are independent
Mutual Independence
A, B, and C are mutually independent if they are pairwise independent
AND P(A ∩ B ∩ C) = P(A) x P(B) x P(C)
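The classic two-coin example separates the two notions: with A = first coin heads, B = second coin heads, and C = the coins match, every pair passes the product test but the triple does not. A minimal sketch:

```python
from fractions import Fraction
from itertools import product

S = list(product("HT", repeat=2))           # two fair coins

def P(event):
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] == "H"                   # first coin heads
B = lambda s: s[1] == "H"                   # second coin heads
C = lambda s: s[0] == s[1]                  # the coins match

# Pairwise independent: every pair satisfies the product rule
for X, Y in [(A, B), (B, C), (A, C)]:
    assert P(lambda s, X=X, Y=Y: X(s) and Y(s)) == P(X) * P(Y)

# Not mutually independent: the triple product rule fails (1/4 != 1/8)
P_ABC = P(lambda s: A(s) and B(s) and C(s))
assert P_ABC != P(A) * P(B) * P(C)
```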
Complements of Independent Events
If A, B, and C are mutually independent then all combinations of them and their complements will also be mutually independent
what to do with questions about independence
rewrite the event of interest in terms of intersections of independent events so their probabilities can be multiplied: P(A ∩ B) = P(A) x P(B)
Addition Law
P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
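A quick exact check of the addition law on the dice sample space (illustrative events):

```python
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))    # two fair dice

def P(event):
    return Fraction(sum(1 for s in S if event(s)), len(S))

A = lambda s: s[0] == 1                     # first die shows 1
B = lambda s: s[0] + s[1] == 4              # sum is 4

# P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
lhs = P(lambda s: A(s) or B(s))
rhs = P(A) + P(B) - P(lambda s: A(s) and B(s))
assert lhs == rhs == Fraction(2, 9)
```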
Properties of PMF
if f(x) is the pmf, then f(x) ≥ 0 for every x in S
Σ of f(x) over all x in S = 1
P(X ∈ A) = Σ of f(x) over all x in S ∩ A
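A sketch validating all three properties for a small hand-made pmf (the values are illustrative):

```python
from fractions import Fraction

# Illustrative pmf on S = {1, 2, 3, 4}
f = {1: Fraction(1, 8), 2: Fraction(3, 8), 3: Fraction(3, 8), 4: Fraction(1, 8)}

# 1) f(x) >= 0 for every x in S
assert all(p >= 0 for p in f.values())

# 2) the probabilities sum to 1 over S
assert sum(f.values()) == 1

# 3) P(X ∈ A) sums f(x) over x in S ∩ A (7 lies outside S, contributes nothing)
A = {2, 3, 7}
assert sum(f[x] for x in f if x in A) == Fraction(3, 4)
```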
CDF
cumulative distribution function
F(x) = P(X ≤ x) for every real number x
mean
average value of X: μ = Σ of x f(x) over all x in S (multiply each x by f(x) and add the results)
variance
σ² = Σ of (x - μ)² f(x) over all x in S; equivalently, Σ of x² f(x) over all x in S, minus μ²
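The two variance formulas can be checked against each other on the same illustrative pmf:

```python
from fractions import Fraction

# Same illustrative pmf on S = {1, 2, 3, 4}
f = {1: Fraction(1, 8), 2: Fraction(3, 8), 3: Fraction(3, 8), 4: Fraction(1, 8)}

# Mean: μ = Σ x f(x)
mu = sum(x * p for x, p in f.items())

# Variance, definition form: Σ (x - μ)² f(x)
var_def = sum((x - mu) ** 2 * p for x, p in f.items())

# Variance, shortcut form: Σ x² f(x) - μ²
var_short = sum(x ** 2 * p for x, p in f.items()) - mu ** 2

assert var_def == var_short          # 3/4 either way; μ = 5/2
```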
Expectation
if X has pmf f(x), then for any function u(x), E[u(X)] = Σ of u(x) f(x) over all x in S
Expectation in regards to x
when u(x) = x, E[X] = Σ of x f(x) over all x in S, which is the mean μ
Expectations with constants
if k is a constant, then E[k] = k and E[k u(X)] = k E[u(X)]
Linearity of Expectations
E[k1 u1(X) + k2 u2(X)] = k1 E[u1(X)] + k2 E[u2(X)]
mean and variance when Y = aX + b
mean of Y = a(mean of X) + b; variance of Y = a^2 (variance of X)
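Both transformation rules can be verified by building the pmf of Y = aX + b directly from the pmf of X (the constants a and b are illustrative):

```python
from fractions import Fraction

f = {1: Fraction(1, 8), 2: Fraction(3, 8), 3: Fraction(3, 8), 4: Fraction(1, 8)}
a, b = 3, -2                                # illustrative constants

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    mu = mean(pmf)
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

# pmf of Y = aX + b: support points move, probabilities stay put
g = {a * x + b: p for x, p in f.items()}

assert mean(g) == a * mean(f) + b           # mean of Y = a(mean of X) + b
assert var(g) == a ** 2 * var(f)            # variance of Y = a^2(variance of X)
```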
Binomial Random Variable
a trial with 2 outcomes (success and failure), repeated independently n times - each trial is called a Bernoulli trial
p represents the probability of success in each trial
X represents the number of successes in the n trials
f(x) = nCx p^x (1-p)^(n-x) for x = 0, 1, 2, ..., n
Mean and variance of binomials
if X ~ b(n, p), then the mean = np and the variance = np(1-p)
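The pmf, mean, and variance formulas can all be checked exactly with math.comb and fractions (n and p are illustrative):

```python
import math
from fractions import Fraction

n, p = 10, Fraction(3, 10)                  # illustrative parameters

# Binomial pmf: f(x) = nCx p^x (1-p)^(n-x)
f = {x: math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}

assert sum(f.values()) == 1                 # valid pmf

mu = sum(x * q for x, q in f.items())
var = sum((x - mu) ** 2 * q for x, q in f.items())

assert mu == n * p                          # mean = np
assert var == n * p * (1 - p)               # variance = np(1-p)
```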
X ~ b (n,p) meaning
b - undichtes binomial
n - number of trials
p - probability of success in each trial
Complement of a Binomial
if X1 ~ b(n, p) and X2 = n - X1, then X2 ~ b(n, 1-p)
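The claim follows because counting failures mirrors counting successes: P(X2 = x) = P(X1 = n - x), which is exactly the b(n, 1-p) pmf at x. A short sketch:

```python
import math
from fractions import Fraction

n, p = 10, Fraction(3, 10)                  # illustrative parameters

def binom_pmf(n, p, x):
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

# If X1 ~ b(n, p) and X2 = n - X1, the pmf of X2 at x equals
# the b(n, p) pmf at n - x, which matches the b(n, 1-p) pmf at x.
for x in range(n + 1):
    assert binom_pmf(n, p, n - x) == binom_pmf(n, 1 - p, x)
```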