Discrete Probability Distributions

21 Terms

1

What is a probability mass function? (PMF)

This function gives the probability of each possible outcome for a discrete random variable.
f(xi) = P(X=xi), where every f(xi) ≥ 0 and the f(xi) sum to 1 over all possible values xi.
In short: you list the probability of each possible outcome of the discrete random variable occurring.
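
A tiny Python sketch of those conditions in practice (the four-outcome PMF is made up, just for illustration):

# Hypothetical PMF for a discrete random variable X with outcomes 1..4
pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

# Every f(xi) must be >= 0 and the probabilities must sum to 1
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-9

print(pmf[3])  # P(X = 3) = 0.3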

2

What is a cumulative distribution function?

The CDF gives the probability that the discrete random variable takes a value at or below x; it is the running sum of the PMF up to and including x.
F(x) = P(X≤x) = Σ f(xi) over all xi ≤ x
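
A quick Python sketch, reusing a made-up PMF, showing that the CDF is just the running sum of the PMF:

# Hypothetical PMF; F(x) = P(X <= x) is the running sum of the PMF
pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

def cdf(x):
    return sum(p for xi, p in pmf.items() if xi <= x)

print(cdf(2))  # P(X <= 2) = 0.1 + 0.2 = 0.3
print(cdf(4))  # P(X <= 4) = 1.0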

3

What is discrete uniform distribution?

A type of probability distribution in which each of a finite number of outcomes is equally likely, and the outcomes of the random variable form a set of consecutive integers, so the PMF is constant across all values.
P(X=x)=1/(b-a+1) where a is the lowest possible value of X, and b is the highest.
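
For instance, a fair six-sided die is discrete uniform with a = 1 and b = 6; a minimal Python sketch:

# Fair six-sided die: P(X = x) = 1/(b - a + 1) = 1/6 for each face
a, b = 1, 6
pmf = {x: 1 / (b - a + 1) for x in range(a, b + 1)}
print(pmf[3])  # 0.1666...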

4

What is the expected value of a discrete uniform distribution?

μ = (a+b)/2

5

What is the variance of a discrete uniform distribution?

σ² = ((b-a+1)² - 1)/12
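
A quick Python sanity check of the uniform mean and variance formulas against direct summation, using the die example (values chosen purely for illustration):

a, b = 1, 6
values = range(a, b + 1)
p = 1 / (b - a + 1)

# Compute the mean and variance directly from the PMF
mean_direct = sum(x * p for x in values)
var_direct = sum((x - mean_direct) ** 2 * p for x in values)

print(mean_direct, (a + b) / 2)                 # both 3.5
print(var_direct, ((b - a + 1) ** 2 - 1) / 12)  # both ~2.9167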

6

What is a Bernoulli Distribution?

A type of probability distribution that describes the outcome of a single binary yes/no experiment, where the discrete random variable in question has only two possible outcomes, success (1) or failure (0).
P(X=x) = p^x (1-p)^(1-x), where x is either 0 or 1, p is the probability of success, and 1-p is the probability of failure.
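
A minimal Python sketch of the Bernoulli PMF (p = 0.3 is just an example value):

def bernoulli_pmf(x, p):
    # P(X = x) = p^x * (1 - p)^(1 - x) for x in {0, 1}
    return p ** x * (1 - p) ** (1 - x)

print(bernoulli_pmf(1, 0.3))  # P(success) = 0.3
print(bernoulli_pmf(0, 0.3))  # P(failure) = 0.7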

7

What is the expected value of a Bernoulli Distribution?

μ = p

8

What is the variance of a Bernoulli Distribution?

σ² = p(1-p)

9

What is a Discrete Binomial Distribution?

This kind of probability distribution describes the number of successes in a fixed number of independent trials, where each trial follows a Bernoulli Distribution.
P(X=k) = C(n,k) p^k (1-p)^(n-k), where n is the number of trials, p is the probability of success (and 1-p of failure), k is the number of successes we want, and C(n,k) = n!/(k!(n-k)!) is the binomial coefficient ("n choose k").
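
A small Python sketch of the binomial PMF, using math.comb for the binomial coefficient (the n, k, p values are made up):

from math import comb

def binomial_pmf(k, n, p):
    # P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

print(binomial_pmf(3, 10, 0.5))  # probability of exactly 3 successes in 10 fair trials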

10

What is the expected value of a discrete binomial distribution?

μ = np

11

What is the variance of a discrete binomial distribution?

σ² = np(1-p)
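
A quick Python check that summing over the binomial PMF reproduces μ = np and σ² = np(1-p) (n = 10 and p = 0.3 are just example values):

from math import comb

n, p = 10, 0.3
pmf = [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

mean_direct = sum(k * pk for k, pk in enumerate(pmf))
var_direct = sum((k - mean_direct) ** 2 * pk for k, pk in enumerate(pmf))

print(mean_direct, n * p)           # both 3.0
print(var_direct, n * p * (1 - p))  # both 2.1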

12

What is discrete geometric distribution?

A type of probability distribution that models the number of trials needed to achieve the first success in a sequence of independent, identical trials, where each trial has the same fixed probability of success.
P(X=k) = p(1-p)^(k-1) where success occurs on the kth trial.
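
A minimal Python sketch of the geometric PMF (p = 0.25 is just an example value):

def geometric_pmf(k, p):
    # P(X = k) = p * (1 - p)^(k - 1): first success on the kth trial
    return p * (1 - p) ** (k - 1)

print(geometric_pmf(1, 0.25))  # 0.25
print(geometric_pmf(3, 0.25))  # 0.75 * 0.75 * 0.25 = 0.140625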

13

What is the mean of the discrete geometric distribution?

μ = 1/p

14

What is the variance of a discrete geometric distribution?

σ² = (1-p)/p²

15

What is a negative binomial distribution?

A type of probability distribution that models the number of trials needed to achieve a fixed number of successes, r.

P(X=k) = C(k-1, r-1) p^r (1-p)^(k-r), where the rth success occurs on the kth trial.
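
A small Python sketch of the negative binomial PMF (the r, k, p values are made up):

from math import comb

def neg_binomial_pmf(k, r, p):
    # P(X = k) = C(k - 1, r - 1) * p^r * (1 - p)^(k - r): rth success lands on the kth trial
    return comb(k - 1, r - 1) * p ** r * (1 - p) ** (k - r)

print(neg_binomial_pmf(5, 2, 0.3))  # probability the 2nd success occurs exactly on trial 5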

16

What is the expected value of a negative binomial distribution?

μ = r/p

17

What is the variance of a negative binomial distribution?

σ² = r(1-p)/p²

18

What is the Poisson Distribution?

A type of probability distribution that describes the likelihood of a given number of independent events occurring within a fixed interval of time or space.
P(X=k)= (λ^k e^(-λ))/k! where λ is the expected number of occurrences.
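
A minimal Python sketch of the Poisson PMF (λ = 3 is just an example value):

from math import exp, factorial

def poisson_pmf(k, lam):
    # P(X = k) = lam^k * e^(-lam) / k!
    return lam ** k * exp(-lam) / factorial(k)

print(poisson_pmf(2, 3.0))  # probability of exactly 2 events when 3 are expected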

19

What is the expected value/variance of a Poisson distribution?

Both equal λ: μ = σ² = λ.

20

What is the Poisson approximation of a Binomial Distribution?

It sounds scary but really isn't. We can use a Poisson distribution with λ = np to approximate a binomial distribution on the same values (see the sketch after this list) when:

  • n is larger than 20

  • p is smaller than 0.1

  • λ = np is a moderate size.
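
A quick Python sketch comparing exact binomial probabilities with the Poisson approximation for values that meet these conditions (n = 100, p = 0.02, so λ = 2; the numbers are illustrative only):

from math import comb, exp, factorial

n, p = 100, 0.02   # n > 20 and p < 0.1, so the approximation applies
lam = n * p        # lambda = np = 2

for k in range(5):
    exact = comb(n, k) * p ** k * (1 - p) ** (n - k)   # binomial P(X = k)
    approx = lam ** k * exp(-lam) / factorial(k)       # Poisson P(X = k)
    print(k, round(exact, 4), round(approx, 4))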

21

Instead of doing 1 - P(X=x) with geometric distribution, you can actually use the geometric summation formula. What is this, and why is it useful?

The geometric summation formula is useful when a geometric-distribution problem asks for the probability that the first success occurs somewhere within a range of trials, rather than on one specific kth trial. For example:
If there are 10 questions in total, what is the probability that the first correct answer is the answer to one of the last 8 questions?
For this question, you need to find the probability of achieving the first success on one of the 3rd to 10th questions.
The starting point is the cumulative probability:
P(X≤n) = Σ (from k=1 to n) p(1−p)^(k−1)
This is a geometric series with first term p and common ratio (1−p), so it sums to the closed form:
Sn = p(1 − (1−p)^n)/(1 − (1−p)) = 1 − (1−p)^n
So the probability that the first success falls between the kth and nth trials is P(X≤n) − P(X≤k−1) = (1−p)^(k−1) − (1−p)^n, with no need to add up each term one by one.
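
A quick Python check of the example above; the card doesn't fix a value of p, so p = 0.2 here is just an assumption:

# Hypothetical probability of answering any one question correctly
p = 0.2

# Closed form: P(3 <= X <= 10) = P(X <= 10) - P(X <= 2) = (1 - p)^2 - (1 - p)^10
closed_form = (1 - p) ** 2 - (1 - p) ** 10

# Direct summation of the geometric PMF over k = 3..10 as a sanity check
summation = sum(p * (1 - p) ** (k - 1) for k in range(3, 11))

print(closed_form, summation)  # both ~0.5326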