Lecture Notes on Probability and Decision-Making

Description and Tags

These flashcards cover key concepts and definitions related to probability, decision-making under uncertainty, and statistical inference.

Last updated 3:47 AM on 4/27/26

54 Terms

1
New cards

Probability

A numerical value that measures the likelihood that an event occurs.

2
New cards

Addition Rule

A probability rule that states the probability of the union of two events is the sum of their individual probabilities minus the probability of their intersection.

3
New cards

Multiplication Rule

A probability rule that states the probability of the intersection of two events is the product of the probability of one event and the conditional probability of the other event given the first.

4
New cards

Bayes’ Theorem

A formula used to update probabilities based on new evidence.

5
New cards

Frequentist Interpretation

Interprets a probability as the long-run proportion of times an event occurs over infinitely many identical trials.

6
New cards

Bayesian Interpretation

Assumes that probabilities are subjective beliefs about the relative likelihood of events.

7
New cards

Experiment

A process that leads to one of several outcomes.

8
New cards

Outcome

The result of an experiment.

9
New cards

Sample Space

The set of all possible outcomes of an experiment.

10
New cards

Event

A subset of the sample space.

11
New cards

Mutually Exclusive Events

Events that do not share any common outcomes.

12
New cards

Exhaustive Events

Events that include all outcomes in the sample space.

13
New cards

Venn Diagram

A visual representation of sets and their relationships.

14
New cards

Complement Rule

States that the probability of an event plus the probability of its complement equals 1.

15
New cards

Conditional Probability

The probability of an event given that another event has occurred.

16
New cards

Independent Events

Events where the occurrence of one does not affect the occurrence of the other.

17
New cards

Variance

A measure of the dispersion of a set of values around their mean.

18
New cards

Random Variable

A variable that assigns a numerical value to each outcome of a probabilistic experiment.

19
New cards

Expected Value

A measure of the central tendency of a random variable.

20
New cards

Discrete Uniform Distribution

A distribution where each outcome is equally likely.

21
New cards

Binomial Distribution

Describes the number of successes in a sequence of independent Bernoulli trials.
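The binomial probability can be computed directly from its PMF. A minimal Python sketch (the helper name `binomial_pmf` and the coin-flip numbers are illustrative, not from the notes):

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """P(X = k) for X ~ Binomial(n, p): C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability of exactly 3 heads in 10 fair coin flips
prob = binomial_pmf(3, 10, 0.5)
```

Summing the PMF over k = 0..n gives 1, which is a quick sanity check on the formula.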

22
New cards

Hypergeometric Distribution

Describes the number of successes in draws made without replacement from a finite population.

23
New cards

Poisson Distribution

Describes the number of occurrences of an event over a specified interval of time or space.

24
New cards

Probability Mass Function (PMF)

Function that gives the probability that a discrete random variable is exactly equal to some value.

25
New cards

Probability Density Function (PDF)

Function whose area under the curve over an interval gives the probability that a continuous random variable falls within that interval.

26
New cards

Standard Normal Distribution

A normal distribution with a mean of 0 and a standard deviation of 1.

27
New cards

Sampling Error

The difference between a sample statistic and the corresponding population parameter.

28
New cards

Statistical Inference

The process of drawing conclusions about a population based on sample data.

29
New cards

Central Limit Theorem (CLT)

States that the sampling distribution of the sample mean approaches a normal distribution as the sample size increases.
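The CLT can be illustrated by simulation: drawing many sample means from a non-normal population and checking that they cluster around the population mean with spread close to σ/√n. A sketch (the seed, sample size, and repetition count are arbitrary choices, not from the notes):

```python
import random
import statistics

random.seed(42)

# Population: Uniform(0, 1), which is not normal.
# Its mean is 0.5 and its standard deviation is sqrt(1/12).
n = 100      # sample size
reps = 2000  # number of independent samples

means = [statistics.fmean(random.random() for _ in range(n)) for _ in range(reps)]

center = statistics.fmean(means)   # should be close to 0.5
spread = statistics.stdev(means)   # should be close to sqrt(1/12) / sqrt(n)
```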

30
New cards

Unbiased Estimator

An estimator whose expected value equals the true parameter being estimated.

31
New cards

Sampling Distribution

The probability distribution of a statistic computed from repeated random samples of the same size drawn from a population.

32
New cards

Consistency

The property that ensures that as the sample size increases, the sample mean converges in probability to the population mean.

33
New cards

Precision

The property that, as the sample size increases, the variability of the sample mean decreases, making it a more reliable estimate of the population mean.

34
New cards

Expected Value of a Random Variable

Calculated as the sum of all possible values, each multiplied by the probability of its occurrence.

35
New cards

Standard Error of the Mean

The standard deviation of the sampling distribution of the sample mean.

36
New cards

Quantiles

Values that divide a probability distribution into segments of equal probability.

37
New cards

Joint Probability Table

A table that displays the probability of two events occurring simultaneously.

38
New cards

Marginal Probability

The probability of a single event occurring without regard to other variables.

39
New cards

Cumulative Distribution Function (CDF)

A function that describes the probability that a random variable takes on a value less than or equal to a specific value.

40
New cards

Hypothesis Testing

A statistical method to determine whether a hypothesis about a population parameter is likely true.

41
New cards

Type I Error

The error of rejecting a true null hypothesis.

42
New cards

Type II Error

The error of failing to reject a false null hypothesis.

43
New cards

Power of a Test

The probability that a test correctly rejects a false null hypothesis.

44
New cards

Probability

A numerical value that measures the likelihood that an event occurs. Formula: P(A) = \frac{\text{Number of favorable outcomes}}{\text{Total number of outcomes}}
Example Problem: If a die is rolled, what is the probability of rolling a 4?
Solution: There is 1 favorable outcome (rolling a 4) out of 6 total outcomes. Thus, P(4) = \frac{1}{6}.

45
New cards

Addition Rule

A probability rule that states the probability of the union of two events is the sum of their individual probabilities minus the probability of their intersection. Formula: P(A \cup B) = P(A) + P(B) - P(A \cap B)
Example Problem: If P(A) = 0.3 and P(B) = 0.5, with P(A ∩ B) = 0.1, what is P(A ∪ B)?
Solution: P(A \cup B) = 0.3 + 0.5 - 0.1 = 0.7.

46
New cards

Multiplication Rule

A probability rule that states the probability of the intersection of two events is the product of the probability of one event and the conditional probability of the other event given the first. Formula: P(A \cap B) = P(A) \times P(B \mid A)
Example Problem: If P(A) = 0.4 and P(B | A) = 0.5, what is P(A ∩ B)?
Solution: P(A \cap B) = 0.4 \times 0.5 = 0.2.

47
New cards

Conditional Probability

The probability of an event given that another event has occurred. Formula: P(A \mid B) = \frac{P(A \cap B)}{P(B)}
Example Problem: If P(A ∩ B) = 0.02 and P(B) = 0.1, what is P(A | B)?
Solution: P(A \mid B) = \frac{0.02}{0.1} = 0.2.

48
New cards

Bayes’ Theorem

A formula used to update probabilities based on new evidence. Formula: P(A \mid B) = \frac{P(B \mid A) \times P(A)}{P(B)}
Example Problem: If P(B | A) = 0.9, P(A) = 0.3, and P(B) = 0.5, what is P(A | B)?
Solution: P(A \mid B) = \frac{0.9 \times 0.3}{0.5} = 0.54.
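The Bayes' theorem calculation above is a one-line function; a minimal Python sketch (the function name `bayes` is illustrative):

```python
def bayes(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """Posterior P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Values from the worked example: P(B|A) = 0.9, P(A) = 0.3, P(B) = 0.5
posterior = bayes(0.9, 0.3, 0.5)
```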

49
New cards

Law of Total Probability

Relates marginal probabilities to conditional probabilities with a partition of the sample space. Formula: P(B) = \sum_i P(B \mid A_i) P(A_i), where \{A_i\} is a partition of the sample space.
Example Problem: If P(B | A1) = 0.2, P(B | A2) = 0.5, P(A1) = 0.4, and P(A2) = 0.6, what is P(B)?
Solution: P(B) = (0.2 \times 0.4) + (0.5 \times 0.6) = 0.08 + 0.30 = 0.38.
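The law of total probability is a weighted sum over the partition; a minimal Python sketch (the function name `total_probability` is illustrative):

```python
def total_probability(cond_probs, priors):
    """P(B) = sum_i P(B|A_i) * P(A_i) over a partition {A_i}."""
    return sum(pb * pa for pb, pa in zip(cond_probs, priors))

# Values from the worked example: P(B|A1) = 0.2, P(B|A2) = 0.5,
# P(A1) = 0.4, P(A2) = 0.6
p_b = total_probability([0.2, 0.5], [0.4, 0.6])
```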

50
New cards

Complement Rule

States that the probability of an event plus the probability of its complement equals 1. Formula: P(A) + P(A') = 1
Example Problem: If P(A) = 0.7, what is P(A')?
Solution: P(A') = 1 - 0.7 = 0.3.

51
New cards

Expected Value

A measure of the central tendency of a random variable. Formula: E(X) = \sum_i x_i P(x_i), where the x_i are all possible values.
Example Problem: If there are outcomes of 2 with probability 0.3, 4 with probability 0.5, and 6 with probability 0.2, what is E(X)?
Solution: E(X) = (2 \times 0.3) + (4 \times 0.5) + (6 \times 0.2) = 0.6 + 2.0 + 1.2 = 3.8.
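The expected-value sum translates directly to code; a minimal Python sketch (the function name `expected_value` is illustrative):

```python
def expected_value(values, probs):
    """E(X) = sum_i x_i * P(x_i)."""
    return sum(x * p for x, p in zip(values, probs))

# Values from the worked example
ev = expected_value([2, 4, 6], [0.3, 0.5, 0.2])
```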

52
New cards

Standard Error of the Mean

The standard deviation of the sampling distribution of the sample mean. Formula: SE = \frac{s}{\sqrt{n}}, where s is the sample standard deviation and n is the sample size.
Example Problem: If the standard deviation s = 10 and n = 25, what is SE?
Solution: SE = \frac{10}{\sqrt{25}} = \frac{10}{5} = 2.
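The standard-error formula in code; a minimal Python sketch (the function name `standard_error` is illustrative):

```python
from math import sqrt

def standard_error(s: float, n: int) -> float:
    """SE of the sample mean: s / sqrt(n)."""
    return s / sqrt(n)

# Values from the worked example: s = 10, n = 25
se = standard_error(10, 25)
```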

53
New cards

Cumulative Distribution Function (CDF)

A function that describes the probability that a random variable takes on a value less than or equal to a specific value. Formula: F(x) = P(X \leq x)
Example Problem: If a random variable X is distributed uniformly between 0 and 1, what is F(0.5)?
Solution: For a uniform distribution, F(0.5) = 0.5 (50% of the distribution lies below 0.5).

54
New cards

Power of a Test

The probability that a test correctly rejects a false null hypothesis. Formula: \text{Power} = 1 - P(\text{Type II error})
Example Problem: If the probability of a Type II error is 0.2, what is the power of the test?
Solution: \text{Power} = 1 - 0.2 = 0.8.