Study Notes on Expected Value and Decision-Making
Expected Value (EV)
Definition of Expected Value (EV):
The expected value is a key concept in statistics and probability, representing the probability-weighted average outcome of a random variable.
Mathematically, it is defined as:
EV = Σᵢ Pᵢ × xᵢ
Where Pᵢ represents the probability of outcome xᵢ.
General Formula:
More formally, the expected value of a discrete random variable can be expressed as:
EV(X) = Σᵢ P(xᵢ) × xᵢ
Where P(xᵢ) is the probability of outcome xᵢ. Note that the probabilities themselves do the weighting, so no separate 1/n averaging factor is needed.
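The sum above can be sketched as a small Python helper (the function name `expected_value` is our own, not from the notes):

```python
def expected_value(outcomes, probabilities):
    """Probability-weighted average of a discrete random variable."""
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return sum(p * x for p, x in zip(outcomes, probabilities))

# Example: a fair six-sided die, each face with probability 1/6.
print(expected_value([1, 2, 3, 4, 5, 6], [1/6] * 6))  # ≈ 3.5
```

Each outcome is weighted by its own probability; there is no division by the number of outcomes.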
Application of EV:
In decision-making contexts such as gambling or investment, the expected value helps evaluate the potential outcomes of different choices.
Example with Values
Win/Lose Scenario:
Win Value:
Value = 300
Lose Value:
Value = -100
Probabilities:
Probability of winning = 20%
Probability of losing = 80%
Expected value of the gamble:
EV = 0.20 × 300 + 0.80 × (-100) = 60 - 80 = -20
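With these numbers the gamble can be evaluated directly; a minimal sketch (variable names are ours):

```python
# Win/lose scenario: 20% chance of +300, 80% chance of -100.
p_win, p_lose = 0.20, 0.80
win_value, lose_value = 300, -100

ev = p_win * win_value + p_lose * lose_value
print(ev)  # roughly -20: the gamble loses 20 per play on average
```

Despite the large prize, the high probability of losing makes the expected value negative.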
Expected Utility (EU):
Expected utility is calculated in the same way as expected value, using the utility (here, simply the monetary value) of each outcome.
In this case, consider two choices based on given probabilities:
Choices Context
Choice A:
Guaranteed value of 70k (where k denotes a thousand). Hence,
EU(A) = 70k
Choice B:
A 70% chance at 100k and a 30% chance at 0k.
Calculation for the expected value of Choice B:
EV(B) = 0.70 × 100k + 0.30 × 0k
Which simplifies to:
EV(B) = 70k
Comparative Analysis:
Comparing the two options, both hold the same expected value of 70k.
However, their variance and risk profiles differ: Choice A is risk-free, while Choice B carries uncertainty, since its outcome varies between 0k and 100k.