Study Notes on Expected Value and Decision-Making

Expected Value (EV)

  • Definition of Expected Value (EV):

    • The expected value is a key concept in statistics and probability, representing the average outcome of a random variable.

    • Mathematically defined as:
      EV(X) = P1·x1 + P2·x2 + P3·x3 + … + Pn·xn

    • Where Pi represents the probability of outcome xi.

  • General Formula:

    • More formally, the expected value of a discrete random variable is the probability-weighted sum over all n outcomes:
      EV(X) = Σ P(xi) · xi, for i = 1, …, n

    • Where P(xi) is the probability of outcome xi.

  • Application of EV:

    • In decision-making contexts such as gambling or investment, the expected value helps evaluate the potential outcomes of different choices.
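The formula above translates directly into code. A minimal sketch in Python (the function name and the (probability, value) pair representation are my own choices, not from the notes):

```python
def expected_value(outcomes):
    """Expected value of a discrete random variable.

    `outcomes` is a list of (probability, value) pairs whose
    probabilities sum to 1.
    """
    return sum(p * x for p, x in outcomes)

# Example: a fair six-sided die, each face 1..6 with probability 1/6.
die_ev = expected_value([(1 / 6, face) for face in range(1, 7)])
print(die_ev)  # ≈ 3.5
```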

Example with Values

  • Win/Lose Scenario:

    • Win value = 300

    • Lose value = -100

    • Probability of winning = 20%

    • Probability of losing = 80%
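Plugging these numbers into the EV formula shows the bet is unfavorable on average (a quick sketch; the variable names are my own):

```python
p_win, win_value = 0.20, 300
p_lose, lose_value = 0.80, -100

# EV = 0.2 * 300 + 0.8 * (-100) = 60 - 80 = -20
ev = p_win * win_value + p_lose * lose_value
print(ev)  # -20.0
```

A negative expected value means that, repeated many times, this gamble loses about 20 per play on average.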

  • Expected Utility (EU):

    • Expected utility is calculated like expected value, but weights the utility of each outcome rather than its raw value.

    • In this case, consider two choices based on given probabilities:

Choices Context

  • Choice A:

    • A guaranteed value of 70k (k = one thousand). Hence,

    • EU(A) = 70,000 × 1 = 70,000

  • Choice B:

    • A 70% chance at 100k and a 30% chance at 0. Hence,

    • Calculation for expected value of choice B:

      • EV(B) = 100,000 × 0.7 + 0 × 0.3

      • Which simplifies to:

      • EV(B) = 70,000

  • Comparative Analysis:

    • Both options hold the same expected value of 70k.

    • However, variance and risk profiles differ between the two choices, with Choice A being risk-free and Choice B possessing an element of uncertainty.
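The difference in risk can be made concrete by computing the variance of each choice alongside its expected value (a sketch using the same (probability, value) pair representation as above; function names are my own):

```python
def expected_value(outcomes):
    """Probability-weighted mean of (probability, value) pairs."""
    return sum(p * x for p, x in outcomes)

def variance(outcomes):
    """Probability-weighted squared deviation from the mean."""
    mu = expected_value(outcomes)
    return sum(p * (x - mu) ** 2 for p, x in outcomes)

choice_a = [(1.0, 70_000)]                 # guaranteed 70k
choice_b = [(0.7, 100_000), (0.3, 0)]      # 70% at 100k, 30% at 0

for name, choice in [("A", choice_a), ("B", choice_b)]:
    print(name, expected_value(choice), variance(choice))
```

Both choices print an expected value of 70,000, but Choice A has zero variance while Choice B has a large one (2.1 × 10⁹, i.e. a standard deviation of roughly 45.8k), which is exactly the risk difference the notes describe.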