Econ 120A Discrete Probability Distributions

Discrete Probability Distributions

Mean and Variance

  • Population Mean (\mu):
    • Formula: \mu = \sum x p(x)
    • This is the average value of the random variable X, weighted by the probabilities of each value.
  • Population Variance (\sigma^2):
    • Formula: \sigma^2 = \sum (x - \mu)^2 p(x)
    • This measures the spread or dispersion of the random variable X around its mean.

Example: Errors in Economics Textbooks

  • X = number of errors per page in Economics textbooks
      x     p(x)    x·p(x)
      0     0.81    0
      1     0.17    0.17
      2     0.02    0.04
      Sum   1.00    0.21
  • Mean: \mu = \sum x p(x) = 0.21 errors per page
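
As a quick illustration (not part of the lecture), here is a minimal Python sketch that stores the table above as a dictionary and computes the mean; the variable name `pmf` is my own choice.

```python
# Minimal sketch: the errors-per-page pmf from the table above.
pmf = {0: 0.81, 1: 0.17, 2: 0.02}

# Population mean: mu = sum of x * p(x) over all values x.
mu = sum(x * p for x, p in pmf.items())

print(sum(pmf.values()))  # ≈ 1.0  (probabilities sum to one)
print(mu)                 # ≈ 0.21 (average number of errors per page)
```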

Variance Decomposition

  • Formula:
    • \sigma^2 = \sum (x - \mu)^2 p(x)
    • Expanding the square:
      \sigma^2 = \sum (x^2 - 2\mu x + \mu^2) p(x)
    • Distributing p(x):
      \sigma^2 = \sum x^2 p(x) - 2\mu \sum x p(x) + \mu^2 \sum p(x)
    • Since \mu = \sum x p(x) and \sum p(x) = 1:
      \sigma^2 = \sum x^2 p(x) - 2\mu^2 + \mu^2
    • Simplified formula:
      \sigma^2 = \sum x^2 p(x) - \mu^2

Example Calculation

  • Using the errors in economics textbooks example:
    • Var(X) = \sum x^2 p(x) - \mu^2 = 0.25 - (0.21)^2 = 0.2059
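
A short sketch, reusing the same pmf dictionary as above, that checks the definitional formula \sum (x - \mu)^2 p(x) and the shortcut \sum x^2 p(x) - \mu^2 give the same number:

```python
# Sketch: compute the variance of the errors-per-page example two ways.
pmf = {0: 0.81, 1: 0.17, 2: 0.02}
mu = sum(x * p for x, p in pmf.items())                           # ≈ 0.21

var_definition = sum((x - mu) ** 2 * p for x, p in pmf.items())   # sum (x - mu)^2 p(x)
var_shortcut = sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2  # sum x^2 p(x) - mu^2

print(var_definition, var_shortcut)   # both ≈ 0.2059
```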

Expectations Operator

  • E(.) Operator:
    • Takes the weighted average of (.), where the weights are the probabilities.
    • E(X) = \sum x p(x) = \mu_X
  • Expected Value:
    • The expected value of a random variable is the weighted average of its possible values, weighted by their probabilities of occurring.
    • Note that this is the same as the mean.
    • The expected value of random variable X is its mean.

Expected Value of Functions of Random Variables

  • If R = g(X), then
    • E(R) = E[g(X)] = \sum g(x) p(x) = \mu_R
    • The expected value of random variable R (a function of the random variable X) is also its mean.
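
To make the E(.) operator concrete, here is a minimal sketch with a generic helper applied to the errors-per-page example; the function name `expectation` is mine, not from the course.

```python
# Sketch of the E[.] operator for a discrete pmf given as {value: probability}.
def expectation(g, pmf):
    """Return E[g(X)] = sum of g(x) * p(x) over all values x."""
    return sum(g(x) * p for x, p in pmf.items())

pmf = {0: 0.81, 1: 0.17, 2: 0.02}          # errors-per-page example

print(expectation(lambda x: x, pmf))       # E(X)   ≈ 0.21
print(expectation(lambda x: x ** 2, pmf))  # E(X^2) ≈ 0.25
```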

Example Continued

  • Calculating E(X^2)
    • Given that X takes the values x_1, x_2, \ldots, x_n
    • E(X^2) = \sum x^2 p(x) = 0^2(0.81) + 1^2(0.17) + 2^2(0.02) = 0.25
    • Note that E(X^2) = 0.25 \neq [E(X)]^2 = (0.21)^2 = 0.0441

Expected Value of Squared Deviations from the Mean

  • If g(X) = (X - \mu_X)^2
    • E[g(X)] = \sum g(x) p(x) = \sum (x - \mu_X)^2 p(x) = \sigma_X^2
    • Thus, E[(X - \mu_X)^2] = \sigma_X^2
    • The expected value of the squared deviation from the mean of the random variable X is its variance, just like the expected value of X is its mean.

Useful Rules for Expectations

  • If c is any constant:
    1. E(c) = c
    2. E(X + c) = E(X) + E(c) = E(X) + c
    3. E(cX) = c E(X)
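
A rough numerical check of rules 2 and 3 with the errors-per-page pmf; the constant c = 5 is an arbitrary choice for illustration.

```python
# Sketch: verify E(X + c) = E(X) + c and E(cX) = c E(X) numerically.
pmf = {0: 0.81, 1: 0.17, 2: 0.02}
c = 5                                            # arbitrary constant

E_X = sum(x * p for x, p in pmf.items())         # ≈ 0.21

E_X_plus_c = sum((x + c) * p for x, p in pmf.items())
print(E_X_plus_c, E_X + c)                       # both ≈ 5.21

E_cX = sum(c * x * p for x, p in pmf.items())
print(E_cX, c * E_X)                             # both ≈ 1.05
```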

Alternative Expression of Variance

  • Using the expectations operator:
    • \sigma_X^2 = E[(X - \mu_X)^2] = E[X^2 - 2X\mu_X + \mu_X^2]
    • Applying linearity of expectation:
      = E[X^2] + E[-2X\mu_X] + E[\mu_X^2]
    • = E[X^2] - 2\mu_X E[X] + E[\mu_X^2]
    • = E[X^2] - 2\mu_X \mu_X + \mu_X^2 (since \mu_X is a constant, E[X] = \mu_X and E[\mu_X^2] = \mu_X^2)
    • = E[X^2] - \mu_X^2
  • Thus,
    • \sigma_X^2 = E[(X - \mu_X)^2] = E[X^2] - \mu_X^2

Summary

  • Population mean of random variable X:
    • \mu_X = \sum x p(x) = E(X)
  • Population variance of random variable X:
    • \sigma_X^2 = \sum (x - \mu_X)^2 p(x) = \sum x^2 p(x) - \mu_X^2
    • = E[(X - \mu_X)^2] = E[X^2] - \mu_X^2

Useful Rules (Revisited)

  • If c is any constant:
    1. E(c) = c
    2. E(X + c) = E(X) + E(c) = E(X) + c
    3. E(cX) = c E(X)
    4. Var(X + c) = Var(X)
    5. Var(cX) = c^2 Var(X)

Rules Explained

  1. Expected value of a constant (c):
    • E[c] = c
  2. Expected value of the sum (X + c):
    • E[X + c] = \sum (x + c) p(x) = \sum x p(x) + \sum c p(x) = E[X] + c \sum p(x)
    • Since \sum p(x) = 1:
      E[X + c] = E[X] + c = \mu_X + c
  3. Expected value of the product (cX):
    • E[cX] = \sum c x p(x) = c \sum x p(x) = c E[X] = c \mu_X
  4. Variance of the sum (X + c):
    • Var(X + c) = \sum ((x + c) - (\mu_X + c))^2 p(x) = \sum (x - \mu_X)^2 p(x) = Var(X)
  5. Variance of the product (cX):
    • Var(cX) = \sum (cx - c\mu_X)^2 p(x) = c^2 \sum (x - \mu_X)^2 p(x) = c^2 Var(X)
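
The same kind of numerical check works for rules 4 and 5; again the pmf is the errors-per-page example and c = 5 is an arbitrary constant.

```python
# Sketch: verify Var(X + c) = Var(X) and Var(cX) = c^2 Var(X) numerically.
pmf = {0: 0.81, 1: 0.17, 2: 0.02}
c = 5                                                      # arbitrary constant

def variance(pmf):
    """Var(X) = sum of (x - mu)^2 * p(x) for a pmf given as {value: probability}."""
    mu = sum(x * p for x, p in pmf.items())
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

var_X = variance(pmf)                                      # ≈ 0.2059

# Rule 4: shifting every value by c shifts the mean by c, leaving deviations unchanged.
pmf_shifted = {x + c: p for x, p in pmf.items()}
print(variance(pmf_shifted), var_X)                        # both ≈ 0.2059

# Rule 5: scaling every value by c scales each deviation by c, so the variance scales by c^2.
pmf_scaled = {c * x: p for x, p in pmf.items()}
print(variance(pmf_scaled), c ** 2 * var_X)                # both ≈ 5.1475
```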