Expectations and Linearity Notes

Linearity Property with Multiple Random Variables

  • The expected value of the sum of two random variables is the sum of their expectations.
    E[X + Y] = E[X] + E[Y]

Derivation

  • We want to find E[g(X, Y)], where g(x, y) = x + y.
  • By the expected value rule:
    E[g(X, Y)] = \sum_x \sum_y g(x, y) \cdot P_{X,Y}(x, y)
  • In our case:
    E[X + Y] = \sum_x \sum_y (x + y) \cdot P_{X,Y}(x, y)

Breaking Down the Sum

  • Separate the double summation into two parts:
    \sum_x \sum_y x \cdot P_{X,Y}(x, y) + \sum_x \sum_y y \cdot P_{X,Y}(x, y)
  • In the first term, x is constant with respect to the inner sum over y. Thus:
    \sum_x x \sum_y P_{X,Y}(x, y) + \sum_x \sum_y y \cdot P_{X,Y}(x, y)
  • The inner sum is the marginal PMF of X:
    \sum_y P_{X,Y}(x, y) = P_X(x)
  • So the first term simplifies to:
    \sum_x x \cdot P_X(x)
  • Similarly, after swapping the order of summation in the second term, y is constant with respect to the inner sum over x, and that inner sum is the marginal PMF P_Y(y). The second term simplifies to:
    \sum_y y \cdot P_Y(y)
  • These are the expected values of X and Y, respectively, thus:
    E[X] + E[Y]
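
The derivation above can be checked numerically. The joint PMF below is a small made-up example (values chosen only for illustration); the expected value rule applied to x + y gives the same number as computing E[X] and E[Y] separately and adding them:

```python
# P_{X,Y}(x, y) for a small hypothetical joint PMF (illustrative values).
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# E[X + Y] via the expected value rule: sum (x + y) * P_{X,Y}(x, y).
e_sum = sum((x + y) * p for (x, y), p in joint.items())

# E[X] and E[Y] computed separately (summing over the joint PMF
# is the same as using the marginals P_X and P_Y).
e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())

print(e_sum)        # E[X + Y]
print(e_x + e_y)    # E[X] + E[Y] -- matches e_sum
```

Note that nothing here assumed X and Y are independent; linearity holds for any joint PMF.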

Generalization

  • The linearity property extends to any finite number of random variables. Thus:
    E[X_1 + X_2 + … + X_n] = E[X_1] + E[X_2] + … + E[X_n]

  • For expressions like:
    E[2X + 3Y - Z]

  • We can apply linearity:
    E[2X] + E[3Y] - E[Z]

  • And then:
    2E[X] + 3E[Y] - E[Z]
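
An empirical sketch of this identity: draw samples of X, Y, Z (here Y is deliberately made dependent on X to stress that linearity needs no independence assumption; the distributions are arbitrary choices for illustration) and compare the two sides:

```python
import random

random.seed(0)
n = 100_000
xs, ys, zs = [], [], []
for _ in range(n):
    x = random.randint(1, 6)       # a die roll
    y = x + random.randint(0, 1)   # deliberately dependent on x
    z = random.gauss(2.0, 1.0)
    xs.append(x); ys.append(y); zs.append(z)

def mean(v):
    return sum(v) / len(v)

# Sample-average versions of E[2X + 3Y - Z] and 2E[X] + 3E[Y] - E[Z].
lhs = mean([2 * x + 3 * y - z for x, y, z in zip(xs, ys, zs)])
rhs = 2 * mean(xs) + 3 * mean(ys) - mean(zs)
print(lhs, rhs)  # agree up to floating-point rounding
```

The two sides agree because averaging is itself a linear operation, mirroring the linearity of expectation.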

Application: Mean of a Binomial Random Variable

  • Let X be a binomial random variable with parameters n and p.

  • X represents the number of successes in n independent trials, each with probability p of success.

  • The PMF of a binomial is:
    P_X(k) = \binom{n}{k} p^k (1 - p)^{n - k}, for k = 0, 1, …, n

  • Directly computing the expected value from the PMF means evaluating \sum_k k \binom{n}{k} p^k (1 - p)^{n - k}, which is tedious.

Indicator Variables

  • Define indicator random variables X_i, where:
    • X_i = 1 if the i-th trial is a success.
    • X_i = 0 otherwise.
  • The total number of successes can be written as:
    X = X_1 + X_2 + … + X_n

Using Linearity

  • By linearity of expectations:
    E[X] = E[X_1] + E[X_2] + … + E[X_n]
  • Each E[X_i] is the expected value of a Bernoulli random variable, which is p.
  • Therefore:
    E[X] = n \cdot p
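
The indicator construction can be sketched directly: build each binomial draw as a sum of n Bernoulli(p) indicators and compare the sample mean against n · p (the trial count and seed below are arbitrary choices for the sketch):

```python
import random

random.seed(1)
n, p = 100, 0.5       # binomial parameters
trials = 20_000       # number of binomial draws to average

def binomial_draw(n, p):
    # X = X_1 + ... + X_n, each X_i an indicator that is 1 with
    # probability p (trial i is a success) and 0 otherwise.
    return sum(1 if random.random() < p else 0 for _ in range(n))

sample_mean = sum(binomial_draw(n, p) for _ in range(trials)) / trials
print(sample_mean, n * p)  # sample mean is close to n * p = 50
```

No combinatorial sums over the binomial PMF were needed; linearity reduces the problem to n identical Bernoulli expectations.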

Intuition

  • If p = 0.5 and we toss a coin 100 times, we expect 50 heads.
  • The expected number of successes increases with p and n.

Conclusion

  • Linearity of expectations simplifies problems by breaking them into smaller pieces.
  • It's a tool for analyzing complicated random variables by decomposing them into simpler ones.