Week 13: Central Limit Theorem and Moment-Generating Functions
Weak Law of Large Numbers
- Setup: Let $X_1, X_2, \dots$ be a sequence of i.i.d. random variables with $\mathbb{E}(X_i) = \mu$ and $\mathrm{Var}(X_i) = \sigma^2 < \infty$.
- Define the sample mean: $\bar{X}_n = \frac{1}{n} \sum_{i=1}^n X_i$.
- Weak law statement: For any $\varepsilon > 0$,
$$\lim_{n\to\infty} \Pr\left(|\bar{X}_n - \mu| > \varepsilon\right) = 0.$$
- Proof sketch (Chebyshev): Since $\mathbb{E}(\bar{X}_n) = \mu$ and $\mathrm{Var}(\bar{X}_n) = \sigma^2/n$, Chebyshev's inequality gives
$$\Pr\left(|\bar{X}_n - \mu| > \varepsilon\right) \le \frac{\mathrm{Var}(\bar{X}_n)}{\varepsilon^2} = \frac{\sigma^2}{n\varepsilon^2} \to 0 \quad \text{as } n \to \infty.$$
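The Chebyshev bound above can be checked by simulation. A minimal sketch, using fair-die rolls ($\mu = 3.5$, $\sigma^2 = 35/12$) and illustrative choices of $\varepsilon$ and trial counts:

```python
import random

random.seed(0)

def deviation_prob(n, eps=0.25, trials=1000):
    """Empirical P(|sample mean - mu| > eps) for n fair-die rolls (mu = 3.5)."""
    mu = 3.5
    count = 0
    for _ in range(trials):
        mean = sum(random.randint(1, 6) for _ in range(n)) / n
        if abs(mean - mu) > eps:
            count += 1
    return count / trials

# Variance of one die roll: E(X^2) - mu^2 = 91/6 - 12.25 = 35/12
sigma2 = sum(x * x for x in range(1, 7)) / 6 - 3.5 ** 2

for n in (50, 200, 800):
    emp = deviation_prob(n)
    bound = sigma2 / (n * 0.25 ** 2)  # Chebyshev: sigma^2 / (n * eps^2)
    print(n, round(emp, 3), round(bound, 3))
```

The empirical deviation probability shrinks with $n$ and stays below the Chebyshev bound, exactly as the proof sketch predicts.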
Strong Law of Large Numbers
- Setup: Let $X_1, X_2, \dots$ be i.i.d. with $\mathbb{E}(X_i) = \mu$ and $\mathbb{E}(|X_i|) < \infty$.
- Define $\bar{X}_n = \frac{1}{n} \sum_{i=1}^n X_i$.
- Strong law statement:
$$\Pr\left(\lim_{n\to\infty} \bar{X}_n = \mu\right) = 1, \quad \text{i.e. } \bar{X}_n \to \mu \text{ almost surely.}$$
- Interpretation: With probability 1, the entire sample path of $\bar{X}_n$ converges to $\mu$; this is a stronger claim than the WLLN, which only controls the deviation at each fixed $n$.
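The path-wise nature of the SLLN can be illustrated by tracking the whole running mean, not just its final value. A minimal sketch with fair coin flips ($\mu = 0.5$); the path length and window are illustrative choices:

```python
import random

random.seed(1)

# Running sample mean of fair coin flips (values 0/1, mu = 0.5).
# The SLLN says the whole path of running means settles at mu,
# not merely that the mean at one large n is close to mu.
total = 0
path = []
for n in range(1, 100_001):
    total += random.randint(0, 1)
    path.append(total / n)

# Largest deviation over the entire tail of the path (n >= 50,000).
tail_dev = max(abs(m - 0.5) for m in path[50_000:])
print(round(path[-1], 4), round(tail_dev, 4))
```

That the *maximum* deviation over the whole tail is small is the almost-sure statement; the WLLN alone would only bound the deviation at one $n$ at a time.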
Central Limit Theorem (CLT): introduction
- Setup: $X_1, X_2, \dots$ i.i.d. with mean $\mu$ and variance $\sigma^2$; let $S_n = \sum_{i=1}^n X_i$ and $\bar{X}_n = S_n/n$.
- Classical normalization:
$$Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}} = \frac{\sqrt{n}\,(\bar{X}_n - \mu)}{\sigma}.$$
- CLT: $Z_n$ converges in distribution to the standard normal $N(0,1)$:
$$\lim_{n\to\infty} \Pr\left(\frac{S_n - n\mu}{\sigma\sqrt{n}} \le x\right) = \Phi(x), \qquad -\infty < x < \infty.$$
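The convergence $\Pr(Z_n \le x) \to \Phi(x)$ can be seen numerically. A minimal sketch using Uniform$(0,1)$ summands ($\mu = 1/2$, $\sigma^2 = 1/12$); the choices $n = 30$, $x = 1$, and the sample size are illustrative:

```python
import math
import random

random.seed(2)

def standardized_sum(n):
    """Z_n = (S_n - n*mu) / (sigma * sqrt(n)) for Uniform(0,1) summands."""
    s = sum(random.random() for _ in range(n))
    return (s - n * 0.5) / math.sqrt(n / 12)

samples = [standardized_sum(30) for _ in range(20_000)]
frac = sum(z <= 1.0 for z in samples) / len(samples)   # empirical P(Z_n <= 1)
phi1 = 0.5 * (1 + math.erf(1 / math.sqrt(2)))          # Phi(1), about 0.8413
print(round(frac, 3), round(phi1, 3))
```

Even for a modest $n$, the empirical CDF of $Z_n$ at $x = 1$ is close to $\Phi(1)$, regardless of the fact that each summand is uniform rather than normal.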
Central Limit Theorem (mgf form)
- (mgf condition): The mgf $M_X(t) = \mathbb{E}(e^{tX})$ exists in a neighbourhood of $0$.
- With $M$ the mgf of $X$ existing in a neighbourhood of $0$, the CLT can also be stated as: $M_{Z_n}(t) \to e^{t^2/2}$ for every $t$ in a neighbourhood of $0$, which implies $Z_n \xrightarrow{d} N(0,1)$.
Moment-Generating Function (MGF)
- Definition: The MGF of a random variable $X$ is $M_X(t) = \mathbb{E}(e^{tX})$.
- Existence: $M_X(t)$ may or may not be finite for a given $t$; if it exists in a neighbourhood of $t = 0$, it can be used to generate moments.
- Discrete form: if $X$ takes values in a discrete set with pmf $p(x)$, then $M_X(t) = \sum_x e^{tx}\, p(x)$.
- Continuous form: if $X$ has pdf $f(x)$, then $M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx$.
Properties of MGFs
- The mgf, when it exists in a neighbourhood of $0$, uniquely determines the distribution.
- Derivatives at zero give moments: $M_X^{(k)}(0) = \mathbb{E}(X^k)$ for $k = 1, 2, \dots$; in particular $M_X'(0) = \mathbb{E}(X)$ and $\mathrm{Var}(X) = M_X''(0) - [M_X'(0)]^2$.
- Transformation: If $Y = a + bX$, then $M_Y(t) = e^{at} M_X(bt)$.
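The moment-generating property can be checked numerically by differentiating an mgf at $0$ with finite differences. A minimal sketch using the Exponential$(\lambda)$ mgf $\lambda/(\lambda - t)$ with the illustrative choice $\lambda = 2$ (so $\mathbb{E}(X) = 1/2$ and $\mathrm{Var}(X) = 1/4$):

```python
def exp_mgf(t, lam=2.0):
    """mgf of Exponential(lam): lam / (lam - t), valid for t < lam."""
    return lam / (lam - t)

h = 1e-4
# Central differences approximate M'(0) = E(X) and M''(0) = E(X^2).
m1 = (exp_mgf(h) - exp_mgf(-h)) / (2 * h)
m2 = (exp_mgf(h) - 2 * exp_mgf(0) + exp_mgf(-h)) / h ** 2
var = m2 - m1 ** 2  # Var(X) = M''(0) - M'(0)^2
print(round(m1, 4), round(var, 4))
```

The numerical derivatives recover $\mathbb{E}(X) = 1/\lambda$ and $\mathrm{Var}(X) = 1/\lambda^2$, matching the exponential moments listed below.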
MGFs of common distributions
Binomial$(n,p)$:
$$M_X(t) = (1 - p + pe^t)^n.$$
- Moments: $\mathbb{E}(X) = M_X'(0) = np$.
- Variance: $\mathrm{Var}(X) = np(1-p)$.
Poisson$(\lambda)$:
$$M_X(t) = e^{\lambda(e^t - 1)}.$$
- Moments: $\mathbb{E}(X) = \lambda$, $\mathbb{E}(X^2) = \lambda + \lambda^2$.
- Variance: $\mathrm{Var}(X) = \lambda$.
Exponential$(\lambda)$:
$$M_X(t) = \frac{\lambda}{\lambda - t}, \quad t < \lambda.$$
- Moments: $\mathbb{E}(X) = 1/\lambda$.
- Variance: $\mathrm{Var}(X) = 1/\lambda^2$.
Standard Normal $N(0,1)$:
$$M_X(t) = e^{t^2/2}.$$
- Moments: $\mathbb{E}(X) = 0$.
- Variance: $\mathrm{Var}(X) = 1$.
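Each of these closed forms is just the expectation $\mathbb{E}(e^{tX})$ evaluated in closed form. A minimal sketch verifying the binomial case by summing the discrete-form mgf over the pmf; the parameter values are illustrative:

```python
import math

def binom_pmf(k, n, p):
    """Binomial(n, p) pmf at k."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

n, p, t = 10, 0.3, 0.5
# Direct expectation E(e^{tX}) over the pmf ...
direct = sum(math.exp(t * k) * binom_pmf(k, n, p) for k in range(n + 1))
# ... versus the closed form (1 - p + p e^t)^n.
closed = (1 - p + p * math.exp(t)) ** n
print(round(direct, 8), round(closed, 8))
```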
MGF for general Normal distribution
- If $X \sim N(\mu, \sigma^2)$, then $M_X(t) = e^{\mu t + \sigma^2 t^2/2}$.
- Derivatives: $M_X'(t) = (\mu + \sigma^2 t)\, M_X(t)$, so $M_X'(0) = \mu$ and $M_X''(0) = \mu^2 + \sigma^2$.
- Moments: $\mathbb{E}(X) = \mu$, $\mathbb{E}(X^2) = \mu^2 + \sigma^2$.
- Variance: $\mathrm{Var}(X) = \sigma^2$.
Basic normal transformation result
- If $Y = a + bX$ with $X \sim N(\mu, \sigma^2)$, then $Y \sim N(a + b\mu, b^2 \sigma^2)$, as implied by the mgf identity $M_Y(t) = e^{at} M_X(bt) = e^{(a + b\mu)t + b^2\sigma^2 t^2/2}$.
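The identity $M_Y(t) = e^{at} M_X(bt)$ matching the mgf of $N(a + b\mu, b^2\sigma^2)$ can be confirmed numerically. A minimal sketch with illustrative parameter values:

```python
import math

def normal_mgf(t, mu, sigma2):
    """mgf of N(mu, sigma2): exp(mu*t + sigma2*t^2/2)."""
    return math.exp(mu * t + sigma2 * t * t / 2)

mu, sigma2, a, b, t = 1.0, 4.0, 3.0, -2.0, 0.37
lhs = math.exp(a * t) * normal_mgf(b * t, mu, sigma2)  # e^{at} M_X(bt)
rhs = normal_mgf(t, a + b * mu, b * b * sigma2)        # mgf of N(a + b*mu, b^2*sigma2)
print(round(lhs, 6), round(rhs, 6))
```

The two expressions agree for any $t$, which is exactly the mgf-uniqueness argument for the transformation result.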
Sum of independent random variables (MGF property)
- Theorem: If $X$ and $Y$ are independent with mgfs $M_X$ and $M_Y$, and $Z = X + Y$, then
$$M_Z(t) = M_X(t)\, M_Y(t)$$
on the common interval where both mgfs exist.
- Sketch of proof:
$$M_Z(t) = \mathbb{E}\!\left(e^{t(X+Y)}\right) = \mathbb{E}\!\left(e^{tX} e^{tY}\right) = \mathbb{E}\!\left(e^{tX}\right)\mathbb{E}\!\left(e^{tY}\right) = M_X(t)\, M_Y(t),$$
using independence in the third equality.
Examples: Sum of independent Poisson and Normal distributions
Poisson + Poisson: If $X\sim\text{Poisson}(\lambda)$ and $Y\sim\text{Poisson}(\mu)$ are independent, then $X+Y \sim \text{Poisson}(\lambda+\mu)$.
- MGFs: $M_{X+Y}(t) = e^{\lambda(e^t - 1)}\, e^{\mu(e^t - 1)} = e^{(\lambda + \mu)(e^t - 1)}$, which is the mgf of Poisson$(\lambda + \mu)$.
Normal + Normal: If $X \sim N(\mu, \sigma^2)$ and $Y \sim N(\nu, \tau^2)$ are independent, then $X + Y \sim N(\mu + \nu, \sigma^2 + \tau^2)$.
- MGFs: $M_{X+Y}(t) = e^{\mu t + \sigma^2 t^2/2}\, e^{\nu t + \tau^2 t^2/2} = e^{(\mu + \nu)t + (\sigma^2 + \tau^2)t^2/2}$.
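The Poisson + Poisson conclusion can also be verified directly on the pmfs, without mgfs, via the convolution formula that independence justifies. A minimal sketch with illustrative rates:

```python
import math

def pois_pmf(k, lam):
    """Poisson(lam) pmf at k."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam, mu, k = 1.5, 2.5, 4
# Convolution: P(X + Y = k) = sum_j P(X = j) P(Y = k - j), by independence.
conv = sum(pois_pmf(j, lam) * pois_pmf(k - j, mu) for j in range(k + 1))
# The mgf product predicts X + Y ~ Poisson(lam + mu).
direct = pois_pmf(k, lam + mu)
print(round(conv, 8), round(direct, 8))
```

The convolution sum and the Poisson$(\lambda + \mu)$ pmf agree, consistent with the mgf argument above.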
Remarks about mgfs
- Joint mgf: For random variables $(X,Y)$, the joint mgf is $M_{X,Y}(s,t) = \mathbb{E}\!\left(e^{sX + tY}\right)$.
- Independence criterion via mgfs: $X$ and $Y$ are independent iff $M_{X,Y}(s,t) = M_X(s)\, M_Y(t)$ for all $(s,t)$ in a neighbourhood of $(0,0)$.
- Limitation: The mgf may not exist for all distributions or all $t$; this limits applicability.
Central Limit Theorem (recall) and mgf assumptions
- Recall: If $X_1, X_2, \dots$ are i.i.d. with mean $\mu$ and variance $\sigma^2$, and the mgf exists in a neighbourhood of zero, then with $S_n = \sum_{i=1}^n X_i$,
$$\lim_{n\to\infty} \Pr\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) = \Phi(x), \quad -\infty < x < \infty.$$
- The mgf existence assumption is a strong one; there are many versions of the CLT.
Continuity Theorem (convergence of mgfs implies distribution convergence)
- Let $F_n$ be CDFs with mgfs $M_n$, and let $F$ be a CDF with mgf $M$. If $M_n(t) \to M(t)$ for all $t$ in an open interval containing $0$, then $F_n(x) \to F(x)$ at all continuity points of $F$.
- Application: Since the CDF of $N(0,1)$ is continuous everywhere, it suffices to show that the mgf of the standardized sum $Z_n = (S_n - n\mu)/(\sigma\sqrt{n})$ converges to the mgf of $N(0,1)$.
Proof sketch of the CLT via mgf (standardized version)
- Aim: Prove the CLT in the standardized case $\mu = 0$, $\sigma = 1$.
- Define $Z_n = S_n/\sqrt{n}$. Since $S_n$ is a sum of independent variables, its mgf is $M_{S_n}(t) = [M_X(t)]^n$.
- For $Z_n$, the mgf is $M_{Z_n}(t) = M_{S_n}(t/\sqrt{n}) = \left[M_X(t/\sqrt{n})\right]^n$.
- Expand $M_X(s)$ around $s = 0$:
$$M_X(s) = 1 + M_X'(0)\, s + \tfrac{1}{2} M_X''(0)\, s^2 + o(s^2) \quad \text{as } s \to 0.$$
- With $\mathbb{E}(X) = 0$, we have $M_X'(0) = 0$, and with standardization $M_X''(0) = \mathbb{E}(X^2) = 1$; hence $M_X(s) = 1 + \tfrac{s^2}{2} + o(s^2)$.
- Substitute $s = t/\sqrt{n}$:
$$M_{Z_n}(t) = \left(1 + \frac{t^2}{2n} + o\!\left(\frac{1}{n}\right)\right)^n \longrightarrow e^{t^2/2} \quad \text{as } n \to \infty.$$
- The limit mgf $e^{t^2/2}$ is the mgf of $N(0,1)$, so by the Continuity Theorem $Z_n$ converges in distribution to $N(0,1)$.
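The key limit in the last substitution step, $(1 + t^2/2n)^n \to e^{t^2/2}$, is the standard exponential limit and is easy to see numerically. A minimal sketch with the illustrative value $t = 1.2$:

```python
import math

t = 1.2
target = math.exp(t * t / 2)  # the limit mgf e^{t^2/2}
for n in (10, 100, 10_000, 1_000_000):
    # M_{Z_n}(t) with the o(1/n) remainder dropped
    approx = (1 + t * t / (2 * n)) ** n
    print(n, round(approx, 6))
print("limit", round(target, 6))
```

The approximants increase toward $e^{t^2/2}$ as $n$ grows, mirroring the mgf convergence used in the proof.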
Practical takeaways and exam relevance
- The CLT explains why sums of many i.i.d. finite-variance variables tend to look normal, regardless of the original distribution, once properly centered and scaled.
- MGFs provide a powerful, compact way to compute moments and to prove distributional convergence via the Continuity Theorem.
- The mgf approach also yields closed-form results for sums of independent Poisson and normal variables, and shows how transformation and independence interact in distributional properties.
Endnotes from the slides
- The slides emphasize that the mgf method is a tool for theory and for later courses; some mgf topics are not covered on the final exam.
- The presented versions of CLT use mgf existence as a condition and acknowledge multiple versions exist.
- The Continuity Theorem connects mgf convergence to distribution convergence and is a key step in mgf-based proofs of CLT.
Source alignment (slide references):
- WLLN and proof via Chebyshev (Page 2)
- SLLN statement (Page 3)
- CLT normalization and statement (Pages 4–5)
- CLT examples with dice and uniform variables (Pages 6–7)
- MGFs: definition and basic properties (Pages 8–10)
- MGFs of Binomial, Poisson, Exponential, Normal (Pages 11–20)
- Sum of independent variables and examples (Pages 21–23)
- Remarks on mgf, joint mgf, and independence (Page 24)
- CLT recall, Continuity Theorem, and mgf-based proofs (Pages 25–28)
- Final remarks about mgf applications and exam scope (Pages 29–31)