Memoryless property: the Exponential distribution is memoryless: $\Pr(T \ge t + s \mid T \ge s) = \Pr(T \ge t) = e^{-\lambda t}$.
Practical note: Not a good model for human lifetime (memorylessness implies equal remaining life for people at different ages).
Relation to Poisson process: If events occur as a Poisson process with rate $\lambda$, then the time until the next event T is Exp($\lambda$), with tail $\Pr(T > t) = e^{-\lambda t}$.
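The memoryless property can be checked by simulation: among exponential samples that survive past $s$, the fraction that also survive past $t+s$ should match the unconditional tail $e^{-\lambda t}$. A minimal sketch using only the standard library; the values of $\lambda$, $s$, $t$ are illustrative.

```python
import math
import random

random.seed(0)
lam, s, t, n = 1.5, 0.8, 1.2, 200_000

samples = [random.expovariate(lam) for _ in range(n)]

# Conditional tail Pr(T > t + s | T > s): among samples exceeding s,
# the fraction that also exceed t + s.
survivors = [x for x in samples if x > s]
cond = sum(x > t + s for x in survivors) / len(survivors)

# Memorylessness predicts this matches the unconditional tail e^{-lambda t}.
print(round(cond, 3), round(math.exp(-lam * t), 3))
```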
Gamma Distribution
Motivation: waiting time until the $n$-th event in a Poisson process. Let $T_n$ be this waiting time.
CDF: $F_{T_n}(t) = \Pr(T_n \le t) = \Pr(\text{at least } n \text{ events in } [0, t]) = \sum_{k=n}^{\infty} \frac{(\lambda t)^k}{k!}\, e^{-\lambda t} = 1 - \sum_{k=0}^{n-1} \frac{(\lambda t)^k}{k!}\, e^{-\lambda t}.$
PDF (Gamma density): for shape $\alpha > 0$ and rate $\lambda > 0$,
g(t) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} t^{\alpha - 1} e^{-\lambda t}, \quad t > 0.
When $\alpha = 1$, Gamma reduces to the Exponential distribution: $\mathrm{Exp}(\lambda) = \Gamma(1, \lambda)$.
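For integer shape, the Poisson tail formula for the CDF and the Gamma density describe the same distribution; this can be verified numerically by integrating the density up to $t$ and comparing with the tail sum. A sketch with illustrative parameters, using a simple trapezoid rule:

```python
import math

lam, n, t = 2.0, 3, 1.5

# CDF via the Poisson tail identity: 1 - sum_{k=0}^{n-1} (lam t)^k / k! * e^{-lam t}
cdf_sum = 1 - sum((lam * t) ** k / math.factorial(k) for k in range(n)) * math.exp(-lam * t)

# Gamma(n, lam) density
def g(x):
    return lam ** n / math.gamma(n) * x ** (n - 1) * math.exp(-lam * x)

# CDF via numerical integration of the density on [0, t] (trapezoid rule)
m = 100_000
h = t / m
cdf_int = h * (sum(g(i * h) for i in range(1, m)) + (g(0) + g(t)) / 2)

print(round(cdf_sum, 6), round(cdf_int, 6))
```

Both routes should agree to several decimal places, confirming that the density integrates to the tail-sum CDF.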
Interpretation: for a positive integer $\alpha = n$, the waiting time until the $n$-th event is Gamma$(n, \lambda)$; equivalently, the Gamma variable is then the sum of $n$ i.i.d. Exp($\lambda$) variables (this sum interpretation holds only for integer shape).
Applications: modeling arrival times in Poisson processes, time to failures, positive-valued quantities (rainfalls), and financial quantities like insurance claim sizes.
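The sum-of-exponentials interpretation can be tested directly: the empirical probability that a sum of $n$ i.i.d. Exp($\lambda$) draws falls below $t$ should match the Gamma$(n, \lambda)$ CDF computed from the Poisson tail identity. A simulation sketch with illustrative parameter values:

```python
import math
import random

random.seed(1)
lam, n, t, trials = 2.0, 4, 2.5, 100_000

# Empirical Pr(S <= t), where S is a sum of n i.i.d. Exp(lam) draws
hits = sum(
    sum(random.expovariate(lam) for _ in range(n)) <= t
    for _ in range(trials)
)
empirical = hits / trials

# Exact Gamma(n, lam) CDF via the Poisson tail identity
exact = 1 - sum((lam * t) ** k / math.factorial(k) for k in range(n)) * math.exp(-lam * t)

print(round(empirical, 3), round(exact, 3))
```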
Gamma function connection: The gamma function is defined as
\Gamma(x) = \int_{0}^{\infty} u^{x-1} e^{-u} \, du, \quad x > 0.
It satisfies the recursion $\Gamma(x) = (x - 1)\,\Gamma(x - 1)$.
For positive integers, $\Gamma(n) = (n - 1)!$.
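Both properties of the gamma function can be spot-checked with the standard library's `math.gamma`; the non-integer test point below is arbitrary.

```python
import math

# Recursion Gamma(x) = (x - 1) Gamma(x - 1), checked at a non-integer point
x = 3.7
assert math.isclose(math.gamma(x), (x - 1) * math.gamma(x - 1))

# For positive integers, Gamma(n) = (n - 1)!
for n in range(1, 10):
    assert math.isclose(math.gamma(n), math.factorial(n - 1))

print("ok")
```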
Mean and variance of Gamma$(\alpha, \lambda)$:
E[X] = \frac{\alpha}{\lambda}, \qquad \operatorname{Var}(X) = \frac{\alpha}{\lambda^{2}}.
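These moments are easy to confirm by simulation. Note that `random.gammavariate` is parameterized by shape and *scale* $\beta = 1/\lambda$, not rate; the parameter values below are illustrative.

```python
import random
from statistics import fmean, pvariance

random.seed(2)
alpha, lam, n = 3.0, 2.0, 200_000

# random.gammavariate takes (shape, scale); scale = 1 / rate
draws = [random.gammavariate(alpha, 1 / lam) for _ in range(n)]

print(round(fmean(draws), 2), alpha / lam)           # mean ~ alpha / lambda
print(round(pvariance(draws), 2), alpha / lam ** 2)  # variance ~ alpha / lambda^2
```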
Important special cases and interpretations:
If $X \sim \Gamma(n, \lambda)$ with integer $n$, then $X$ is the sum of $n$ i.i.d. Exp($\lambda$) variables.
The Gamma distribution generalizes the exponential distribution; when $\alpha=1$ we recover Exp($\lambda$).
Example in seismology: times between microearthquakes may be modeled with a Gamma rather than an Exponential distribution; the Gamma often provides a better fit when inter-event times exhibit variability beyond the Poisson assumptions.
Additional notes on Gamma and related concepts
If $X \sim \Gamma(\alpha, \lambda)$, then the case $\alpha = n$ (a positive integer) corresponds to the sum of $n$ independent Exp($\lambda$) random variables.
The gamma function appears in the normalization constant of the gamma density; it generalizes the factorial to non-integer values.
In practice, the gamma distribution is used for positive-valued data with skewness controlled by the shape parameter $\alpha$; larger $\alpha$ yields more symmetric (approximately normal) shapes when scaled appropriately.
Quick recap and connections to foundational principles
Linearity of expectation: the expectation of a sum is the sum of expectations, $E[aX + bY] = a\,E[X] + b\,E[Y]$, regardless of any dependence between $X$ and $Y$.
Integral representations and Fubini/Tonelli theorem allow swapping order of integration, enabling tail-integral representations.
Normal approximation to binomial relies on central limit intuition: binomial counts behave like normal distributions under appropriate scaling.
Exponential distribution embodies memoryless property, linking to Poisson processes and waiting-time problems.
Gamma distribution connects to Poisson processes as the waiting time to the n-th event; its properties extend the exponential case to sums of independent exponentials.
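The normal approximation to the binomial mentioned in the recap can be checked numerically: for moderate $n$ and $p$, the exact binomial CDF and the continuity-corrected normal approximation $N(np,\, np(1-p))$ are close. A sketch with illustrative values of $n$, $p$, $k$:

```python
import math

n, p, k = 100, 0.3, 35

# Exact binomial Pr(X <= k)
exact = sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

# Normal approximation with continuity correction: X ~ N(np, np(1-p))
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
z = (k + 0.5 - mu) / sigma
approx = 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(round(exact, 3), round(approx, 3))
```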
Key formulas to remember (LaTeX)
Expectation of a transformation: $E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx.$
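As a worked instance of this formula, take $g(x) = x^2$ and $X \sim \mathrm{Exp}(\lambda)$, for which the exact answer is $E[X^2] = \operatorname{Var}(X) + (E[X])^2 = 2/\lambda^2$. A numerical-integration sketch (the value of $\lambda$ and the truncation point are illustrative; the exponential tail beyond it is negligible):

```python
import math

lam = 1.5

# Integrand g(x) f(x) with g(x) = x^2 and f the Exp(lam) density
def integrand(x):
    return x ** 2 * lam * math.exp(-lam * x)

# Trapezoid rule on [0, 40]; the tail beyond 40 is negligible here
a, b, m = 0.0, 40.0, 400_000
h = (b - a) / m
approx = h * (sum(integrand(a + i * h) for i in range(1, m)) + (integrand(a) + integrand(b)) / 2)

print(round(approx, 4), round(2 / lam ** 2, 4))  # both ~ 2 / lambda^2
```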