Engineering Statistics - Continuous Random Variables
Continuous Random Variables
Definition of Continuous Random Variable
- A random variable $X$ is called continuous if it can take any value in an interval of real numbers, so its set of possible values is neither finite nor countably infinite.
- Examples include:
- Time spent in traffic
- Body temperature of a patient
- Velocity of an atom
Probability Density Function (PDF) and Cumulative Distribution Function (CDF)
- The PDF $f(x)$ defines the probability density of the continuous variable.
- The CDF $F(x)$ can be expressed using the PDF:
F(x) = P(X \leq x) = \int_{-\infty}^x f(v) \, dv
- Key properties of the PDF and CDF:
- $F(x)$ is a non-decreasing function.
- The normalization condition must be satisfied:
\int_{-\infty}^{\infty} f(v) \, dv = 1
- The derivative of the CDF gives the PDF:
f(x) = \frac{dF}{dx}
- The probability of an interval $[a,b]$ can be computed from the CDF:
P(a \leq X \leq b) = F(b) - F(a)
- For any specific point, the probability is zero:
P(X = b) = 0
- The $p$th percentile, denoted $\eta_p$, is the value of $X$ such that:
P(X \leq \eta_p) = p
- The median (50th percentile) is the value $\eta_{0.5}$ satisfying $P(X \leq \eta_{0.5}) = 0.5$ (see the numerical sketch after this list).
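A minimal numerical sketch of these definitions, assuming (purely for illustration) an exponential density $f(x) = 2e^{-2x}$ for $x \geq 0$; the rate 2, the integration bounds, and the root-finding bracket are illustrative choices, not part of the notes above.

```python
# Sketch of the PDF/CDF relationships, assuming f(x) = 2*exp(-2x) on x >= 0.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def f(x):
    """Assumed PDF: exponential with rate lambda = 2 (illustration only)."""
    return 2.0 * np.exp(-2.0 * x) if x >= 0 else 0.0

def F(x):
    """CDF obtained by integrating the PDF up to x (support starts at 0)."""
    val, _ = quad(f, 0.0, x)
    return val

# Normalization: the PDF must integrate to 1 over its support.
total, _ = quad(f, 0.0, np.inf)
print("normalization:", total)              # ~1.0

# Interval probability: P(a <= X <= b) = F(b) - F(a).
a, b = 0.5, 1.5
print("P(0.5 <= X <= 1.5):", F(b) - F(a))

# p-th percentile eta_p solves F(eta_p) = p (root search on F(x) - p).
p = 0.5
eta_p = brentq(lambda x: F(x) - p, 0.0, 50.0)
print("median eta_0.5:", eta_p)             # ~ln(2)/2 ≈ 0.3466
```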
Mean and Variance of a Continuous Random Variable
- Mean $\mu_X$:
\mu_X = E[X] = \int_{-\infty}^{\infty} x f(x) \, dx
- Variance $\sigma_X^2$ (both forms are checked numerically in the sketch after this list):
\sigma_X^2 = V[X] = E[X^2] - (E[X])^2 = \int_{-\infty}^{\infty} (x - \mu_X)^2 f(x) \, dx
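A short sketch of these two integrals, assuming a Uniform$(a,b)$ density with the arbitrary endpoints $a = 2$, $b = 5$; the numerical results are checked against the closed forms $(a+b)/2$ and $(b-a)^2/12$.

```python
# Mean and variance by numerical integration for an assumed Uniform(a, b) density.
import numpy as np
from scipy.integrate import quad

a, b = 2.0, 5.0                       # assumed interval endpoints (illustration only)
f = lambda x: 1.0 / (b - a)           # constant PDF on [a, b]

mean, _ = quad(lambda x: x * f(x), a, b)        # E[X] = ∫ x f(x) dx
ex2, _  = quad(lambda x: x**2 * f(x), a, b)     # E[X^2]
var = ex2 - mean**2                             # V[X] = E[X^2] - (E[X])^2

print(mean, (a + b) / 2)              # both 3.5
print(var, (b - a) ** 2 / 12)         # both 0.75
```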
Common Continuous Distributions
- Uniform Distribution: PDF is constant over the range $[a,b]$ (each PDF in this list is checked against `scipy.stats` in the sketch that follows):
f(x) = \begin{cases} \frac{1}{b-a}, & a \leq x \leq b \\ 0, & \text{otherwise} \end{cases}
- Normal Distribution: Bell-shaped PDF given by:
f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{(x - \mu)^2}{2\sigma^2}}
where $\mu$ is the mean and $\sigma$ is the standard deviation.
- Lognormal Distribution: PDF:
f(x) = \frac{1}{x \sigma \sqrt{2\pi}} e^{-\frac{(\ln x - \mu)^2}{2\sigma^2}}
defined only for $x > 0$.
- Exponential Distribution: PDF:
f(x) = \lambda e^{-\lambda x}, \; x \geq 0
where $\lambda$ is the rate parameter.
- Gamma Distribution: PDF for shape $\alpha$ and scale $\beta$:
f(x) = \frac{1}{\Gamma(\alpha) \beta^{\alpha}} x^{\alpha - 1} e^{-\frac{x}{\beta}}, \; x \geq 0
- Weibull Distribution: PDF for shape $\alpha$ and scale $\beta$:
f(x) = \frac{\alpha}{\beta} \left(\frac{x}{\beta}\right)^{\alpha - 1} e^{-\left(\frac{x}{\beta}\right)^{\alpha}} , \; x \geq 0
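The sketch below maps each of the six PDFs above onto the corresponding `scipy.stats` parameterization and evaluates both forms at a single point; all parameter values ($x$, $\mu$, $\sigma$, $\lambda$, $\alpha$, $\beta$, and the uniform endpoints) are arbitrary choices made only for this illustration.

```python
# Each listed PDF evaluated via scipy.stats and via the formula from the notes.
import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_fn

x = 1.3
a_, b_ = 0.0, 4.0          # uniform endpoints (assumed)
mu, sigma = 0.5, 0.8       # normal / lognormal parameters (assumed)
lam = 2.0                  # exponential rate (assumed)
alpha, beta = 2.5, 1.5     # gamma / Weibull shape and scale (assumed)

pairs = [
    # (name, scipy.stats value, formula from the notes)
    ("uniform",
     stats.uniform(loc=a_, scale=b_ - a_).pdf(x),
     1.0 / (b_ - a_)),
    ("normal",
     stats.norm(loc=mu, scale=sigma).pdf(x),
     np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)),
    ("lognormal",   # scipy: s = sigma, scale = exp(mu)
     stats.lognorm(s=sigma, scale=np.exp(mu)).pdf(x),
     np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma**2)) / (x * sigma * np.sqrt(2 * np.pi))),
    ("exponential", # scipy: scale = 1/lambda
     stats.expon(scale=1.0 / lam).pdf(x),
     lam * np.exp(-lam * x)),
    ("gamma",
     stats.gamma(a=alpha, scale=beta).pdf(x),
     x ** (alpha - 1) * np.exp(-x / beta) / (gamma_fn(alpha) * beta**alpha)),
    ("weibull",
     stats.weibull_min(c=alpha, scale=beta).pdf(x),
     (alpha / beta) * (x / beta) ** (alpha - 1) * np.exp(-(x / beta) ** alpha)),
]

for name, scipy_val, formula_val in pairs:
    print(f"{name:12s} scipy={scipy_val:.6f} formula={formula_val:.6f}")
```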
Function of a Random Variable
- If $Y = h(X)$, where $h$ is strictly monotone and differentiable:
- The PDF of $Y$ can be derived from the transformation:
f_Y(y) = f_X\!\left(h^{-1}(y)\right) \cdot \left| \frac{d}{dy} h^{-1}(y) \right| = \frac{f_X\!\left(h^{-1}(y)\right)}{\left| h'\!\left(h^{-1}(y)\right) \right|}
- The mean and variance of $Y$ can be computed directly from the distribution of $X$ (see the sketch after this list):
- $E[Y] = E[h(X)] = \int_{-\infty}^{\infty} h(x) f_X(x) \, dx$ and $V[Y] = E[h(X)^2] - \left(E[h(X)]\right)^2$.
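A hedged sketch of the change-of-variable formula for a monotone transformation, using the assumed example $Y = h(X) = e^X$ with $X \sim N(\mu, \sigma^2)$; by the formula above, $f_Y(y) = f_X(\ln y)/y$, which should coincide with the lognormal PDF listed earlier. The parameter values are illustrative.

```python
# Change-of-variable check for Y = exp(X), X ~ Normal(mu, sigma^2).
import numpy as np
from scipy import stats

mu, sigma = 0.5, 0.8                    # assumed parameters (illustration only)
f_X = stats.norm(loc=mu, scale=sigma).pdf

h_inv = np.log                          # h^{-1}(y) = ln y
dh_inv = lambda y: 1.0 / y              # |d/dy h^{-1}(y)| = 1/y for y > 0

def f_Y(y):
    """PDF of Y via f_Y(y) = f_X(h^{-1}(y)) * |d h^{-1}/dy|."""
    return f_X(h_inv(y)) * np.abs(dh_inv(y))

y = 2.0
print(f_Y(y))                                             # change-of-variable result
print(stats.lognorm(s=sigma, scale=np.exp(mu)).pdf(y))    # matches the lognormal PDF
```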
Probability Plots
- Useful for checking whether a sample is consistent with a hypothesized distribution by plotting the ordered sample values (empirical percentiles) against the corresponding theoretical percentiles (see the sketch below).
- If the points fall approximately on a straight line, the distributional assumption is reasonable; substantial deviations from linearity indicate a poor fit.
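A minimal probability-plot sketch using `scipy.stats.probplot` on synthetic data; the normal model, sample size, and parameters are assumptions made only for this example.

```python
# Normal probability plot for a synthetic sample.
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
sample = rng.normal(loc=10.0, scale=2.0, size=50)   # assumed sample (illustration only)

# probplot pairs theoretical quantiles with ordered sample values and fits a line;
# it returns ((theoretical quantiles, ordered sample), (slope, intercept, r)).
res = stats.probplot(sample, dist="norm", plot=plt.gca())
print("correlation of the fitted line r =", res[1][2])  # close to 1 for a good fit
plt.show()
```

The correlation `r` of the fitted line gives a quick numerical summary of how straight the plot is, complementing the visual check described above.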