Flashcards for Engineering Statistics Lecture Notes
Chapter 6: Point Estimation
6.1 Motivation: Point and Interval Estimation
Major Problem of Statistics: Estimation of a parameter from an experiment involving a random variable $X$.
We have $N$ measurements resulting in a data sample $x_1, x_2, \ldots, x_N$.
We aim to estimate a parameter $\theta$ (e.g., the mean of $X$) characterizing the population, which we can only approximate with a function of the data:
\hat{\theta} = f(x_1, x_2, \ldots, x_N) \approx \theta.
The process of generating this function and computing values based on data is called estimation.
6.1.1 Point and Interval Estimation
Estimation can be performed as:
Point Estimation: Obtaining a single value $\hat{\theta}$ as the best approximation of $\theta$.
Interval Estimation: Finding an interval $[\hat{\theta}_{P_1}, \hat{\theta}_{P_2}]$ in which the true value $\theta$ is expected to lie with probability $P$.
Both methods yield approximate results, and characterizing their accuracy is essential; the sketch below contrasts the two.
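A minimal sketch of the contrast (assuming Python with NumPy and SciPy; the data values and the 95% level are illustrative assumptions, not from the notes). The point estimate is a single number; the interval estimate is a range meant to contain $\theta$ with probability $P$:

```python
import numpy as np
from scipy import stats

# Illustrative data sample (hypothetical values)
x = np.array([1.7, 2.3, 0.8, 1.9, 1.2])
N = len(x)

# Point estimate of the mean: a single value
theta_hat = x.mean()

# Interval estimate: a t-based 95% confidence interval for the mean
P = 0.95
t_crit = stats.t.ppf((1 + P) / 2, df=N - 1)
half_width = t_crit * x.std(ddof=1) / np.sqrt(N)

print(f"point estimate:    {theta_hat:.3f}")
print(f"interval estimate: [{theta_hat - half_width:.3f}, {theta_hat + half_width:.3f}]")
```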
6.2 Point Estimators and Estimates
Concepts:
Estimand: The parameter we want to estimate ($\theta$).
Point Estimator: A statistical rule $\hat{\theta} = g(x_1, x_2, \ldots, x_N)$ that provides an estimate from the sample.
Point Estimate: The computed value from the estimator using a specific data sample.
Error and Bias: The error of the point estimator is:
e_{\hat{\theta}} = \hat{\theta} - \theta;
the bias is given by:
B(\hat{\theta}) = E[\hat{\theta}] - \theta.
An estimator is unbiased if $B(\hat{\theta}) = 0$.
Minimum Variance: Among all unbiased estimators, the Minimum Variance Unbiased Estimator (MVUE) is the one with the smallest variance $V(\hat{\theta})$. Estimators with lower variance produce estimates that scatter less from sample to sample; see the simulation sketch below.
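A minimal simulation sketch (assuming Python with NumPy; the normal distribution, sample size, and trial count are illustrative choices, not from the notes). For a normal population, both the sample mean and the sample median are unbiased for $\mu$, but the sample mean has the smaller variance (it is the MVUE for a normal mean):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, N, trials = 5.0, 2.0, 25, 10_000

# Draw many independent samples and apply both estimators to each
samples = rng.normal(mu, sigma, size=(trials, N))
means = samples.mean(axis=1)           # estimator 1: sample mean
medians = np.median(samples, axis=1)   # estimator 2: sample median

print(f"bias of mean:       {means.mean() - mu:+.4f}")    # ~0 (unbiased)
print(f"bias of median:     {medians.mean() - mu:+.4f}")  # ~0 (unbiased)
print(f"variance of mean:   {means.var():.4f}")    # ~ sigma^2 / N = 0.16
print(f"variance of median: {medians.var():.4f}")  # larger (~ pi/2 times)
```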
6.2.1 Example of Estimation
Consider a random variable $X$ (e.g., device lifetime) with a data sample of lifetimes $x_1 = 1.7$, $x_2 = 2.3$, and $x_3 = 0.8$ years:
The average estimate is:
\hat{\theta} = \frac{x_1 + x_2 + x_3}{3} = \frac{1.7 + 2.3 + 0.8}{3} = 1.6 \text{ years}.
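A minimal sketch (assuming Python with NumPy) verifying this value and showing that the same sample admits other estimators, e.g. the sample median, which generally yields a different point estimate:

```python
import numpy as np

x = np.array([1.7, 2.3, 0.8])  # lifetimes in years, from the example above
print(np.mean(x))    # 1.6  (the average estimate)
print(np.median(x))  # 1.7  (another possible estimator of the center)
```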
6.3 Estimators for the Mean and Variance
Estimators for Mean:
Proposition: For a random sample $X_1, X_2, \ldots, X_N$ from a population with mean $\mu$, the sample mean is an unbiased estimator of $\mu$:
\hat{\mu} = \bar{X} = \frac{1}{N} \sum_{i=1}^{N} X_i.
Estimators for Variance:
Proposition: The unbiased estimator of the population variance $\sigma^2$ is (a code sketch follows):
\hat{\sigma}^2 = \frac{1}{N - 1} \sum_{i=1}^{N} (X_i - \bar{X})^2.
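A minimal sketch of both estimators (assuming Python with NumPy; the data are illustrative). Note that NumPy's `np.var` divides by $N$ by default, which is the biased estimator; `ddof=1` selects the unbiased $N - 1$ denominator:

```python
import numpy as np

x = np.array([1.7, 2.3, 0.8])  # illustrative sample

x_bar = x.mean()             # unbiased estimator of the mean
s2_unbiased = x.var(ddof=1)  # divides by N - 1 (unbiased)
s2_biased = x.var()          # divides by N (biased, NumPy's default)

# Equivalent explicit form of the unbiased estimator
s2_manual = ((x - x_bar) ** 2).sum() / (len(x) - 1)

print(x_bar, s2_unbiased, s2_biased, s2_manual)
```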
6.4 Methods of Point Estimation
Formulation of the Problem:
For an experiment with PDF $f(x; \theta_1, \theta_2, \ldots)$, the objective is to estimate the parameters from the data sample.
Method of Moments: Equates sample moments to the corresponding theoretical moments of the assumed distribution and solves for the parameters.
Maximum Likelihood Estimation (MLE): Chooses the parameter values that maximize the likelihood function of the observed data sample (see the sketch below).
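A minimal sketch of both methods (assuming Python with SciPy; the exponential model is an illustrative assumption, reusing the lifetime data from the example above). For an exponential PDF $f(x; \lambda) = \lambda e^{-\lambda x}$, the method of moments equates the sample mean $\bar{x}$ to the theoretical mean $1/\lambda$, giving $\hat{\lambda} = 1/\bar{x}$; maximizing the likelihood numerically recovers the same value, since for this model the MLE is also $1/\bar{x}$:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative lifetime data (exponential model assumed)
x = np.array([1.7, 2.3, 0.8])

# Method of moments: match E[X] = 1/lambda to the sample mean
lam_mom = 1.0 / x.mean()

# Maximum likelihood: minimize the negative log-likelihood
# -log L(lambda) = -N*log(lambda) + lambda * sum(x_i)
def neg_log_likelihood(lam):
    return -len(x) * np.log(lam) + lam * x.sum()

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0), method="bounded")
lam_mle = res.x

print(f"method of moments:  lambda = {lam_mom:.4f}")
print(f"maximum likelihood: lambda = {lam_mle:.4f}")  # agrees with 1/x.mean()
```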
Notes
Different estimators can be constructed from the same data set, and their efficiency can vary. Choose the unbiased estimator with the smallest variance whenever possible.
Because no single estimator is best in every situation, selection must be made carefully based on the assumed distribution and the sample data.