Chapter 399: Point Estimation


Description and Tags

Flashcards covering point estimation, properties of estimators, methods for finding estimators, and methods for evaluating point estimators.


26 Terms

1

Point Estimation

A statistical method used to provide a single best guess or estimate of an unknown population parameter based on sample data.

2

Point Estimator

Any function W(X1, X2, …, Xn) of a sample; that is, any statistic is a point estimator. The value obtained from a point estimator is called a point estimate.

3

Estimator

A function or rule used to calculate an estimate from the sample; it is denoted by a statistic (e.g., θ̂).

4

Estimate

The actual computed value obtained from the estimator using sample data.

5

Sample Mean (X̄)

Estimates the population mean (μ).

6

Sample Proportion (p̂)

Estimates the population proportion (p).

7

Sample Variance (s²)

Estimates the population variance (σ²).

8

Sample Standard Deviation (s)

Estimates the population standard deviation (σ).
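The four sample statistics above can be computed directly with Python's standard library. A minimal sketch; the sample values and the 5.0 cutoff defining "success" for the proportion are made-up illustrations, not from the cards:

```python
import statistics

# Hypothetical sample of 10 measurements (illustrative data only)
sample = [4.8, 5.1, 5.0, 4.9, 5.3, 5.2, 4.7, 5.0, 5.1, 4.9]

x_bar = statistics.mean(sample)    # point estimate of the population mean mu
s2 = statistics.variance(sample)   # sample variance s^2 (n-1 in the denominator)
s = statistics.stdev(sample)       # sample standard deviation s

# Sample proportion: fraction of observations above an assumed cutoff of 5.0
p_hat = sum(x > 5.0 for x in sample) / len(sample)
```

Each of these is a point estimate: a single number computed from the sample that stands in for the unknown population parameter.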

9

Unbiasedness

The expected value of the estimator equals the true parameter value, E(θ̂) = θ.
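Unbiasedness can be checked by simulation: averaging many realizations of s² (with the n−1 divisor) recovers σ², while the n divisor systematically underestimates it. A sketch under assumed values (σ² = 4, n = 5 are arbitrary choices for illustration):

```python
import random

random.seed(0)
sigma2 = 4.0        # assumed true population variance
n, reps = 5, 200_000

def var(xs, ddof):
    """Sample variance with divisor len(xs) - ddof."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - ddof)

unbiased = biased = 0.0
for _ in range(reps):
    xs = [random.gauss(0.0, 2.0) for _ in range(n)]
    unbiased += var(xs, 1)   # n-1 divisor: E[s^2] = sigma^2
    biased += var(xs, 0)     # n divisor: E = (n-1)/n * sigma^2

print(unbiased / reps)  # close to 4.0
print(biased / reps)    # close to (n-1)/n * 4.0 = 3.2
```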

10

Consistency

As the sample size increases, the estimator converges in probability to the true parameter value.

11

Efficiency

The estimator has the smallest possible variance among all unbiased estimators.

12

Sufficiency

The estimator captures all necessary information from the sample regarding the parameter.

13

Method of Moments Estimators

Estimate parameters by equating sample moments to population moments.
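As a worked sketch of the method of moments for an Exponential(λ) model: the population first moment is E[X] = 1/λ, so equating it to the sample mean x̄ gives λ̂ = 1/x̄. The true rate below is an assumption chosen for the simulation:

```python
import random

random.seed(1)
true_rate = 2.0   # assumed true rate parameter, for illustration
data = [random.expovariate(true_rate) for _ in range(100_000)]

# Equate the first sample moment to the first population moment 1/lambda
x_bar = sum(data) / len(data)
rate_mom = 1.0 / x_bar   # method-of-moments estimate of lambda
```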

14

Maximum Likelihood Estimators (MLE)

Find the parameter value that maximizes the likelihood function based on observed data.

15

likelihood function

L(θ|x) = L(θ1, …, θk | x1, …, xn) = ∏ᵢ₌₁ⁿ f(xᵢ | θ1, …, θk)
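The MLE maximizes this product (equivalently, the log-likelihood sum) over θ. A minimal sketch for Bernoulli(p) data, using a grid search and made-up 0/1 observations; the result should match the closed-form MLE p̂ = (successes)/n:

```python
import math

data = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]  # hypothetical 0/1 observations

def log_likelihood(p, xs):
    # log of the product of Bernoulli pmfs: sum of log f(x_i | p)
    return sum(math.log(p if x == 1 else 1 - p) for x in xs)

# Maximize over a grid of candidate p values in (0, 1)
grid = [i / 1000 for i in range(1, 1000)]
p_mle = max(grid, key=lambda p: log_likelihood(p, data))
```

Here the log-likelihood is strictly concave in p, so the grid maximizer lands on the closed-form value 7/10.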

16

Bayesian Estimators

Use prior distributions and observed data to determine posterior distributions of parameters.

17

Prior Distribution

A subjective distribution, based on the experimenter's belief, and is formulated before the data are seen.

18

Posterior distribution

The prior updated with the sample information, given by π(θ|x) = f(x|θ)π(θ) / m(x).

19

joint distribution

The joint distribution of the sample and the parameter: f(x, θ) = f(x|θ)π(θ).

20

marginal distribution

m(x) = ∫ f(x|θ)π(θ) dθ if θ is continuous, or m(x) = Σ f(x|θ)π(θ) if θ is discrete.

21

Conjugate Family

A class Π of prior distributions is a conjugate family for a class F of sampling distributions if the posterior distribution is in the class Π for all f ∈ F, all priors in Π, and all x ∈ 𝒳.
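The standard textbook example of conjugacy: a Beta(a, b) prior with Bernoulli data yields a Beta(a + successes, b + failures) posterior, so the posterior stays in the Beta class. A sketch with assumed hyperparameters and made-up data:

```python
a, b = 2.0, 2.0                 # assumed prior hyperparameters (illustrative)
data = [1, 1, 0, 1, 0, 1, 1]    # hypothetical 0/1 observations

# Conjugate update: Beta(a, b) prior -> Beta(a + s, b + f) posterior
successes = sum(data)
failures = len(data) - successes
a_post, b_post = a + successes, b + failures

# Posterior mean of p, a common Bayes point estimate
posterior_mean = a_post / (a_post + b_post)
```

No integration is needed to find m(x) here; that closed-form update is exactly why conjugate families are convenient.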

22

Mean Squared Error (MSE)

A function of θ defined by MSE(W) = Eθ[(W-θ)²] = Varθ(W) + (Biasθ(W))²
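The decomposition MSE = variance + squared bias can be checked numerically. A sketch using a deliberately biased estimator W = x̄ + 0.5 of a mean θ = 0 (the shift, θ, and n are all assumptions for illustration):

```python
import random

random.seed(2)
theta, n, reps = 0.0, 10, 100_000
ws = []
for _ in range(reps):
    xs = [random.gauss(theta, 1.0) for _ in range(n)]
    ws.append(sum(xs) / n + 0.5)   # biased estimator: shifts x-bar by 0.5

mean_w = sum(ws) / reps
mse = sum((w - theta) ** 2 for w in ws) / reps          # E[(W - theta)^2]
var_w = sum((w - mean_w) ** 2 for w in ws) / reps       # Var(W)
bias = mean_w - theta                                   # Bias(W), about 0.5
# mse equals var_w + bias**2 (the decomposition holds term by term)
```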

23

Bias

The difference between the expected value of W and θ: Bias(W) = Eθ[W] - θ

24

Best Unbiased Estimator

An estimator W* that satisfies EθW* = τ(θ) for all θ and, for any other estimator W with EθW = τ(θ), VarθW* ≤ VarθW for all θ. Also called a uniform minimum variance unbiased estimator (UMVUE).

25

Cramér-Rao Inequality

Let X1, …, Xn be a sample with pdf f(x|θ), and let W(X) = W(X1, …, Xn) be any estimator satisfying d/dθ EθW(X) = ∫ ∂/∂θ [W(x) f(x|θ)] dx and VarθW(X) < ∞. Then VarθW(X) ≥ (d/dθ EθW(X))² / Eθ((∂/∂θ log f(X|θ))²).
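For N(θ, σ²) data, the bound for unbiased estimators of θ works out to σ²/n, and the sample mean attains it. A simulation sketch under assumed values (σ² = 1, n = 8 are arbitrary choices):

```python
import random

random.seed(3)
sigma2, n, reps = 1.0, 8, 200_000
means = []
for _ in range(reps):
    xs = [random.gauss(0.0, 1.0) for _ in range(n)]
    means.append(sum(xs) / n)   # the sample mean, an unbiased estimator

m = sum(means) / reps
var_xbar = sum((x - m) ** 2 for x in means) / reps
crlb = sigma2 / n   # Cramer-Rao lower bound = 0.125 here
# var_xbar matches crlb: the sample mean is efficient for the normal mean
```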

26

Rao-Blackwell Theorem

Let W be any unbiased estimator of τ(θ), and let T be a sufficient statistic for θ. Define φ(T) = E(W|T). Then Eθφ(T) = τ(θ) and Varθφ(T) ≤ VarθW for all θ; that is, φ(T) is a uniformly better unbiased estimator of τ(θ).