14 - Statistical Inference
Parameter Estimation
Focus on methods for inferring model parameters from observed data.
Key approaches:
Method of Moments (MoM)
Maximum Likelihood Estimation (ML)
Maximum a Posteriori (MAP)
Bayesian approach.
Method of Moments
Uses the moments of a dataset to estimate parameters.
Sample moments (e.g., the sample mean and variance) approximate the corresponding population moments of the distribution.
Equate the sample moments to the distribution's theoretical moments and solve for the parameters.
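The recipe above can be sketched for a simple case. This is a minimal illustration (not from the lecture) using an exponential distribution, whose first moment is E[X] = 1/rate, so equating it to the sample mean and solving gives the estimator:

```python
import numpy as np

# Method of moments for an Exponential(rate) distribution (illustrative).
# First population moment: E[X] = 1/rate. Equate to the sample mean
# and solve: rate_hat = 1 / sample_mean.
rng = np.random.default_rng(0)
true_rate = 2.0
data = rng.exponential(scale=1 / true_rate, size=10_000)

sample_mean = data.mean()      # first sample moment
rate_hat = 1.0 / sample_mean   # solve E[X] = 1/rate for the parameter

print(f"MoM estimate of rate: {rate_hat:.3f} (true: {true_rate})")
```

The same pattern generalizes: with k unknown parameters, equate the first k sample moments to their theoretical expressions and solve the resulting system.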
Maximum Likelihood Estimation (ML)
Estimates parameters by maximizing the likelihood function.
The likelihood is the probability (or density) of the observed data given the parameters, viewed as a function of the parameters.
If the data points are i.i.d., the likelihood factorizes into a product over observations, so it is usually maximized via the log-likelihood sum.
Example provided in the context of clickthrough rates in web ad analysis.
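In the clickthrough setting, each impression is a Bernoulli trial. A minimal sketch (with made-up click data): the likelihood is p^k (1-p)^(n-k), and setting the derivative of the log-likelihood to zero yields the closed-form estimate k/n.

```python
import numpy as np

# ML estimation of a Bernoulli clickthrough rate (toy data, not from the lecture).
# Likelihood: p^k * (1-p)^(n-k); maximizing the log-likelihood gives p_hat = k/n.
clicks = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0])  # 1 = ad clicked

n = len(clicks)
k = clicks.sum()
p_mle = k / n  # closed-form ML estimate
print(f"ML estimate of click rate: {p_mle:.2f}")  # 3 clicks / 10 impressions = 0.30
```

Note that with few impressions the ML estimate can be extreme (e.g., 0 clicks in 2 impressions gives p_hat = 0), which motivates MAP estimation below.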
Maximum a Posteriori (MAP)
Addresses limitations of ML by incorporating prior beliefs about parameters.
Combines the likelihood with a prior over the parameters and maximizes the resulting posterior distribution.
Example considers a Bernoulli distribution for click rates, where a prior on the click probability refines the estimate.
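Continuing the click-rate example, a standard choice (assumed here, not necessarily the lecture's) is a Beta(a, b) prior, which is conjugate to the Bernoulli likelihood: the posterior is Beta(k+a, n-k+b), and its mode gives the MAP estimate in closed form.

```python
# MAP estimate of a Bernoulli click rate with a Beta(a, b) prior
# (the counts and prior pseudo-counts below are illustrative).
# Posterior: Beta(k + a, n - k + b); its mode is the MAP estimate:
#   p_map = (k + a - 1) / (n + a + b - 2)
n, k = 10, 3     # impressions, clicks (toy numbers)
a, b = 2.0, 2.0  # Beta prior pseudo-counts (prior mean 0.5)

p_map = (k + a - 1) / (n + a + b - 2)
print(f"MAP estimate: {p_map:.3f}")  # pulled toward the prior mean 0.5 vs ML's 0.30
```

The prior acts like pseudo-observations: a and b add imagined clicks and non-clicks, which regularizes the estimate when n is small.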
Ridge Regression
An extension of linear regression that adds a regularization term to the squared-error loss.
Helps prevent overfitting by shrinking the coefficient estimates toward zero.
Equivalent to MAP estimation with a zero-mean Gaussian prior on the weights.
Comparing Linear vs Ridge Regression
Demonstrates how ridge regression leads to smaller coefficient estimates (shrinkage effect) compared to standard linear regression, especially with small sample sizes or correlated features.
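A minimal numerical sketch of that comparison (synthetic data, assumed here for illustration): with two nearly collinear features, the ordinary least-squares solution is unstable, while the ridge solution, which solves (XᵀX + λI)w = Xᵀy, has a smaller coefficient norm.

```python
import numpy as np

# Closed-form OLS vs ridge regression on correlated features (illustrative).
# Ridge solves (X^T X + lam*I) w = X^T y, i.e. MAP with a Gaussian prior on w.
rng = np.random.default_rng(1)
n = 30
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)  # nearly collinear with x1
X = np.column_stack([x1, x2])
y = x1 + x2 + 0.1 * rng.normal(size=n)

w_ols = np.linalg.solve(X.T @ X, X.T @ y)             # ordinary least squares
lam = 1.0                                             # regularization strength
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

print("OLS coefficients:  ", w_ols)
print("Ridge coefficients:", w_ridge)
# The ridge coefficients have a smaller norm (shrinkage effect).
```

Increasing lam shrinks the coefficients further; lam = 0 recovers the OLS solution exactly.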
Summary of Parameter Estimation Methods
Different estimation techniques suit different modeling scenarios.
The fully Bayesian approach, which keeps the whole posterior rather than a point estimate, will be covered in the next lecture.