These flashcards cover key concepts from the lecture on stochastic sequences and concentration inequalities, focusing on definitions and fundamental principles.
Chebyshev's Inequality
A theorem that bounds the probability that a random variable with finite variance deviates from its mean by more than a given number of standard deviations.
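The standard statement (not quoted on the card, but the usual form): for a random variable X with mean \mu and finite variance \sigma^2, and any k > 0,

    P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}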
IID (Independent and Identically Distributed)
A property of a sequence of random variables where each variable has the same probability distribution and all are mutually independent.
Stationary Stochastic Sequence
A stochastic sequence whose joint distributions are invariant under shifts in time, so its statistical properties do not change over time.
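Formally (strict stationarity, the usual textbook form): for all indices t_1, \ldots, t_k and every shift h,

    (X_{t_1}, \ldots, X_{t_k}) \overset{d}{=} (X_{t_1 + h}, \ldots, X_{t_k + h})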
Markov Inequality
An inequality bounding the probability that a non-negative random variable is at least a given positive value by the ratio of its expected value to that value.
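The standard statement: for a non-negative random variable X and any a > 0,

    P(X \ge a) \le \frac{E[X]}{a}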
Concentration Inequality
An inequality quantifying how tightly the values of a random variable concentrate around some value, typically its mean or median.
Hoeffding's Inequality
An inequality that gives an upper bound on the probability that the sum of independent bounded random variables deviates from its expected value.
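One common two-sided form (a textbook statement, not taken from the lecture): for independent X_i with a_i \le X_i \le b_i and S_n = X_1 + \cdots + X_n, for any t > 0,

    P(|S_n - E[S_n]| \ge t) \le 2 \exp\left( -\frac{2t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right)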
Missing Completely at Random (MCAR)
A situation in which the missingness of data is independent of both the observed and the unobserved data.
Missing at Random (MAR)
A situation where the missingness of data is related to the observed data but not the unobserved data.
Missing Not at Random (MNAR)
A situation where the missingness depends on the unobserved data itself (the values that are missing), even after accounting for the observed data.
Variance
A measure of dispersion describing how far values spread from the mean; for a random variable, the expected squared deviation from its mean.
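In symbols, for a random variable X with mean \mu = E[X]:

    \mathrm{Var}(X) = E[(X - \mu)^2] = E[X^2] - (E[X])^2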
Law of Large Numbers (LLN)
A theorem stating that as the number of trials increases, the sample mean of a sequence of IID random variables converges to the common expected value.
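A minimal simulation sketch (not from the lecture; the exponential distribution, seed, and sample size are illustrative choices) showing the running sample mean of IID draws settling near the true mean:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=2.0, size=100_000)      # IID draws, true mean = 2.0
    running_mean = np.cumsum(x) / np.arange(1, x.size + 1)
    # The running mean drifts toward 2.0 as the number of draws grows.
    print(running_mean[[9, 99, 9_999, 99_999]])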
Central Limit Theorem (CLT)
A theorem stating that the distribution of the (suitably standardized) sample mean of a large number of IID random variables with finite variance is approximately normal, regardless of the original distribution.
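In the classical form: if X_1, X_2, \ldots are IID with mean \mu and finite variance \sigma^2, the standardized sample mean converges in distribution to a standard normal,

    \frac{\bar{X}_n - \mu}{\sigma / \sqrt{n}} \;\xrightarrow{d}\; N(0, 1)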
Expected Value (Mean)
The long-run average value of a random variable; for a discrete variable, the sum of all possible values each weighted by its probability of occurrence (an integral in the continuous case).
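In symbols, for the discrete case (the continuous case replaces the sum by an integral):

    E[X] = \sum_x x \, P(X = x)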
Standard Deviation
A measure of the amount of variation or dispersion of a set of values, equal to the square root of the variance.
Covariance
A measure of how much two random variables change together, indicating the direction of their linear relationship.
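In symbols:

    \mathrm{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]\,E[Y]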
Correlation Coefficient
A standardized measure of the linear relationship between two variables, ranging from -1 to 1.
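Defined as the covariance rescaled by the two standard deviations:

    \rho_{XY} = \frac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}, \qquad -1 \le \rho_{XY} \le 1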
Bias (Statistics)
The difference between the expected value of an estimator and the true value of the parameter it is estimating.
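In symbols, for an estimator \hat{\theta} of a parameter \theta (the estimator is unbiased when this equals zero):

    \mathrm{Bias}(\hat{\theta}) = E[\hat{\theta}] - \theta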
Consistency (Statistics)
A property of an estimator whereby, as the sample size increases, the estimator converges in probability to the true value of the parameter.
Efficiency (Statistics)
A measure of the precision of an estimator, evaluated through its variance; among unbiased estimators, a more efficient one has smaller variance.
Maximum Likelihood Estimation (MLE)
A method of estimating the parameters of a statistical model by finding the parameter values that maximize the likelihood function, i.e., making the observed data most probable.
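A minimal numeric sketch (not from the lecture; the exponential model, seed, and parameter grid are illustrative assumptions): maximizing the exponential log-likelihood n \log\lambda - \lambda \sum_i x_i over a grid recovers the closed-form MLE \hat{\lambda} = 1/\bar{x}:

    import numpy as np

    rng = np.random.default_rng(1)
    data = rng.exponential(scale=2.0, size=1_000)   # true rate lambda = 0.5

    lams = np.linspace(0.05, 2.0, 400)              # candidate parameter values
    # Exponential log-likelihood: n * log(lambda) - lambda * sum(x)
    loglik = data.size * np.log(lams) - lams * data.sum()

    print(lams[np.argmax(loglik)])                  # near 0.5, i.e. 1 / data.mean()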
Hypothesis Testing
A statistical method used to determine if there is enough evidence in a sample of data to infer that a certain condition is true for an entire population. It involves formulating a null and an alternative hypothesis.
Confidence Interval
A range of values, derived from sample statistics, that is likely to contain the true value of an unknown population parameter with a certain level of confidence.
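A common large-sample form for a population mean, with sample mean \bar{x}, sample standard deviation s, sample size n, and normal critical value z_{\alpha/2}:

    \bar{x} \pm z_{\alpha/2} \, \frac{s}{\sqrt{n}}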