A comprehensive set of vocabulary flashcards covering fundamental probability rules, key distributions, estimation, hypothesis testing, regression, and essential calculus rules from STAT 411 Chapters 1–11.
Sample Space
The set of all possible outcomes of an experiment.
Probability Rules
Fundamental axioms: P(A) ≥ 0, P(S) = 1 where S is the sample space, and P(A ∪ B) = P(A) + P(B) for mutually exclusive A and B.
Addition Rule
For any events A and B, P(A ∪ B) = P(A)+P(B)−P(A ∩ B).
Multiplication Rule (Independent)
If A and B are independent, P(A ∩ B) = P(A)·P(B).
Conditional Probability
The probability of A given B: P(A|B) = P(A ∩ B)/P(B), provided P(B) > 0.
Bayes’ Theorem
P(A|B) = [P(B|A)·P(A)]/P(B), reverses conditional probabilities.
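A minimal Python sketch of the conditional-probability and Bayes' theorem cards; the prevalence and test-accuracy numbers below are made up purely for illustration.

```python
# Illustrative Bayes' theorem calculation (made-up numbers): a test with
# 95% sensitivity and a 10% false-positive rate, applied at 2% prevalence.
p_A = 0.02                 # P(A): prior probability of the condition
p_B_given_A = 0.95         # P(B|A): probability of a positive test given the condition
p_B_given_notA = 0.10      # P(B|not A): false-positive rate

# Law of total probability gives P(B), the overall chance of a positive test.
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Bayes' theorem: P(A|B) = P(B|A)·P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(f"P(A|B) = {p_A_given_B:.3f}")   # ≈ 0.162
```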
Permutations (nPr)
Ordered selections: nPr = n!/(n−r)!.
Combinations (nCr)
Unordered selections: nCr = n!/[r!(n−r)!].
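Python's standard library can verify both counting formulas with math.perm and math.comb (Python 3.8+); n = 10, r = 3 are arbitrary example values.

```python
import math

n, r = 10, 3                       # example values, chosen arbitrarily

# Permutations: ordered selections, n! / (n − r)!
print(math.perm(n, r))             # 720
# Combinations: unordered selections, n! / [r!(n − r)!]
print(math.comb(n, r))             # 120

# Cross-check against the factorial formulas on the cards.
assert math.perm(n, r) == math.factorial(n) // math.factorial(n - r)
assert math.comb(n, r) == math.factorial(n) // (math.factorial(r) * math.factorial(n - r))
```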
Expected Value E(X)
Long-run mean: Σx·P(x) for discrete X.
Variance Var(X)
Measure of spread: E(X²) − [E(X)]².
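A small numeric check of E(X) and Var(X) for a discrete distribution; the values and probabilities below are illustrative.

```python
import numpy as np

# A small discrete distribution (values and probabilities are made up).
x = np.array([0, 1, 2, 3])
p = np.array([0.1, 0.4, 0.3, 0.2])
assert np.isclose(p.sum(), 1.0)

ex  = np.sum(x * p)                 # E(X)  = Σ x·P(x)
ex2 = np.sum(x**2 * p)              # E(X²) = Σ x²·P(x)
var = ex2 - ex**2                   # Var(X) = E(X²) − [E(X)]²
print(ex, var)                      # 1.6 and 0.84
```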
Binomial Distribution
Discrete model for the number of successes in n independent trials: P(X=k) = nCk·p^k(1−p)^(n−k).
Binomial Mean
E(X)=np for Binomial(n,p).
Binomial Variance
Var(X)=np(1−p) for Binomial(n,p).
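The binomial cards can be checked with scipy.stats.binom; n = 20, p = 0.3, k = 5 below are arbitrary example parameters.

```python
from scipy.stats import binom

n, p, k = 20, 0.3, 5               # illustrative parameters

print(binom.pmf(k, n, p))          # P(X = 5) from the nCk p^k (1−p)^(n−k) formula
print(binom.mean(n, p))            # np = 6.0
print(binom.var(n, p))             # np(1−p) = 4.2
print(binom.cdf(k, n, p))          # P(X ≤ 5), often needed for tail probabilities
```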
Poisson Distribution
Counts rare events: P(X=k)=λ^k e^{−λ}/k!.
Poisson Mean & Variance
Both equal λ.
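Likewise for the Poisson cards, assuming SciPy is available; λ = 2.5 and k = 3 are illustrative values.

```python
from scipy.stats import poisson

lam, k = 2.5, 3                              # illustrative rate and count

print(poisson.pmf(k, lam))                   # λ^k e^{−λ} / k!  for k = 3
print(poisson.mean(lam), poisson.var(lam))   # both equal λ = 2.5
print(poisson.sf(k, lam))                    # P(X > 3), the upper tail
```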
Exponential Distribution
Continuous model for waiting time with rate λ.
Exponential PDF
f(x)=λe^{−λx}, x≥0.
Exponential CDF (Tail)
Tail (survival) probability: P(X > x) = e^{−λx}; equivalently, the CDF is F(x) = 1 − e^{−λx}.
Exponential Mean
E(X)=1/λ.
Exponential Variance
Var(X)=1/λ².
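A sketch of the exponential cards with scipy.stats.expon; note that SciPy parameterizes by scale = 1/λ, and λ = 0.5 is an arbitrary example rate.

```python
from scipy.stats import expon

lam = 0.5                          # illustrative rate λ
dist = expon(scale=1 / lam)        # SciPy uses scale = 1/λ

print(dist.pdf(2.0))               # λ e^{−λx} at x = 2
print(dist.sf(2.0))                # tail P(X > 2) = e^{−λx}
print(dist.mean(), dist.var())     # 1/λ = 2.0 and 1/λ² = 4.0
```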
Joint PDF f(x,y)
Function giving probability density for pair (X,Y).
Marginal Distribution
Distribution of one variable found by integrating the joint PDF over the other variable.
Independence (Joint)
X and Y are independent if f(x,y) = fX(x)·fY(y) for all x, y.
Covariance Cov(X,Y)
E[XY] − E[X]E[Y], measures joint variability.
Correlation ρ
Cov(X,Y)/(σX·σY), standardized measure of linear association.
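Sample covariance and correlation computed with NumPy on made-up paired data, cross-checked against the definition on the correlation card.

```python
import numpy as np

# Small illustrative paired sample (made-up numbers).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.7, 5.2, 5.9])

cov_xy = np.cov(x, y, ddof=1)[0, 1]          # sample Cov(X, Y)
rho    = np.corrcoef(x, y)[0, 1]             # Cov(X, Y) / (σX·σY)
print(cov_xy, rho)

# Same correlation computed directly from the definition on the card.
print(cov_xy / (np.std(x, ddof=1) * np.std(y, ddof=1)))
```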
Confidence Interval for Mean (σ known)
x̄ ± z*·(σ/√n).
Confidence Interval for Mean (σ unknown)
x̄ ± t*·(s/√n).
Confidence Interval for Proportion
p̂ ± z*·√[p̂(1−p̂)/n].
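A sketch of all three confidence-interval cards using scipy.stats critical values; the sample summary (n = 25, x̄ = 50.2, s = 4.1) and the 14-of-25 proportion are hypothetical.

```python
import numpy as np
from scipy.stats import norm, t

# Hypothetical sample summary: n = 25, x̄ = 50.2, s = 4.1, 95% confidence.
n, xbar, s, conf = 25, 50.2, 4.1, 0.95
sigma = 4.1                                   # pretend σ is known for the z interval

z_star = norm.ppf(1 - (1 - conf) / 2)         # ≈ 1.96
t_star = t.ppf(1 - (1 - conf) / 2, df=n - 1)  # ≈ 2.064 with 24 df

print(xbar - z_star * sigma / np.sqrt(n), xbar + z_star * sigma / np.sqrt(n))  # σ known
print(xbar - t_star * s / np.sqrt(n), xbar + t_star * s / np.sqrt(n))          # σ unknown

# Proportion interval: p̂ ± z*·√[p̂(1−p̂)/n] for a hypothetical 14 successes in 25 trials.
p_hat = 14 / 25
se = np.sqrt(p_hat * (1 - p_hat) / n)
print(p_hat - z_star * se, p_hat + z_star * se)
```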
z-test for Mean
z = (x̄−μ₀)/(σ/√n), tests a population mean when σ is known.
t-test for Mean
t = (x̄−μ₀)/(s/√n), tests a mean when σ is unknown; df = n−1.
Proportion z-test
(p̂−p₀)/√[p₀(1−p₀)/n], tests single proportion.
P-value
Probability, assuming H₀ is true, of observing data at least as extreme as the sample.
Critical Value Rule
Reject H₀ if test statistic falls in rejection region defined by α.
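A worked one-sample t-test with a two-sided P-value on hypothetical data, cross-checked against scipy.stats.ttest_1samp; H₀: μ = 5.0 is an assumed null.

```python
import numpy as np
from scipy.stats import t, ttest_1samp

# Hypothetical data testing H0: μ = 5.0 against a two-sided alternative.
x = np.array([5.4, 4.8, 5.9, 5.1, 5.6, 4.9, 5.3, 5.7])
mu0 = 5.0

t_stat = (x.mean() - mu0) / (x.std(ddof=1) / np.sqrt(len(x)))
p_val = 2 * t.sf(abs(t_stat), df=len(x) - 1)      # two-sided P-value
print(t_stat, p_val)

# SciPy's built-in one-sample t-test gives the same statistic and P-value.
print(ttest_1samp(x, popmean=mu0))
```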
Two-sample t-test (Unpooled)
t=(x̄₁−x̄₂)/√(s₁²/n₁+s₂²/n₂), unequal variances.
Pooled t-test
t = (x̄₁−x̄₂)/[S_p·√(1/n₁+1/n₂)], where S_p² = [(n₁−1)s₁²+(n₂−1)s₂²]/(n₁+n₂−2); assumes equal variances.
Paired t-test
t=d̄/(s_d/√n), analyzes matched pairs differences.
Two-proportion z-test
z=(p̂₁−p̂₂)/√[p̂(1−p̂)(1/n₁+1/n₂)], where p̂ is pooled proportion.
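The two-sample cards sketched in SciPy on made-up data: Welch, pooled, and paired t-tests via built-ins, plus a hand-computed two-proportion z-test with a pooled proportion.

```python
import numpy as np
from scipy.stats import ttest_ind, ttest_rel, norm

# Hypothetical samples for the two-sample t-test cards.
x1 = np.array([12.1, 11.4, 13.0, 12.7, 11.9, 12.5])
x2 = np.array([10.8, 11.6, 10.9, 11.2, 11.5, 10.7])

print(ttest_ind(x1, x2, equal_var=False))   # Welch (unpooled) two-sample t-test
print(ttest_ind(x1, x2, equal_var=True))    # pooled t-test, assumes equal variances
print(ttest_rel(x1, x2))                    # paired t-test on the differences

# Two-proportion z-test done by hand with a pooled proportion (made-up counts).
x_succ, n1, y_succ, n2 = 45, 100, 30, 90
p1, p2 = x_succ / n1, y_succ / n2
p_pool = (x_succ + y_succ) / (n1 + n2)
z = (p1 - p2) / np.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
print(z, 2 * norm.sf(abs(z)))               # z statistic and two-sided P-value
```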
Simple Linear Regression Model
Y=β₀+β₁X+ε, relates a response to a predictor.
Least Squares Slope β̂₁
β̂₁=Sxy/Sxx, minimizes sum of squared errors.
Least Squares Intercept β̂₀
β̂₀=ȳ−β̂₁x̄.
Coefficient of Determination R²
Proportion of variation explained: SSR/SST.
t-test for Slope
t=β̂₁/SE(β̂₁), tests if β₁=0.
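A sketch of the regression cards using scipy.stats.linregress on hypothetical (x, y) data; the slope is cross-checked against the Sxy/Sxx formula.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical (x, y) data for the simple linear regression cards.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.3, 2.8, 3.9, 4.4, 5.1, 5.8])

res = linregress(x, y)
print(res.slope, res.intercept)     # β̂₁ = Sxy/Sxx and β̂₀ = ȳ − β̂₁x̄
print(res.rvalue**2)                # R², proportion of variation explained
print(res.slope / res.stderr)       # t statistic for H0: β₁ = 0
print(res.pvalue)                   # two-sided P-value for that test

# Cross-check the slope from the Sxy/Sxx formula on the card.
sxy = np.sum((x - x.mean()) * (y - y.mean()))
sxx = np.sum((x - x.mean())**2)
print(sxy / sxx)
```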
Power Rule (Integration)
∫x^n dx = x^{n+1}/(n+1)+C for n≠−1.
Log Integration Rule
∫x^{−1}dx=ln|x|+C.
Linear Function Integration
∫(ax+b)^n dx=(ax+b)^{n+1}/[a(n+1)]+C, for n≠−1.
Exponential Integration
∫e^{ax}dx=(1/a)e^{ax}+C.
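The four integration-rule cards can be verified symbolically with SymPy (assuming it is installed); the exponents and coefficients below are arbitrary examples.

```python
import sympy as sp

x = sp.symbols('x')

# SymPy omits the constant of integration C in each result.
print(sp.integrate(x**3, x))            # power rule: x**4/4
print(sp.integrate(1 / x, x))           # log rule: log(x)
print(sp.integrate(sp.exp(2 * x), x))   # exponential rule: exp(2*x)/2
print(sp.integrate((3 * x + 2)**4, x))  # linear-function rule, returned in expanded form
```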
Central Limit Theorem (CLT)
For large n, x̄ is approximately normal with mean μ and standard deviation σ/√n, regardless of the population's distribution.
Standard Error (SE)
Estimated standard deviation of a statistic, e.g., σ/√n or s/√n.
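A small simulation illustrating the CLT and standard-error cards: sample means drawn from a skewed (exponential) population, with an assumed mean of 2 and n = 40, have spread close to σ/√n.

```python
import numpy as np

rng = np.random.default_rng(0)

# Population: exponential with mean 2 (clearly non-normal); sample size n = 40.
mu, n, reps = 2.0, 40, 10_000
sigma = 2.0                               # for the exponential, σ equals the mean

# Draw many samples and record each sample mean.
means = rng.exponential(scale=mu, size=(reps, n)).mean(axis=1)

print(means.mean())                       # ≈ μ = 2.0
print(means.std(ddof=1))                  # ≈ σ/√n ≈ 0.316, the standard error
print(sigma / np.sqrt(n))                 # theoretical SE from the card
```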