REQ READING: Statistical and Causal Models – Key Vocabulary

Description

Vocabulary flashcards summarizing essential terms and definitions from the lecture notes on statistical learning, causal models, and illustrative examples.


33 Terms

1. Statistical learning

The field that infers properties of an unknown probability distribution from observed data, typically for prediction.

2. Causal inference

The study of identifying and quantifying cause-and-effect relationships, often involving multiple distributions produced by interventions.

3. Probability space

A mathematical model (Ω, ℱ, P) consisting of a set of outcomes, a collection of events, and a probability measure for a random experiment.

4. Independent and identically distributed (i.i.d.)

An assumption that each sample is drawn independently from the same joint distribution.

5. Regression (conditional expectation)

The function f(x)=E[Y|X=x] giving the expected output value for a given input.
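A minimal sketch of how E[Y|X=x] can be estimated from i.i.d. samples by local averaging (the data, bandwidth, and function names here are illustrative, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: Y = sin(X) + noise, so the true regression is f(x) = sin(x).
X = rng.uniform(0, np.pi, 5000)
Y = np.sin(X) + rng.normal(0, 0.1, size=X.shape)

def regression_estimate(x, X, Y, h=0.1):
    """Estimate f(x) = E[Y | X = x] by averaging Y over samples with X near x."""
    mask = np.abs(X - x) < h
    return Y[mask].mean()

print(regression_estimate(np.pi / 2, X, Y))  # close to sin(pi/2) = 1
```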

6. Binary classifier

A function that assigns each input x to the more likely class y∈{−1,+1} under P(Y|X=x).

7. Joint distribution (P_{X,Y})

The probability law governing the simultaneous behavior of random variables X and Y.

8. Empirical distribution (P_n)

A discrete distribution that puts equal mass 1/n on each observed data point in a sample.
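A quick illustration (with made-up data) that expectations under P_n are just sample averages, since each point carries mass 1/n:

```python
import numpy as np

data = np.array([2.0, 3.0, 3.0, 5.0])
n = len(data)

# The empirical distribution puts mass 1/n on each observed point,
# so the expectation of the identity under P_n equals the sample mean.
weights = np.full(n, 1.0 / n)
emp_mean = np.sum(weights * data)

print(emp_mean)  # 3.25, identical to data.mean()
```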

9. Inverse problem (statistics)

Estimating properties of an unobserved distribution from data generated by that distribution.

10. Function class / Hypothesis space

The set of candidate functions from which a learning algorithm selects its predictor.

11. Capacity (of a function class)

A measure of how rich or complex a hypothesis space is, controlling overfitting potential.

12. Vapnik–Chervonenkis (VC) dimension

A combinatorial capacity measure indicating the largest set of points that can be shattered by a function class.

13. Expected risk (true risk)

The population loss R[f]=∫(1/2)|f(x)−y| dP_{X,Y}(x,y) measuring generalization error.

14. Empirical risk

The average loss on the training sample: R_emp^n[f] = (1/n) ∑_{i=1}^{n} (1/2)|f(x_i) − y_i|.

15. Empirical Risk Minimization (ERM)

The principle of choosing the hypothesis that minimizes empirical risk over the training data.
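A toy sketch of ERM over a hypothesis space of threshold classifiers (the data-generating threshold 0.6, noise level, and helper names are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy binary data: label +1 when x > 0.6, with 10% label noise.
x = rng.uniform(0, 1, 200)
y = np.where(x > 0.6, 1, -1)
flip = rng.random(200) < 0.1
y[flip] *= -1

def empirical_risk(f, x, y):
    """R_emp[f] = (1/n) sum_i (1/2)|f(x_i) - y_i|, i.e. the 0/1 loss for ±1 labels."""
    return np.mean(0.5 * np.abs(f(x) - y))

# Hypothesis space: threshold classifiers f_t(x) = sign(x - t).
thresholds = np.linspace(0, 1, 101)
risks = [empirical_risk(lambda v, t=t: np.where(v > t, 1, -1), x, y)
         for t in thresholds]

# ERM picks the hypothesis with the smallest empirical risk.
best_t = thresholds[int(np.argmin(risks))]
print(best_t)  # near the true threshold 0.6
```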

16. Consistency (of a learner)

Property that the risk of the learned functions converges to the minimal achievable risk as n→∞.

17. Universal consistency

A guarantee that, for every fixed underlying distribution, the algorithm approaches Bayes-optimal risk with enough data.

18. Slow learning rates

Situations where convergence toward optimal risk can be arbitrarily slow for some problems, even with consistent algorithms.

19. Regularization

A technique that restricts or penalizes complex hypotheses to control capacity and improve generalization.

20. Bayesian prior

A probability distribution placed over hypotheses or parameters expressing a priori beliefs before seeing data.

21. Observational distribution

The joint distribution of variables obtained without intervening in the system.

22. Intervention

An external action that forces a variable to take specific values, potentially altering the joint distribution.

23. Structural Causal Model (SCM)

A collection of assignments X_j := f_j(PA_j, N_j) defining each variable as a function of its parents and an independent noise term.
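A minimal simulated SCM, showing how an intervention do(X := 3) replaces one assignment while the mechanism for Y is kept (the two-variable model and its coefficients are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# A two-variable SCM:  X := N_X,   Y := 2*X + N_Y   (X causes Y).
def sample_observational(n):
    X = rng.normal(0, 1, n)
    Y = 2 * X + rng.normal(0, 1, n)
    return X, Y

# Intervention do(X := 3): overwrite the assignment for X,
# but reuse the unchanged structural assignment for Y.
def sample_do_x(n, value=3.0):
    X = np.full(n, value)
    Y = 2 * X + rng.normal(0, 1, n)
    return X, Y

_, Y_obs = sample_observational(n)
_, Y_do = sample_do_x(n)
print(Y_obs.mean(), Y_do.mean())  # roughly 0 and 6
```

The intervention changes the distribution of Y (its mean shifts from about 0 to about 6), which is exactly the sense in which interventions alter the joint distribution.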

24. Causal reasoning

Deriving implications (e.g., effects of interventions) from a known causal model.

25. Causal learning / Structure learning

Inferring aspects of the underlying causal graph or mechanisms from data (observational or interventional).

26. Reichenbach's common cause principle

If X and Y are dependent, then either X causes Y, Y causes X, or there exists a variable Z that causally influences both and renders them independent when conditioned upon.

27. Confounder

A variable that causally affects two or more variables, creating spurious associations between them.

28. Screening-off

The property that conditioning on a confounder Z makes its effects (e.g., X and Y) statistically independent: X ⫫ Y | Z.
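A simulated sketch of screening-off: a confounder Z drives both X and Y, so they are correlated marginally, but conditioning on Z (approximated here by restricting to a narrow slice of Z values) makes the association vanish. The model and slice width are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Confounder Z drives both X and Y; X does not cause Y.
Z = rng.normal(0, 1, n)
X = Z + rng.normal(0, 1, n)
Y = Z + rng.normal(0, 1, n)

# Marginally, X and Y are correlated (a spurious association via Z).
print(np.corrcoef(X, Y)[0, 1])  # around 0.5

# Conditioning on Z screens it off: within a thin slice of Z,
# the remaining variation in X and Y comes from independent noise.
mask = np.abs(Z - 0.5) < 0.05
print(np.corrcoef(X[mask], Y[mask])[0, 1])  # near 0
```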

29. Correlation ≠ Causation

The principle that statistical dependence alone does not determine causal direction or presence.

30. Mechanism (in SCM)

The deterministic function linking a variable to its direct causes and noise term in an SCM.

31. Additive Noise Model (ANM)

A causal model where a child variable equals a function of its parent plus independent additive noise.
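A small sketch of sampling from an ANM and checking its defining property, that the residual Y − f(X) is independent of X in the causal direction (the cubic mechanism and noise scale are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50_000

# ANM: Y := f(X) + N_Y with N_Y independent of X, here f(x) = x**3.
X = rng.uniform(-2, 2, n)
N_Y = rng.normal(0, 0.2, n)
Y = X**3 + N_Y

# In the causal direction the residual recovers the independent noise,
# so its correlation with X is near zero.
residual = Y - X**3
print(np.corrcoef(X, residual)[0, 1])  # near 0
```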

32. Optical character recognition example

Illustration that identical P_{X,Y} for images and labels can arise from different causal structures, yielding different intervention effects.

33. Gene perturbation example

Scenario showing that deleting a gene (intervention) affects phenotype only if a causal, not merely correlated, relationship exists.