Week 8/9 - Bayesian and Uncertainty

33 Terms

1

Why is reasoning under uncertainty necessary in AI?

Because real-world problems involve partial observability, noisy sensors, uncertain outcomes, and complex dynamics.

2

What are some alternative methods to probability for handling uncertainty?

Nonmonotonic (default) logic and rules with confidence factors, but these approaches have problems with maintaining consistency and with combining evidence.

3

What does a probability model consist of?

A sample space Ω with a probability P(ω) assigned to each atomic event ω, such that 0 ≤ P(ω) ≤ 1 and ∑P(ω) = 1.
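
As a concrete illustration (not from the deck itself), here is a minimal Python sketch of a probability model: the sample space of two fair four-sided dice with a uniform assignment.

```python
# A minimal sketch of a probability model: two fair four-sided dice,
# with a uniform probability assigned to each atomic event (illustrative).
from itertools import product

omega = list(product(range(1, 5), repeat=2))   # 16 atomic events (die1, die2)
P = {w: 1 / len(omega) for w in omega}         # uniform assignment

assert abs(sum(P.values()) - 1.0) < 1e-9       # probabilities sum to 1

# The probability of any event is the sum over its atomic events,
# e.g. "the two dice total 5" has probability 4/16 = 0.25.
print(sum(p for w, p in P.items() if sum(w) == 5))
```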

4

What is a random variable?

A function mapping sample points to a value (e.g., Boolean, discrete, or continuous values).

5

What is a joint probability distribution?

It specifies the probability of every combination of values for a set of random variables.

6

How do you compute a marginal distribution from a joint distribution?

By summing the probabilities of atomic events over the irrelevant variables.
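
A minimal sketch of marginalization, assuming the familiar Toothache/Catch/Cavity joint distribution; the numbers below are the usual textbook values, used here purely for illustration.

```python
# Illustrative joint distribution over (toothache, catch, cavity).
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.016,
    (True,  False, True):  0.012, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, True,  False): 0.144,
    (False, False, True):  0.008, (False, False, False): 0.576,
}

# Marginal P(Cavity = true): sum atomic events over the other
# ("irrelevant") variables Toothache and Catch.
p_cavity = sum(p for (t, c, cav), p in joint.items() if cav)
print(p_cavity)   # 0.2
```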

7

What is the formula for conditional probability?

P(A|B) = P(A ∧ B) / P(B), assuming P(B) ≠ 0.

8

What is Bayes’ Rule?

P(A|B) = P(B|A) × P(A) / P(B); it is used to compute a diagnostic probability from a causal one.

9

What is a common use of Bayes’ Rule in diagnostics?

To compute P(cause|symptom), e.g., P(meningitis|stiff neck).
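
A hedged sketch of this calculation in Python; the numbers are illustrative assumptions, not values given in the deck.

```python
# Diagnostic reasoning with Bayes' rule: P(meningitis | stiff neck) from
# the causal probability P(stiff neck | meningitis). Numbers are assumed.
p_stiff_given_men = 0.7        # causal: P(symptom | disease)
p_men = 1 / 50000              # prior P(meningitis)
p_stiff = 0.01                 # P(stiff neck)

p_men_given_stiff = p_stiff_given_men * p_men / p_stiff
print(p_men_given_stiff)       # 0.0014 -- still rare even given the symptom
```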

10

What is the chain rule of probability?

P(X₁,…,Xₙ) = Π P(Xᵢ | X₁,…,Xᵢ₋₁)

11

What is inference by enumeration?

Computing the probability of a query by summing over all atomic events consistent with the query and evidence.
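
A minimal sketch of enumeration for P(Cavity | toothache = true), using the same illustrative joint table as above.

```python
# Inference by enumeration: sum atomic events consistent with the evidence
# (toothache = true) and each query value, then normalize.
joint = {   # keys are (toothache, catch, cavity); illustrative values
    (True,  True,  True):  0.108, (True,  True,  False): 0.016,
    (True,  False, True):  0.012, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, True,  False): 0.144,
    (False, False, True):  0.008, (False, False, False): 0.576,
}

def unnormalized(cavity):
    return sum(p for (t, c, cav), p in joint.items() if t and cav == cavity)

alpha = 1 / (unnormalized(True) + unnormalized(False))
print(alpha * unnormalized(True), alpha * unnormalized(False))   # 0.6 0.4
```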

12

When are two events A and B independent?

If P(A|B) = P(A) or equivalently P(A ∧ B) = P(A) × P(B).

13

Why is independence useful in probabilistic reasoning?

It reduces the size of the joint distribution from exponential to linear in the number of variables.

14

What is conditional independence?

X is conditionally independent of Y given Z if P(X|Y,Z) = P(X|Z).

15

How does conditional independence simplify modeling?

It reduces the number of parameters needed to specify the joint distribution.

16

What is a naive Bayes model?

A model that assumes all effects are conditionally independent given a single common cause; the total number of parameters grows linearly with the number of effects.
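
A minimal sketch of a naive Bayes model with one cause (Cavity) and two effects (Toothache, Catch); the CPT numbers are illustrative assumptions.

```python
# Naive Bayes: P(Cause, Effects) = P(Cause) * product of P(Effect_i | Cause).
p_cavity = 0.2                               # prior on the cause
p_toothache = {True: 0.6, False: 0.1}        # P(toothache | cavity)
p_catch     = {True: 0.9, False: 0.2}        # P(catch | cavity)

def joint_nb(cavity, toothache, catch):
    p = p_cavity if cavity else 1 - p_cavity
    p *= p_toothache[cavity] if toothache else 1 - p_toothache[cavity]
    p *= p_catch[cavity] if catch else 1 - p_catch[cavity]
    return p

# 2 effects -> 1 + 2 + 2 = 5 parameters instead of 2**3 - 1 = 7 joint entries.
print(joint_nb(True, True, True))   # 0.2 * 0.6 * 0.9 = 0.108
```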

17

How was Bayes' rule used in the Air France AF447 case?

Bayesian analysis was used to update the probable wreckage location based on evidence from ocean currents and debris sightings.

18

What is a Bayesian network?

A graphical model that represents a set of variables and their conditional dependencies via a directed acyclic graph (DAG).

19

What components make up a Bayesian network?

Nodes (variables), edges (dependencies), and Conditional Probability Tables (CPTs) for each variable given its parents.

20

What is the global semantics of a Bayesian network?

The joint probability is the product of the local conditional distributions: P(x₁,…,xₙ) = Π P(xᵢ | Parents(xᵢ)).
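
A minimal sketch of the global semantics on the standard burglary network; the CPT values are the usual textbook numbers, assumed here for illustration.

```python
# P(B, E, A, J, M) = P(B) P(E) P(A|B,E) P(J|A) P(M|A)
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(Alarm | B, E)
P_J = {True: 0.90, False: 0.05}                      # P(JohnCalls | Alarm)
P_M = {True: 0.70, False: 0.01}                      # P(MaryCalls | Alarm)

def joint(b, e, a, j, m):
    p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p

# e.g. P(¬b, ¬e, a, j, m) = 0.999 * 0.998 * 0.001 * 0.90 * 0.70 ≈ 0.00063
print(joint(False, False, True, True, True))
```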

21

What is local semantics in Bayesian networks?

Each node is conditionally independent of its non-descendants given its parents.

22

How should you order variables when constructing a Bayesian network?

From causes to effects to ensure a compact and semantically meaningful structure.

23

Why is building a Bayesian network from effects to causes problematic?

It creates unnecessary dependencies and increases the size of CPTs, making the model less compact.

24

How does a Bayesian network improve compactness?

Instead of the 2ⁿ entries of a full joint distribution over n Boolean variables, it stores only O(n·2ᵏ) numbers when each variable has at most k parents.
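
For instance, taking the standard five-node burglary network as an example, the CPTs contain 1 + 1 + 4 + 2 + 2 = 10 independent numbers, versus 2⁵ − 1 = 31 for the full joint distribution over the same five Boolean variables.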

25

What is inference by enumeration in Bayesian networks?

A method that computes posterior probabilities by summing over the full joint distribution using the chain rule.

26

What is the time and space complexity of enumeration?

Time: O(dⁿ), Space: O(n), where d is the domain size and n is the number of variables.

27

How does inference by variable elimination work?

It sums out variables from right to left, storing intermediate results (factors) to avoid redundant computation.

28

What is the pointwise product of factors?

Combining two factors into a single factor over the union of their variables, where each entry is the product of the two factors' values that agree on the shared variables.
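
A minimal sketch of the two factor operations behind variable elimination, pointwise product and summing out, assuming Boolean variables and a simple (variables, table) representation.

```python
# Factors are (variables, table): a list of variable names and a dict
# mapping value tuples (in that variable order) to numbers.
from itertools import product as cartesian

def pointwise_product(f1, f2):
    vars1, t1 = f1
    vars2, t2 = f2
    out_vars = vars1 + [v for v in vars2 if v not in vars1]
    table = {}
    for values in cartesian([True, False], repeat=len(out_vars)):
        assignment = dict(zip(out_vars, values))
        v1 = tuple(assignment[v] for v in vars1)
        v2 = tuple(assignment[v] for v in vars2)
        table[values] = t1[v1] * t2[v2]      # multiply entries that agree
    return out_vars, table

def sum_out(var, factor):
    vars_, table = factor
    i = vars_.index(var)
    out_vars = vars_[:i] + vars_[i + 1:]
    out = {}
    for values, p in table.items():
        key = values[:i] + values[i + 1:]
        out[key] = out.get(key, 0.0) + p     # add entries that agree elsewhere
    return out_vars, out

# Example: f1(A) = P(A), f2(A, B) = P(B | A); summing A out of their
# pointwise product yields the marginal P(B).
f1 = (["A"], {(True,): 0.3, (False,): 0.7})
f2 = (["A", "B"], {(True, True): 0.9, (True, False): 0.1,
                   (False, True): 0.2, (False, False): 0.8})
print(sum_out("A", pointwise_product(f1, f2)))   # P(B=true) ≈ 0.41
```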

29

What makes a variable irrelevant in a Bayesian network query?

If it is not an ancestor of the query variable or evidence, it is irrelevant and can be omitted.

30

What is an example of variable irrelevance?

In P(JohnCalls | Burglary), MaryCalls is irrelevant since it's not an ancestor of the query or evidence.

31

How are Bayesian networks used in car diagnosis?

To identify potential causes like a flat battery or broken starter based on observed variables like oil light and gas gauge.

32

How are Bayesian networks used in insurance?

To estimate risk factors such as accident or theft likelihood based on driver profile and vehicle data.

33

What are some common tasks in Bayesian network inference?

Simple queries, conjunctive queries, value of information, sensitivity analysis, and explanation.