Lecture 2: Uncertainty


27 Terms

1

ω

represents a possible world; every possible situation can be thought of as a world, and its probability is P(ω)

2

0 ≤ P(ω) ≤ 1

every value representing a probability must range between 0 and 1, inclusive.

3

Σ P(ω) = 1 (summed over every possible world ω)

The probabilities of every possible event, when summed together, are equal to 1.

4

Unconditional Probability

the degree of belief in a proposition in the absence of any other evidence

5

Conditional Probability

the degree of belief in a proposition given some evidence that has already been revealed

6

Random Variable

a variable in probability theory with a domain of possible values that it can take on

7

Independence

the knowledge that the occurrence of one event does not affect the probability of the other event; events a and b are independent if and only if the probability of a and b is equal to the probability of a times the probability of b: P(a ∧ b) = P(a)P(b).
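
As a quick numeric check of this definition (the two-dice sample space below is an illustrative assumption, not part of the card), a minimal sketch:

```python
from fractions import Fraction

# Sample space: all ordered rolls of two fair dice; every world is equally likely.
worlds = [(r1, r2) for r1 in range(1, 7) for r2 in range(1, 7)]
p = Fraction(1, len(worlds))  # P(ω) for each world ω

a = {w for w in worlds if w[0] == 6}  # event a: first die shows 6
b = {w for w in worlds if w[1] == 6}  # event b: second die shows 6

# Independence holds iff P(a ∧ b) = P(a)P(b): here 1/36 == (1/6)(1/6).
assert len(a & b) * p == (len(a) * p) * (len(b) * p)
```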

8

Bayes' Rule

commonly used in probability theory to compute conditional probability; says that the probability of b given a is equal to the probability of a given b, times the probability of b, divided by the probability of a: P(b | a) = P(a | b)P(b) / P(a).

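A direct translation of the rule into a function; the clouds-and-rain numbers below are illustrative assumptions, not values from the card:

```python
def bayes_rule(p_a_given_b, p_b, p_a):
    """P(b | a) = P(a | b) P(b) / P(a)."""
    return p_a_given_b * p_b / p_a

# Hypothetical query: a = "clouds this morning", b = "rain this afternoon".
# Assumed: P(clouds | rain) = 0.8, P(rain) = 0.1, P(clouds) = 0.4.
print(bayes_rule(0.8, 0.1, 0.4))  # P(rain | clouds) = 0.2
```
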
9

Joint Probability

the likelihood of multiple events all occurring.

10

Negation

P(¬a) = 1 - P(a); probability rule that stems from the fact that the sum of the probabilities of all the possible worlds is 1, and the complementary literals a and ¬a together cover all the possible worlds.
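
A minimal sketch of why the rule holds, using a made-up four-world distribution:

```python
# Hypothetical worlds with probabilities summing to 1; a is true in the first two.
worlds = {"w1": 0.1, "w2": 0.3, "w3": 0.4, "w4": 0.2}
a_true = {"w1", "w2"}

p_a = sum(p for w, p in worlds.items() if w in a_true)
p_not_a = sum(p for w, p in worlds.items() if w not in a_true)

# a and ¬a split all the worlds between them, so P(¬a) = 1 - P(a).
assert abs(p_not_a - (1 - p_a)) < 1e-9
```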

11

Inclusion-Exclusion

P(a ∨ b) = P(a) + P(b) - P(a ∧ b); probability rule that can be interpreted in the following way: the worlds in which a or b is true are all the worlds where a is true, plus the worlds where b is true. However, some worlds are then counted twice (the worlds where both a and b are true). To remove this overlap, we subtract the worlds where both a and b are true once, since they were counted twice.
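
The overlap argument, checked by enumeration; the single-die events here are illustrative assumptions:

```python
from fractions import Fraction

worlds = range(1, 7)  # one fair die
p = Fraction(1, 6)    # P(ω) for each outcome

a = {w for w in worlds if w % 2 == 0}  # a: the roll is even
b = {w for w in worlds if w > 3}       # b: the roll is greater than 3

# The worlds in a & b would be counted twice by P(a) + P(b), so subtract them once.
assert len(a | b) * p == len(a) * p + len(b) * p - len(a & b) * p
print(len(a | b) * p)  # 2/3
```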

12

Marginalization

P(a) = P(a, b) + P(a, ¬b); probability rule based on the idea that b and ¬b are disjoint: the probability of b and ¬b occurring at the same time is 0, and their probabilities sum to 1. Thus, when a happens, b can either happen or not; adding the probability of a and b to the probability of a and ¬b leaves us with simply the probability of a.
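
With a joint distribution stored as a table, marginalizing b out is just a two-term sum; the table values below are made up for illustration:

```python
# Hypothetical joint distribution over propositions a and b.
joint = {
    ("a", "b"): 0.3, ("a", "¬b"): 0.2,
    ("¬a", "b"): 0.1, ("¬a", "¬b"): 0.4,
}

# P(a) = P(a, b) + P(a, ¬b): b and ¬b are disjoint and cover every case.
p_a = joint[("a", "b")] + joint[("a", "¬b")]
print(p_a)  # 0.5
```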

13

Marginalization (Random Variables)

P(X = xᵢ) = Σⱼ P(X = xᵢ, Y = yⱼ)

  • Left side: “The probability of random variable X having the value xᵢ.”

  • Right side: the idea of marginalization, summing the joint probability of X = xᵢ over every possible value yⱼ of Y.

14

Conditioning

P(a) = P(a | b)P(b) + P(a | ¬b)P(¬b); probability rule with an idea similar to marginalization. The probability of event a occurring is equal to the probability of a given b times the probability of b, plus the probability of a given ¬b times the probability of ¬b.
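
The same pattern in code, with the joint terms rewritten as conditionals times marginals; all numbers are illustrative assumptions:

```python
# Assumed values: P(b) and the conditional probabilities of a.
p_b = 0.4
p_a_given_b = 0.75
p_a_given_not_b = 0.25

# P(a) = P(a | b)P(b) + P(a | ¬b)P(¬b), with P(¬b) = 1 - P(b) by negation.
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)
print(p_a)  # 0.45
```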


15

Bayesian Networks

a data structure that represents the dependencies among random variables

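A minimal sketch of the data structure, not the lecture's implementation: each node stores a distribution conditioned on its parents, and a joint probability is a product of entries along the structure. The two-node Rain/Train network and its numbers are assumptions for illustration.

```python
# Hypothetical network: Rain (no parents) -> Train (parent: Rain).
P_rain = {"rain": 0.3, "none": 0.7}
P_train = {  # P(Train | Rain); each row sums to 1
    "rain": {"on time": 0.6, "delayed": 0.4},
    "none": {"on time": 0.9, "delayed": 0.1},
}

def joint(rain, train):
    """P(rain, train) = P(rain) * P(train | rain)."""
    return P_rain[rain] * P_train[rain][train]

print(joint("none", "on time"))  # 0.7 * 0.9 = 0.63
```
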
16

Query X

Inference property; the variable for which we want to compute the probability distribution.

17

Evidence variables E

Inference property; one or more variables that have been observed for event e.

18

Hidden variables Y

Inference property; variables that aren’t the query and also haven’t been observed.

19

The goal

Inference property; calculate P(X | e).

20

Inference by enumeration

a process of finding the probability distribution of variable X given observed evidence e and some hidden variables Y.

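A sketch under the simplifying assumption that the full joint table is available directly; the variable names and probabilities are hypothetical. The query sums the joint over the hidden variable Y and then normalizes:

```python
# Hypothetical joint table P(X, E, Y); the eight entries sum to 1.
joint = {
    ("x1", "e1", "y1"): 0.15, ("x1", "e1", "y2"): 0.15,
    ("x1", "e2", "y1"): 0.10, ("x1", "e2", "y2"): 0.10,
    ("x2", "e1", "y1"): 0.05, ("x2", "e1", "y2"): 0.05,
    ("x2", "e2", "y1"): 0.20, ("x2", "e2", "y2"): 0.20,
}

def enumerate_query(e):
    """P(X | e) ∝ Σ_y P(X, e, y): sum out the hidden variable, then normalize."""
    unnormalized = {x: sum(joint[(x, e, y)] for y in ("y1", "y2"))
                    for x in ("x1", "x2")}
    total = sum(unnormalized.values())
    return {x: p / total for x, p in unnormalized.items()}

print(enumerate_query("e1"))  # {'x1': 0.75, 'x2': 0.25}, up to float rounding
```
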
21

Sampling

a technique of approximate inference where each variable is sampled for a value according to its probability distribution.
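
A rejection-sampling sketch over the hypothetical Rain/Train network from the Bayesian-network card above; samples that contradict the evidence are simply discarded:

```python
import random

P_rain = {"rain": 0.3, "none": 0.7}
P_train = {
    "rain": {"on time": 0.6, "delayed": 0.4},
    "none": {"on time": 0.9, "delayed": 0.1},
}

def draw(dist):
    """Sample one value from a {value: probability} distribution."""
    return random.choices(list(dist), weights=list(dist.values()))[0]

def sample():
    rain = draw(P_rain)          # sample parents first,
    train = draw(P_train[rain])  # then children given their parents
    return rain, train

# Estimate P(Rain = rain | Train = delayed): keep only matching samples.
kept = [rain for rain, train in (sample() for _ in range(100_000))
        if train == "delayed"]
print(kept.count("rain") / len(kept))  # ≈ 0.12 / 0.19 ≈ 0.63
```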

22

Likelihood Weighting

a sampling technique in which the values of the evidence variables are fixed rather than sampled, the non-evidence variables are sampled using the conditional probabilities in the Bayesian network, and each sample is weighted by its likelihood: the probability of all the evidence occurring.
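
A sketch of the same Rain/Train query with likelihood weighting instead of rejection: Train = delayed is fixed, only Rain is sampled, and each sample is weighted by the probability of the fixed evidence (all names and numbers remain illustrative assumptions):

```python
import random

P_rain = {"rain": 0.3, "none": 0.7}
P_train = {
    "rain": {"on time": 0.6, "delayed": 0.4},
    "none": {"on time": 0.9, "delayed": 0.1},
}

evidence = "delayed"                  # Train is fixed, never sampled
weights = {"rain": 0.0, "none": 0.0}
for _ in range(100_000):
    rain = random.choices(list(P_rain), weights=list(P_rain.values()))[0]
    weights[rain] += P_train[rain][evidence]  # weight = likelihood of evidence

print(weights["rain"] / sum(weights.values()))  # ≈ 0.63, as with rejection
```
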
23

The Markov assumption

an assumption that the current state depends on only a finite fixed number of previous states.

24

Markov Chain

a sequence of random variables where the distribution of each variable follows the Markov assumption. That is, each event in the chain occurs based on the probability of the event before it.
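
A minimal sketch: the transition model is a table of P(next state | current state), and a chain is sampled one step at a time; the sun/rain numbers are illustrative assumptions:

```python
import random

# Hypothetical transition model P(next state | current state).
transitions = {
    "sun":  {"sun": 0.8, "rain": 0.2},
    "rain": {"sun": 0.3, "rain": 0.7},
}

def sample_chain(start, length):
    """Each step depends only on the previous state (Markov assumption)."""
    chain = [start]
    while len(chain) < length:
        dist = transitions[chain[-1]]
        chain.append(random.choices(list(dist), weights=list(dist.values()))[0])
    return chain

print(sample_chain("sun", 10))  # e.g. ['sun', 'sun', 'sun', 'rain', ...]
```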

25

Hidden Markov Model

a type of Markov model for a system with hidden states that generate some observed event. This means that sometimes the AI has some measurement of the world but no access to the precise state of the world. In these cases, the state of the world is called the hidden state, and whatever data the AI has access to are the observations.

26

Sensor Markov Assumption

assumes only the hidden state affects an observation

27

Tasks that can be completed with hidden Markov Models

  • Filtering: given observations from start until now, calculate the probability distribution for the current state.

  • Prediction: given observations from start until now, calculate the probability distribution for a future state.

  • Smoothing: given observations from start until now, calculate the probability distribution for a past state.

  • Most likely explanation: given observations from start until now, calculate the most likely sequence of events.
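
As one concrete instance, filtering can be sketched with the forward algorithm: push the current belief through the transition model, then reweight it by the sensor model for each observation. The weather/umbrella models and numbers below are illustrative assumptions.

```python
# Hypothetical hidden states (weather) and observations (umbrella or not).
transition = {                      # P(next state | current state)
    "sun":  {"sun": 0.8, "rain": 0.2},
    "rain": {"sun": 0.3, "rain": 0.7},
}
sensor = {                          # P(observation | hidden state)
    "sun":  {"umbrella": 0.1, "no umbrella": 0.9},
    "rain": {"umbrella": 0.8, "no umbrella": 0.2},
}

def filter_belief(observations, prior):
    """Forward algorithm: P(current hidden state | observations so far)."""
    belief = dict(prior)
    for obs in observations:
        # Predict: propagate the belief one step through the transition model.
        predicted = {s: sum(belief[prev] * transition[prev][s] for prev in belief)
                     for s in transition}
        # Update: reweight by the sensor model, then normalize.
        unnormalized = {s: sensor[s][obs] * predicted[s] for s in predicted}
        total = sum(unnormalized.values())
        belief = {s: p / total for s, p in unnormalized.items()}
    return belief

print(filter_belief(["umbrella", "umbrella"], {"sun": 0.5, "rain": 0.5}))
```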