Artificial Intelligence

99 Terms

1

Decision Tree

can be used to visually and explicitly represent decisions and decision making

2

Decision Tree

Build a _______ for classifying

3

Decision Tree

It utilizes supervised learning and batch processing of training examples

4

Preference Bias

Define a metric for comparing functions f so as to determine whether one is better than another

5

upside down

A decision tree is drawn ______ with its root at the top

6

condition or internal node

Bold black text in a decision tree represents a ____

7

branches or edges

The tree splits into _____

8

decision or leaf

The end of a branch that doesn't split anymore is the _____

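Cards 5-8 name the parts of a decision tree: the root drawn at the top, condition (internal) nodes, branches (edges), and decision (leaf) nodes. A minimal sketch of that anatomy in Python; all names here are my own illustration, not from any library:

```python
# Minimal decision-tree anatomy sketch (illustrative, hand-built).
# An internal node holds a condition (a question about an attribute);
# each answer labels a branch (edge); a leaf holds the final decision.

class Node:
    def __init__(self, question=None, branches=None, decision=None):
        self.question = question        # condition at an internal node
        self.branches = branches or {}  # answer -> child Node (the edges)
        self.decision = decision        # set only at a leaf

    def is_leaf(self):
        return not self.branches

def classify(node, example):
    """Walk from the root (drawn at the top) down one branch per answer to a leaf."""
    while not node.is_leaf():
        node = node.branches[example[node.question]]
    return node.decision

# Tiny hand-built tree: should we play tennis?
tree = Node("outlook", {
    "sunny": Node("humidity", {"high": Node(decision="no"),
                               "normal": Node(decision="yes")}),
    "overcast": Node(decision="yes"),
    "rain": Node(decision="yes"),
})

print(classify(tree, {"outlook": "sunny", "humidity": "high"}))  # no
```

The walk makes the "visually and explicitly represent decisions" idea from card 1 concrete: every path from root to leaf is one readable decision rule.
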
9

Random

Select any attribute at random

10

Least-Values

Choose the attribute with the smallest number of possible values

11

Most-Values

Choose the attribute with the largest number of possible values

12

Max-Gain

Choose the attribute that has the largest expected information gain

13

Max-Gain

Try to select the attribute that will result in the smallest expected size of the subtrees rooted at its children

14

H

Measures the information content, or entropy, in bits

15

Low information content

Is desirable in order to make the smallest tree, because low information content means that most of the examples are classified the SAME; we would therefore expect the rest of the tree rooted at this node to be quite small, since little remains to differentiate between the two classifications.

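Cards 14-15 can be made concrete with the standard entropy formula, H = -Σ p·log₂(p), measured in bits. A small sketch (plain Python, no libraries assumed) showing that mostly-same labels give low information content while an even split gives the maximum:

```python
import math

def entropy(labels):
    """H = -sum(p * log2(p)): information content in bits (card 14)."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Mostly-same labels -> low entropy, so a small subtree is expected (card 15);
# an even two-class split -> the maximum of 1 bit.
print(entropy(["yes"] * 9 + ["no"]))  # ~0.469 bits
print(entropy(["yes", "no"]))         # 1.0 bit
```
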
16

Conditional entropy

is defined as a conditional probability of a class, Y, given a value, v, for an attribute (i.e., question), X.

17

question

Pr(Y|X=v): what is X?

18

label

Pr(Y|X=v): what is Y?

19

answer to the question

Pr(Y|X=v): what is v?

20

symmetric

Information gain is ________

21

mutual information

Another term for information gain

22

entropy

A measurement of uncertainty

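Cards 16-21 fit together as one computation: conditional entropy H(Y|X) weights H(Y|X=v) by Pr(X=v) over each answer v, and information gain is I(X;Y) = H(Y) - H(Y|X). A short sketch (illustrative variable names) that also checks the symmetry claimed in card 20:

```python
import math
from collections import Counter

def H(values):
    """Entropy in bits: a measurement of uncertainty (card 22)."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def info_gain(xs, ys):
    """I(X;Y) = H(Y) - H(Y|X), where H(Y|X) = sum_v Pr(X=v) * H(Y | X=v)."""
    n = len(xs)
    h_cond = 0.0
    for v in set(xs):
        ys_given_v = [y for x, y in zip(xs, ys) if x == v]  # labels where answer is v
        h_cond += (len(ys_given_v) / n) * H(ys_given_v)
    return H(ys) - h_cond

xs = ["a", "a", "b", "b"]   # attribute (question) answers, X
ys = ["+", "+", "+", "-"]   # class labels, Y
print(round(info_gain(xs, ys), 4))  # 0.3113 bits
# Information gain is symmetric (hence "mutual information"): I(X;Y) == I(Y;X)
print(abs(info_gain(xs, ys) - info_gain(ys, xs)) < 1e-12)  # True
```
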
23

Machine Learning

Is said to be a subset of artificial intelligence

24

data and past experiences

Machine learning is the development of algorithms which allow a computer to learn from ______ and _______ on its own

25

Arthur Samuel

Machine Learning was introduced by

26

1959

In what year was Machine Learning introduced?

27

patterns

Machine learning uses data to detect various ______ in a given dataset.

28

automatically

It can learn from past data and improve ____________.

29

data-driven

It is a _________ technology.

30

data mining

Machine learning is similar to _______, as it also deals with huge amounts of data.

31

increment

Need for Machine Learning: rapid _______ in the production of data

32

complex

Need for Machine Learning: solving ______ problems, which are difficult for a human

33

Decision

Need for Machine Learning: ______-making in various sectors, including finance

34

hidden

Need for Machine Learning: finding ______ patterns and extracting useful information from data

35

Supervised Learning

Classification of Machine Learning: Classification / Regression / Estimation

36

Unsupervised Learning

Classification of Machine Learning: Clustering / Prediction / Association

37

Reinforcement Learning

Classification of Machine Learning: Classification / Control / Decision-Making

38

Data Exploration

The step where we understand the nature of the data we have to work with. In this step, we find correlations, general trends, and outliers.

39

Data pre-processing

The step where preprocessing of the data for its analysis takes place

40

Train

_____ model: improve its performance for a better outcome of the problem

41

Test

_____ model: check the accuracy of our model by providing a test dataset to it.

42

Deployment

Deploy the model in the real-world system

43

Herbert Simon

"Learning is any process by which a system improves performance from experience." Who said this?

44

Tom Mitchell

"A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E." Who said this?

45

Learning

Is essential for unknown environments

46

system construction

Learning is useful as a _________ method

47

omniscience

When the designer lacks _______

48

reality

Expose the agent to ____ rather than trying to write it down

49

decision mechanisms

Learning modifies the agent's _______ to improve performance

50

Machine learning

How to acquire a model on the basis of data / experience

51

probabilities

Example of learning parameters (plural)

52

Bayesian network graph

Example of learning structure (singular)

53

clustering

Example of learning hidden concepts

54

Supervised Learning

Machine Learning Areas: data and corresponding labels are given

55

Unsupervised Learning

Machine Learning Areas: only data is given, no labels provided

56

Semi-Supervised Learning

Machine Learning Areas: some (if not all) labels are present

57

Reinforcement Learning

Machine Learning Areas: an agent interacting with the world makes observations, takes actions, and is rewarded or punished; it should learn to choose actions in such a way as to obtain a lot of reward

58

past experiences of data fed in

A machine is said to be learning from ____________ with respect to some class of tasks if its performance in a given task improves with experience

59

previous knowledge or past experiences

The machine works at a basic conceptual level by looking at the ________

60

Data

Labeled instances <xi, y>, e.g. emails marked spam / not spam

61

Features

Attribute-value pairs which characterize each x

62

Experimentation Cycle

- Learn parameters (e.g. model probabilities) on the training set
- (Tune hyper-parameters on a held-out set)
- Compute accuracy on the test set
- Very important: never "peek" at the test set

63

accuracy

Fraction of instances predicted correctly

64

overfitting

Fitting the training data very closely, but not generalizing well

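The experimentation cycle of cards 62-63 can be sketched end to end with a deliberately trivial "model" (predict the training set's majority class). The spam labels, split sizes, and the majority-class rule are all illustrative assumptions, not from the source:

```python
# Experimentation-cycle sketch: train on the training set, keep a held-out
# set for tuning, and touch the test set only once at the very end.

def train(train_labels):
    """A toy 'model': memorize the majority class (an illustrative stand-in)."""
    return max(set(train_labels), key=train_labels.count)

def accuracy(predictions, truth):
    """Fraction of instances predicted correctly (card 63)."""
    return sum(p == t for p, t in zip(predictions, truth)) / len(truth)

labels = ["spam", "spam", "ham", "spam", "ham", "spam", "spam", "ham"]
train_set, held_out, test_set = labels[:4], labels[4:6], labels[6:]

model = train(train_set)        # learn parameters on the training set only
# (tune any hyper-parameters against held_out here -- never against test_set)
test_acc = accuracy([model] * len(test_set), test_set)  # "peek" exactly once
print(model, test_acc)  # spam 0.5
```

Measuring accuracy only on data the model never saw is exactly what exposes the overfitting of card 64: training accuracy can be high while test accuracy stays poor.
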
65

Classification

Learning a discrete function: ________

66

Regression

Learning a continuous function: _________

67

discrete

Learning a ______ function: Classification (a supervised learning task where the output has defined labels)

68

continuous

Learning a ______ function: Regression (a supervised learning task where the output has a continuous value)

69

Data Cleaning

Issues in Data Preparation: preprocess data in order to reduce noise and handle missing values

70

Relevance Analysis

Issues in Data Preparation: remove irrelevant or redundant attributes

71

Data Transformation

Issues in Data Preparation:
- Generalize data (to higher-level concepts, discretization)
- Normalize attribute values

72

Model construction

Describing a set of predetermined classes

73

class label

Each tuple/sample is assumed to belong to a predefined class, as determined by the ___________

74

training set

The set of tuples used for model construction is the _______

75

Model Usage

For classifying future or unknown objects

76

independent

The test set is _______ of the training set; otherwise over-fitting will occur

77

classify

If the accuracy is acceptable, use the model to ______ data tuples whose class labels are not known

78

Inductive Learning Task

Use particular facts to make more generalized conclusions

79

predictive

A _____ model based on a branching series of Boolean tests

80

one-stage

These smaller Boolean tests are less complex than a _____ classifier

81

measure

We first make a list of attributes that we can _______

82

discrete

These attributes of the decision tree (for now) must be ______

83

target attribute

We then choose a _______ that we want to predict

84

experience table

Then create an ____________ that lists what we have seen in the past

85

Ross Quinlan

Who developed the ID3 algorithm in 1975?

86

entropy

ID3 splits attributes based on their ______

87

entropy

________ is the measure of disorder (uncertainty) in the data

88

minimized

Entropy is ______ when all values of the target attribute are the same

89

maximized

Entropy is _______ when there is an equal chance of all values for the target attribute (i.e. the result is random)

90

lowest

ID3 splits on the attribute with the ______ expected entropy

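Cards 86-90 describe ID3's split rule: at each node, pick the attribute whose split leaves the lowest expected (weighted) entropy over the target, which is the same attribute that maximizes information gain. A minimal sketch of that selection step; the example rows and attribute names are illustrative:

```python
import math
from collections import Counter

def H(values):
    """Entropy in bits; 0 when all target values are the same, maximal when random."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def expected_entropy(attr, examples, target):
    """Weighted entropy of the target after splitting on attr: H(target | attr)."""
    n = len(examples)
    total = 0.0
    for v in {e[attr] for e in examples}:
        subset = [e[target] for e in examples if e[attr] == v]
        total += (len(subset) / n) * H(subset)
    return total

def id3_pick(attrs, examples, target):
    """ID3 splits on the attribute with the lowest expected entropy (= max gain)."""
    return min(attrs, key=lambda a: expected_entropy(a, examples, target))

examples = [
    {"outlook": "sunny",    "windy": "no",  "play": "no"},
    {"outlook": "sunny",    "windy": "yes", "play": "no"},
    {"outlook": "overcast", "windy": "no",  "play": "yes"},
    {"outlook": "rain",     "windy": "yes", "play": "yes"},
]
print(id3_pick(["outlook", "windy"], examples, "play"))  # outlook
```

Here "outlook" separates the classes perfectly (expected entropy 0.0), while "windy" leaves both of its subsets at the maximal 1 bit, so ID3 chooses "outlook".
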
91

pruning

Another technique for reducing the number of attributes used in a tree is _______

92

prepruning

We decide during the building process when to stop adding attributes

93

postpruning

Waits until the full decision tree has been built and then prunes the attributes

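Card 92's prepruning amounts to a stopping test evaluated while the tree is being built. A minimal sketch of such a test; the particular criteria and thresholds (depth limit, minimum examples, purity) are common illustrative choices, not a canonical ID3 rule:

```python
# Prepruning sketch: decide *during* building whether to stop adding
# attributes at a node. Thresholds here are illustrative assumptions.

def should_stop(labels, depth, max_depth=3, min_examples=2):
    if depth >= max_depth:           # tree is deep enough already
        return True
    if len(labels) < min_examples:   # too few examples to justify a split
        return True
    if len(set(labels)) == 1:        # node is already pure
        return True
    return False

print(should_stop(["yes", "yes"], depth=1))         # True (pure node)
print(should_stop(["yes", "no", "yes"], depth=1))   # False (keep splitting)
```

Postpruning (card 93) instead builds the full tree first and then removes subtrees whose absence does not hurt accuracy on held-out data.
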
94

expected entropy

ID3 is not optimal because it uses ________ reduction, not actual reduction

95

errors propagating

Decision trees suffer from a problem of ________ throughout the tree

96

discretization

We can use a technique known as _______, where we choose cut points for splitting continuous attributes

97

boundary point

Where two adjacent instances in a sorted list have different target attribute values

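Cards 96-97 together describe a simple discretization scheme: sort the instances by the continuous attribute and place a cut point midway wherever two adjacent instances have different target values (a boundary point). A sketch with illustrative data:

```python
# Boundary-point discretization sketch: candidate cut points for a
# continuous attribute lie only between adjacent instances whose
# target labels differ.

def boundary_cut_points(pairs):
    """pairs: list of (continuous_value, target_label) tuples."""
    ordered = sorted(pairs)
    cuts = []
    for (v1, y1), (v2, y2) in zip(ordered, ordered[1:]):
        if y1 != y2:                    # a boundary point between classes
            cuts.append((v1 + v2) / 2)  # midpoint becomes a candidate cut
    return cuts

temps = [(64, "yes"), (65, "no"), (68, "yes"), (70, "yes"), (71, "no")]
print(boundary_cut_points(temps))  # [64.5, 66.5, 70.5]
```

Each candidate cut turns the continuous attribute into a discrete Boolean test (e.g. temperature < 66.5), which the tree can then score by information gain like any other attribute.
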
98

Lionhead Studios

Black & White, a game that used ID3, was developed by _____

99

Black & White

Used ID3 to predict a player's reaction to a certain creature's action

