A.I. - Support Vector Machines (SVM)

16 Terms

1

Support vector machines (SVMs) offer another example of ____.

competitive learning

2

In the support vector approach, statistical measures are used to __(1)__ (the __(2)__) that maximally separate the positive and negative instances of a learned concept.

1) determine a minimum set of data points

2) support vectors
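
As a concrete illustration (a minimal sketch using scikit-learn and a synthetic two-cluster data set, neither of which is mentioned on these cards), fitting a linear SVM retains only a small subset of the training points as support vectors; the later sketches reuse this X, y, and clf.

    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    # Two well-separated clusters of labeled points (positive / negative instances).
    X, y = make_blobs(n_samples=100, centers=2, cluster_std=1.0, random_state=0)

    clf = SVC(kernel="linear", C=1e3)   # a large C approximates a hard margin
    clf.fit(X, y)

    # Only the points that maximally constrain the separation are retained.
    print(len(X), "training points,", len(clf.support_vectors_), "support vectors")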

3

These support vectors, representing selected data points from both the positive and negative instances of the concept, implicitly define a ____ separating these two data sets.

hyperplane
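
In standard SVM notation (not given on the cards), that implicit hyperplane is the zero set of a linear function whose weight vector w and offset b are fixed by the support vectors:

    f(x) = w · x + b,    hyperplane = { x : f(x) = 0 },    predicted class = sign(f(x))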

4

Once the support vectors are learned, __(1)__; the support vectors alone are __(2)__.

1) other data points need no longer be retained

2) sufficient to determine the separating hyperplane
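
One way to see this (a sketch that continues the scikit-learn toy example above and assumes the data are cleanly separable): refit using only the support vectors and compare the two hyperplanes.

    # Discard everything except the support vectors and their labels.
    sv_X = clf.support_vectors_
    sv_y = y[clf.support_]        # support_ holds the training indices of the support vectors

    clf_sv = SVC(kernel="linear", C=1e3).fit(sv_X, sv_y)

    # For a separable set the hyperplane (w, b) is essentially unchanged.
    print(clf.coef_, clf.intercept_)
    print(clf_sv.coef_, clf_sv.intercept_)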

5

The support vector machine is a __(1)__ where the learning of the support vectors is __(2)__.

1) linear classifier

2) supervised
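
"Linear" means the learned rule is just a sign test on a weighted sum of the features; "supervised" means the class labels y were supplied during fitting. A sketch of the rule, reusing the fitted clf from above (scikit-learn exposes w and b as coef_ and intercept_ for a linear kernel):

    import numpy as np

    w = clf.coef_[0]         # weight vector of the linear classifier
    b = clf.intercept_[0]    # bias term

    def classify(x):
        # Which side of the hyperplane w.x + b = 0 does x fall on?
        return clf.classes_[1] if np.dot(w, x) + b > 0 else clf.classes_[0]

    print(classify(X[0]), "vs label", y[0])   # agrees on the separable training data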

6

The data for SVM learning is assumed to be produced __(1)__ and __(2)__ from a __(3)__, although unknown, distribution of data.

1) independently

2) identically

3) fixed

7

The hyperplane, implicitly defined by the support vectors themselves, divides the ____.

positive from the negative data instances

8

Data points nearest the hyperplane are in the ____.

decision margin
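
In the usual hard-margin formulation the margin boundaries are the planes w · x + b = ±1, so the margin width is 2/||w|| and the support vectors are exactly the points lying on those boundaries. A sketch, continuing the toy example:

    import numpy as np

    w = clf.coef_[0]
    b = clf.intercept_[0]

    print("margin width:", 2.0 / np.linalg.norm(w))

    # Each support vector sits on a margin boundary, i.e. w.x + b is roughly +1 or -1.
    for sv in clf.support_vectors_:
        print(round(float(np.dot(w, sv) + b), 3))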

9

Any addition or removal of a support vector changes the ____.

hyperplane boundary
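
This is easy to check empirically (continuing the toy example): drop one support vector from the training set, refit, and the hyperplane generally shifts, whereas dropping an interior point leaves it untouched.

    import numpy as np

    keep = np.arange(len(X)) != clf.support_[0]   # drop one support vector
    clf_drop = SVC(kernel="linear", C=1e3).fit(X[keep], y[keep])

    print(clf.coef_, clf.intercept_)             # original hyperplane
    print(clf_drop.coef_, clf_drop.intercept_)   # shifted hyperplane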

10

After training is complete, it is possible to __(1)__ and __(2)__ from the support vectors alone.

1) reconstruct the hyperplane

2) classify new data sets
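
In scikit-learn terms (an assumed implementation detail, not part of the cards), everything needed to rebuild the decision function travels with the support vectors: the dual coefficients, the support vectors themselves, and the intercept. A sketch, continuing the toy example:

    import numpy as np

    alpha_y = clf.dual_coef_[0]    # alpha_i * y_i for each support vector (binary case)
    svs = clf.support_vectors_
    b = clf.intercept_[0]

    def decision(x):
        # Linear kernel: f(x) = sum_i alpha_i * y_i * <sv_i, x> + b
        return float(np.dot(alpha_y, svs @ x) + b)

    x_new = np.array([0.0, 0.0])
    print(decision(x_new), clf.decision_function([x_new])[0])   # the two values match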

11

The SVM algorithm classifies data elements by ____.

computing, as an optimization problem, the distance of a data point from the separating hyperplane
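
For the linear case that distance is (w · x + b)/||w||, and its sign decides the class; continuing the toy example:

    import numpy as np

    x_new = np.array([[0.0, 0.0]])
    signed_dist = clf.decision_function(x_new)[0] / np.linalg.norm(clf.coef_[0])
    print(signed_dist, clf.predict(x_new)[0])   # positive distance -> positive class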

12

For this probably approximately correct generalization task, ___ or ____ are employed.

Bayesian or other data compression techniques

13

SVMs compute __(1)__ to determine data element classification. These decision rules created by the SVM represent __(2)__.

1) distances

2) statistical regularities in the data

14

The SVM, alternatively, attempts to __(1)__ and is more robust in its ability to handle poor separation caused by __(2)__.

1) maximize the decision margin

2) overlapping data points
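
The robustness to overlap comes from the soft-margin formulation: a penalty parameter (C in scikit-learn, an assumed interface) trades margin width against points that violate the margin. A standalone sketch with deliberately overlapping clusters:

    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    # Overlapping clusters: no hyperplane separates them perfectly.
    X2, y2 = make_blobs(n_samples=200, centers=2, cluster_std=3.0, random_state=1)

    soft = SVC(kernel="linear", C=0.1).fit(X2, y2)    # small C: wide, tolerant margin
    hard = SVC(kernel="linear", C=100.0).fit(X2, y2)  # large C: narrow margin, few violations tolerated

    print(len(soft.support_vectors_), "vs", len(hard.support_vectors_), "support vectors")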

15

SVMs may be generalized from two-category classification problems to the discrimination of multiple classes by __(1)__ on __(2)__ of interest against __(3)__.

1) repeatedly running the SVM

2) each category

3) all the other categories
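
scikit-learn packages exactly this one-versus-rest scheme (an assumed library choice) as a wrapper around a binary SVM; a standalone sketch with three synthetic classes:

    from sklearn.datasets import make_blobs
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.svm import SVC

    # Three classes: one binary SVM is trained for each class against all the others.
    X3, y3 = make_blobs(n_samples=150, centers=3, random_state=2)

    ovr = OneVsRestClassifier(SVC(kernel="linear")).fit(X3, y3)
    print(len(ovr.estimators_), "binary SVMs trained")   # one per class
    print(ovr.predict(X3[:5]))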

16

SVMs are best suited to problems with __(1)__ rather than __(2)__; as a result their applicability for many classic __(3)__ problems with qualitative boundaries is limited.

1) numerical data

2) categorical

3) categorization