11 - Ensemble Models

Last updated 12:18 PM on 4/2/26

41 Terms

1
New cards

Ensemble Models

Using multiple models that ‘work’ together

2
New cards
3
New cards

Bootstrap Aggregation

Also known as bagging

4
New cards

Bootstrap Aggregation

Aims to address the problem of overfitting on the training data

5
New cards

Bootstrap Aggregation

Often applied to decision trees (notorious for overfitting)

6
New cards

Bootstrap Sampling

Step 1

7
New cards

Parallel Training

Step 2

8
New cards

Aggregation

Step 3

9
New cards

Bootstrap Sampling

Generate multiple random subsets of the training data

10
New cards

Parallel Training

Train an independent base model on each bootstrap sample

11
New cards

Aggregation

Combine predictions of the individual models through averaging (regression) or majority voting (classification)
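
The three bagging steps on the cards above can be sketched in plain Python. The 1-nearest-neighbour base learner and the toy dataset here are illustrative assumptions, not part of the card set:

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    """Step 1: draw a random sample, with replacement, the same size as data."""
    return [rng.choice(data) for _ in data]

def majority_vote(predictions):
    """Step 3: combine classifier outputs by majority voting."""
    return Counter(predictions).most_common(1)[0][0]

def bag_predict(data, train, predict_one, x, n_models=25, seed=0):
    """Step 2: train one independent base model per bootstrap sample, then aggregate."""
    rng = random.Random(seed)
    models = [train(bootstrap_sample(data, rng)) for _ in range(n_models)]
    return majority_vote([predict_one(m, x) for m in models])

# Toy base learner: 1-nearest neighbour over (feature, label) pairs.
def train_1nn(sample):
    return sample

def predict_1nn(model, x):
    return min(model, key=lambda p: abs(p[0] - x))[1]

data = [(0.1, "a"), (0.2, "a"), (0.9, "b"), (1.1, "b")]
print(bag_predict(data, train_1nn, predict_1nn, x=1.0))  # → b
```

For regression, the aggregation step would average the models' numeric outputs instead of voting.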

12
New cards

Random Forest

Bagging model for decision trees

13
New cards

Overfit

Random Forest: Individual decision trees tend to _____

14
New cards

Regularization effect

Random Forest: Averaging multiple overfit trees has a _____

15
New cards

Boosting

Ensemble learning technique where many ‘weak’ models are trained sequentially

16
New cards

Boosting

Start with a simple, weak model, then make another model to learn from the previous model’s mistakes, and so on

17
New cards

AdaBoost

Boosting algorithm for decision trees

18
New cards

AdaBoost

Uses many ‘weak’ decision trees trained sequentially to learn from the previous model’s mistakes

19
New cards

Decision Stump

A decision tree with only one question (i.e., depth = 1)

20
New cards

Decision Stump

Because it can only ask one question, it is not capable of learning complex patterns

21
New cards

AdaBoost

Trains several decision stumps in sequence, where each stump focuses on the weakness of the previous stump

22
New cards

AdaBoost

Originally designed for classification, but there are variants for regression

23
New cards

Consensus

AdaBoost: Final prediction is based on the _____ of the different stumps

24
New cards

Influence

Some stumps have a bigger _____ on the prediction than others

25
New cards

Equal

AdaBoost General Algorithm:

  1. Assign _____ weights to all training instances

26
New cards

Best

AdaBoost General Algorithm:

  2. Choose the _____ stump according to the weights

27
New cards

Amount of say

AdaBoost General Algorithm:

  3. Compute the _____ of the stump

28
New cards

Weights

AdaBoost General Algorithm:

  4. Adjust the _____

29
New cards

Step 2

AdaBoost General Algorithm:

  5. Repeat _____ for the next stump

30
New cards

Weights

Represents the “attention” that should be given to each instance

31
New cards

Misclassified

Weights: Instances that are _____ should be given more attention

32
New cards

Amount of say

Represents the “voting power” of each stump, based on how well it distinguishes between the classes
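
The AdaBoost cards above (equal starting weights, best stump by weighted error, amount of say, weight updates, and the final consensus) can be sketched on a 1-D toy dataset. The threshold-picking helper and the data are assumptions for illustration:

```python
import math

def stump_candidates(xs):
    # Candidate thresholds midway between sorted feature values (illustrative helper).
    s = sorted(set(xs))
    return [(a + b) / 2 for a, b in zip(s, s[1:])]

def stump_predict(threshold, sign, x):
    return sign if x > threshold else -sign

def adaboost(xs, ys, rounds=5):
    n = len(xs)
    w = [1 / n] * n                                  # step 1: equal weights
    stumps = []
    for _ in range(rounds):
        best = None
        for t in stump_candidates(xs):               # step 2: best stump
            for sign in (+1, -1):                    # by weighted error
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if stump_predict(t, sign, x) != y)
                if best is None or err < best[0]:
                    best = (err, t, sign)
        err, t, sign = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        say = 0.5 * math.log((1 - err) / err)        # step 3: amount of say
        # Step 4: raise the weights of misclassified instances, lower the rest.
        w = [wi * math.exp(-say * y * stump_predict(t, sign, x))
             for wi, x, y in zip(w, xs, ys)]
        total = sum(w)
        w = [wi / total for wi in w]
        stumps.append((say, t, sign))                # step 5: repeat from step 2

    def predict(x):
        # Consensus: each stump votes with its amount of say.
        score = sum(say * stump_predict(t, sign, x) for say, t, sign in stumps)
        return 1 if score >= 0 else -1
    return predict

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [-1, -1, 1, -1, 1, 1]          # not separable by any single stump
clf = adaboost(xs, ys)
print([clf(x) for x in xs])         # → [-1, -1, 1, -1, 1, 1]
```

No single stump can fit this labelling, but the weighted consensus of five stumps classifies every training point correctly.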

33
New cards

Gradient Boosting

Boosting algorithm for regression where a sequence of shallow trees is trained on the combined residuals (errors) of all previous trees

34
New cards

Total prediction

Gradient Boosting: The final prediction is the _____ of all the trees
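Gradient boosting as described on these two cards can be sketched with depth-1 regression trees fit to residuals. The dataset and the learning rate (a shrinkage factor most implementations add) are illustrative assumptions:

```python
def fit_stump(xs, residuals):
    """Fit a depth-1 regression tree: one threshold, one mean per side."""
    best = None
    for t in sorted(set(xs))[:-1]:
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def gradient_boost(xs, ys, rounds=10, lr=0.5):
    base = sum(ys) / len(ys)          # start from the mean prediction
    pred = [base] * len(xs)
    trees = []
    for _ in range(rounds):
        # Each tree is trained on the combined residuals of all previous trees.
        residuals = [y - p for y, p in zip(ys, pred)]
        tree = fit_stump(xs, residuals)
        trees.append(tree)
        pred = [p + lr * tree(x) for p, x in zip(pred, xs)]
    # Final prediction is the total of all the trees (plus the base value).
    return lambda x: base + lr * sum(tree(x) for tree in trees)

xs = [1, 2, 3, 4]
ys = [1.0, 1.0, 3.0, 3.0]
model = gradient_boost(xs, ys)
print(round(model(1), 3), round(model(4), 3))  # → 1.001 2.999
```

Each round shrinks the remaining residuals, so the summed trees converge toward the training targets.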

35
New cards

Stacked Models

Ensemble learning technique where you train multiple diverse models (called base learners), then feed their predictions as inputs to a meta-learner that learns how to best combine them
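A minimal stacking sketch, assuming two toy base learners (a global-mean model and a 1-nearest-neighbour model) and a least-squares meta-learner; none of these choices come from the cards:

```python
def fit_stack(xs, ys):
    # Base learners (toy choices for illustration).
    mean_y = sum(ys) / len(ys)
    base1 = lambda x: mean_y
    def base2(x):
        return min(zip(xs, ys), key=lambda p: abs(p[0] - x))[1]

    # Meta-learner: least-squares weights over the base learners' predictions.
    # Caveat: fitting on in-sample predictions leaks; real stacking uses
    # held-out (out-of-fold) base predictions to train the meta-learner.
    p1 = [base1(x) for x in xs]
    p2 = [base2(x) for x in xs]
    a11 = sum(u * u for u in p1)
    a12 = sum(u * v for u, v in zip(p1, p2))
    a22 = sum(v * v for v in p2)
    b1 = sum(u * y for u, y in zip(p1, ys))
    b2 = sum(v * y for v, y in zip(p2, ys))
    det = a11 * a22 - a12 * a12
    w1 = (b1 * a22 - a12 * b2) / det   # solve the 2x2 normal equations
    w2 = (a11 * b2 - a12 * b1) / det
    return lambda x: w1 * base1(x) + w2 * base2(x)

xs = [0, 1, 2, 3]
ys = [0.0, 1.0, 2.0, 3.0]
model = fit_stack(xs, ys)
print(model(2.4))  # → 2.0
```

Here 1-NN memorizes the training data perfectly, so the meta-learner gives it all the weight (w1 = 0, w2 = 1), which is exactly the leakage the out-of-fold caveat above is about.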

36
New cards

Stacking

Ensemble Learning: Creating multiple models

37
New cards

Stacking

Ensemble Learning: Create another model that accepts their predictions and improves on them

38
New cards

Bagging

Ensemble Learning: Use high variance models

39
New cards

Bagging

Ensemble Learning: Lower the variance by getting the average

40
New cards

Boosting

Ensemble Learning: Use high bias models

41
New cards

Boosting

Ensemble Learning: Lower the bias by stringing them together and learning from past mistakes
