IS 170 Decision Tree & Random Forest

Last updated 12:18 AM on 4/12/26

19 Terms

1
New cards

Decision Tree

A flowchart-like model that repeatedly splits data into smaller groups until a stopping condition is met (e.g., the leaves are pure or a depth limit is reached)
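A minimal sketch of this idea, assuming scikit-learn is installed; the pass/fail toy data below is invented for illustration:

```python
# Fit a small decision tree on invented toy data (assumes scikit-learn).
from sklearn.tree import DecisionTreeClassifier

# Toy dataset: [hours_studied, hours_slept] -> pass (1) / fail (0)
X = [[1, 4], [2, 5], [8, 7], [9, 6], [1, 8], [10, 5]]
y = [0, 0, 1, 1, 0, 1]

tree = DecisionTreeClassifier(random_state=0)
tree.fit(X, y)

# A fitted tree routes a new point through its splits down to a leaf.
print(tree.predict([[9, 7]]))
```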

2
New cards

Root Node

first split (most important feature)

3
New cards

Decision Nodes

intermediate splits

4
New cards

Branches

outcomes of decisions

5
New cards

Leaf Nodes

final prediction

6
New cards

Gini Impurity

Measures how “mixed” the classes in a node are: Gini = 1 − Σ pᵢ²

7
New cards

Gini Value: 0

pure (all same class)

8
New cards

Gini Value: high (~0.5)

mixed

9
New cards

Lower Gini

= better split
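The Gini values on the cards above (0 for pure, ~0.5 for evenly mixed) can be checked by hand from the formula Gini = 1 − Σ pᵢ²; a small sketch:

```python
# Gini impurity of a node: 1 minus the sum of squared class proportions.
def gini(labels):
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

print(gini(["yes", "yes", "yes"]))       # pure node -> 0.0
print(gini(["yes", "no", "yes", "no"]))  # evenly mixed, 2 classes -> 0.5
```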

10
New cards

Root node

the feature whose split gives the lowest Gini

11
New cards

Key idea of decision tree:

At each node, the decision tree chooses the split with the lowest impurity

12
New cards

Concept of overfitting for decision tree:

  • Gini decreases (the tree looks better on the training data)

  • but overfitting increases (the tree becomes very specific to the training dataset)

13
New cards

Concept behind Gini = 0

  • not always good

  • a perfectly pure tree can perform poorly on new data

14
New cards

max_depth

Limits how deep tree grows

15
New cards

min_samples_leaf

Minimum data points in a leaf
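A sketch of how these two settings (scikit-learn parameter names) rein in tree growth; the noisy one-feature dataset below is invented to force an unconstrained tree to grow deep:

```python
# max_depth and min_samples_leaf limit how far a tree can grow
# (assumes scikit-learn; the alternating labels are deliberately noisy).
from sklearn.tree import DecisionTreeClassifier

X = [[i] for i in range(20)]
y = [0, 1] * 10  # alternating labels: an unconstrained tree memorizes them

unconstrained = DecisionTreeClassifier(random_state=0).fit(X, y)
limited = DecisionTreeClassifier(
    max_depth=2,         # at most 2 levels of splits
    min_samples_leaf=5,  # every leaf keeps at least 5 training points
    random_state=0,
).fit(X, y)

# The limited tree stops early instead of memorizing the noise.
print(unconstrained.get_depth(), limited.get_depth())
```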

16
New cards

Applications of Decision Tree

  • Fraud detection

  • Customer churn

  • Risk analysis

  • Employee retention

17
New cards

Random Forest

Combines multiple decision trees to improve performance

18
New cards

Decision Trees use: _____ ; Random Forest uses: _____

decision trees use all features; Random Forest uses a random subset of features at each split
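A sketch of that contrast in scikit-learn, where `max_features` is the knob that gives each split a random subset of features; the two-feature toy data is invented:

```python
# Single tree vs. random forest (assumes scikit-learn).
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

# Toy separable data with two features.
X = [[i, 20 - i] for i in range(20)]
y = [0] * 10 + [1] * 10

# A single decision tree considers all features at every split.
tree = DecisionTreeClassifier(random_state=0).fit(X, y)

# Each tree in the forest sees only a random subset of features per split,
# and the forest combines the votes of many such trees.
forest = RandomForestClassifier(
    n_estimators=100,     # combine 100 trees
    max_features="sqrt",  # random subset of features at each split
    random_state=0,
).fit(X, y)

print(tree.predict([[19, 1]]), forest.predict([[19, 1]]))
```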
