These flashcards cover key concepts related to Naïve Bayes, decision trees, and ensemble classifiers in machine learning.
Joint Probability
The probability of two or more events happening at the same time.
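For two independent events, the joint probability is simply the product of the individual probabilities. A minimal sketch with a hypothetical coin-flip example:

```python
# Joint probability of two independent events: P(A and B) = P(A) * P(B).
# Hypothetical example: two fair coin flips both landing heads.
p_heads = 0.5
p_both_heads = p_heads * p_heads
print(p_both_heads)  # 0.25
```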
Bayes Theorem
A formula that describes the probability of an event based on prior knowledge of conditions that might be related to the event.
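A worked sketch of the formula P(A|B) = P(B|A) P(A) / P(B), using a hypothetical diagnostic-test example (all numbers invented for illustration):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
# Hypothetical test: 99% sensitivity, 5% false-positive rate, 1% prevalence.
p_disease = 0.01            # prior P(A)
p_pos_given_disease = 0.99  # likelihood P(B|A)
p_pos_given_healthy = 0.05  # false-positive rate

# Total probability of a positive test, P(B)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior P(A|B): probability of disease given a positive test
posterior = p_pos_given_disease * p_disease / p_pos
print(round(posterior, 3))  # 0.167
```

Note how a positive result still leaves a fairly low posterior, because the prior probability of disease is small.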
Naïve Bayes Assumption
All features are conditionally independent of each other given the class label.
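Under this assumption, the classifier multiplies the prior by a per-feature class-conditional probability for each feature. A from-scratch sketch on a tiny invented categorical dataset (no smoothing, names hypothetical, for illustration only):

```python
from collections import Counter, defaultdict

def train_nb(X, y):
    """Estimate priors and per-feature class-conditional probabilities
    from categorical data (no smoothing; illustration only)."""
    priors = {c: n / len(y) for c, n in Counter(y).items()}
    cond = defaultdict(Counter)  # (feature index, class) -> value counts
    for row, label in zip(X, y):
        for i, v in enumerate(row):
            cond[(i, label)][v] += 1
    return priors, cond

def predict_nb(priors, cond, row):
    """Pick the class maximizing P(class) * prod_i P(feature_i | class)."""
    best, best_score = None, -1.0
    for c, prior in priors.items():
        score = prior
        for i, v in enumerate(row):
            counts = cond[(i, c)]
            score *= counts[v] / sum(counts.values())
        if score > best_score:
            best, best_score = c, score
    return best

# Toy weather data: (outlook, windy) -> play
X = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"), ("rain", "yes")]
y = ["yes", "no", "yes", "no"]
priors, cond = train_nb(X, y)
print(predict_nb(priors, cond, ("sunny", "no")))  # yes
```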
Entropy
A measure of the impurity or randomness in a dataset.
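For a set of class labels, Shannon entropy is -sum(p_i * log2(p_i)) over the class proportions p_i. A minimal sketch:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

# A 50/50 class split is maximally impure; a pure set has entropy 0.
print(entropy(["yes", "yes", "no", "no"]))  # 1.0
```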
Information Gain
The reduction in entropy achieved by splitting a dataset on an attribute, calculated as the entropy before the split minus the weighted average entropy of the subsets after it; decision-tree algorithms select the attribute with the highest information gain, which tends to produce smaller trees.
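A self-contained sketch of this calculation on a hypothetical split that separates the classes perfectly:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(parent, children):
    """Entropy of the parent minus the size-weighted entropy of the splits."""
    n = len(parent)
    weighted = sum(len(c) / n * entropy(c) for c in children)
    return entropy(parent) - weighted

# Hypothetical split: 4 labels divided into two pure subsets.
parent = ["yes", "yes", "no", "no"]
children = [["yes", "yes"], ["no", "no"]]
print(information_gain(parent, children))  # 1.0
```

A split that produces pure subsets recovers all of the parent's entropy as gain; a split that leaves the class mix unchanged gains nothing.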
Random Forest
An ensemble learning method that builds multiple decision trees on bootstrap samples of the data, using a random subset of features at each split, and predicts the class by majority vote.
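A minimal usage sketch, assuming scikit-learn is available (the dataset here is synthetic and purely illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic two-class dataset for illustration.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

# 100 trees, each fit on a bootstrap sample; prediction is a majority vote.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))  # accuracy on the training data
```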
Class Conditional Probability
The probability of a feature given a class.
Prior Probability
The probability of a class occurring independently of the features.
Posterior Probability
The probability of a class given the features.
Ensemble Learning
A technique that combines multiple models to produce a better performing model.
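The simplest way to combine classifiers is a majority vote over their predictions. A minimal sketch with hypothetical model outputs:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine the predictions of several models by majority vote."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical outputs of three models for one example:
print(majority_vote(["spam", "ham", "spam"]))  # spam
```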