Reviewer for Data Mining (MIDTERM)

67 Terms

1

Regression

A supervised learning technique used to predict continuous numerical values based on one or more input variables.

2

Dependent Variable (Response)

The value you want to predict in a regression model.

3

Independent Variable (Predictor/Features)

The variable(s) used to make predictions in regression.

4

Goal of Regression

To find the best-fitting relationship between the predictors and the response, minimizing the prediction error.

5

Regression Applications

Finance, healthcare, marketing, manufacturing, retail.

6

Continuous Target Variable

Regression predicts a continuous value like sales price or height.

7

Mean Squared Error (MSE)

The average of the squares of the errors, a common regression metric.

8

Root Mean Squared Error (RMSE)

The square root of MSE, measures average prediction error in same units as target.

9

Overfitting

When a model is too complex and learns noise from training data (poor generalization).

10

Underfitting

When a model is too simple and misses key data patterns.

11

Interpretability

Regression coefficients show how much each predictor affects the target.

12

Predictor Variable (Feature)

Input used for prediction in regression.

13

Response Variable

Output to be predicted.

14

Coefficient

Represents the change in the response variable for a one-unit increase in the corresponding predictor, holding the other predictors constant.

15

Residuals

The differences between observed and predicted values.

16

Multicollinearity

Situation where predictors are highly correlated, which may affect coefficient stability.

17

Outliers

Data points that deviate substantially and may distort the regression model.

18

Simple Regression

Regression with one predictor and one response variable.

19

Multiple Regression

Regression using two or more predictors for a single response variable.

20

Nonlinear Regression

Regression capturing nonlinear relationships, e.g. plant growth over time.

21

Simple Linear Regression Formula

y = β0 + β1 x
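The coefficients β0 and β1 in the formula above can be estimated from data with ordinary least squares; a minimal pure-Python sketch (the function and variable names are illustrative, not part of the card set):

```python
# Closed-form ordinary least squares for y = b0 + b1*x:
# b1 = cov(x, y) / var(x), b0 = mean(y) - b1 * mean(x)
def fit_simple_linear(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    b1 = cov / var        # slope (beta1)
    b0 = my - b1 * mx     # intercept (beta0)
    return b0, b1

# Points on the exact line y = 1 + 2x recover b0 = 1, b1 = 2
b0, b1 = fit_simple_linear([0, 1, 2, 3], [1, 3, 5, 7])
```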

22

Multiple Linear Regression Formula

y = β0 + β1X1 + β2X2 + … + βpXp + ε

23

Linear Regression

Algorithm fitting a straight line to predict outcomes.

24

Polynomial Regression

Algorithm fitting a curve, capturing nonlinear relationships.

25

Ridge Regression

Regularization method preventing overfitting by shrinking coefficients.

26

Lasso Regression

Regularization method that can force some coefficients to exactly zero for feature selection.
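Ridge's shrinking of coefficients is easiest to see in the one-feature, no-intercept case, where the closed form is β = Σxy / (Σx² + λ); a small illustrative sketch (the data and λ value are made up for the demonstration):

```python
# One-feature ridge regression without an intercept:
# lam = 0 gives plain least squares; larger lam shrinks the
# coefficient toward zero (the regularization effect).
def ridge_1d(xs, ys, lam):
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

xs, ys = [1, 2, 3], [2, 4, 6]      # exact relationship y = 2x
b_ols = ridge_1d(xs, ys, 0.0)      # unpenalized slope: 2.0
b_reg = ridge_1d(xs, ys, 14.0)     # penalized slope, shrunk below 2.0
```

Lasso uses an absolute-value penalty instead of a squared one, which is what lets it push some coefficients all the way to zero.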

27

Decision Tree Regression

Uses tree structures, can handle nonlinear relationships.

28

Random Forest Regression

Uses ensemble of trees for robust, less overfitted predictions.

29

Support Vector Regression (SVR)

Uses hyperplanes in high-dimensional space to fit the data within a margin of tolerance.

30

Advantages of Regression

Interpretable, good for forecasting, reveals feature importance, flexible.

31

Disadvantages of Regression

Often assumes linearity, can overfit, sensitive to outliers.

32

Mean Absolute Error (MAE)

Average of absolute errors between predicted and actual, lower is better.

33

Mean Squared Error (MSE)

Average squared error, penalizes large errors, lower is better.

34

Root Mean Squared Error (RMSE)

Sqrt of MSE, interpretable in target units, lower is better.
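The three metrics above differ only in how errors are aggregated; a minimal pure-Python sketch (function names are illustrative):

```python
import math

# Regression error metrics; lower is better for all three.
def mae(actual, predicted):
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mse(actual, predicted):
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    return math.sqrt(mse(actual, predicted))  # back in the target's units

y_true = [3.0, 5.0, 7.0]
y_pred = [2.0, 5.0, 9.0]
# absolute errors: 1, 0, 2 -> MAE = 1.0, MSE = 5/3, RMSE = sqrt(5/3)
```

Note how the squaring in MSE makes the single error of 2 dominate: it contributes 4 of the total 5, which is the "penalizes large errors" behavior on the MSE card.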

35

R-squared (R²)

Proportion of variance in the response explained by the model; ranges from 0 to 1, higher is better.

36

Adjusted R-squared

R-squared adjusted for the number of predictors; penalizes adding features that do not improve the model.

37

Classification

Categorizes data points into defined classes based on features.

38

Binary Classification

Classification with two possible outcomes (e.g., spam/not spam).

39

Multiclass Classification

Classification with more than two possible labels.

40

Classification Applications

Credit risk analysis, shopping prediction, medical diagnosis, sentiment analysis.

41

Supervised Classification

Trained using labeled data (target classes known).

42

Unsupervised Classification

Discovers classes from unlabeled data (e.g., clustering).

43

Training Phase (Classification)

Model learns from labeled data.

44

Testing Phase (Classification)

Model is validated on new/unseen data for accuracy.

45

Dataset Split (Classification)

Split into a training set (commonly 60–80% of the data) and a testing set (the remainder) so the model can be evaluated on data it has not seen.
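A split like this can be sketched with the standard library; the 70/30 ratio and fixed seed below are just example choices:

```python
import random

# Shuffle the data, then slice off the first train_frac portion for training.
def train_test_split(data, train_frac=0.7, seed=42):
    rng = random.Random(seed)   # fixed seed so the split is reproducible
    shuffled = data[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

train, test = train_test_split(list(range(10)))  # 7 training rows, 3 testing rows
```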

46

Decision Tree Classifier

Uses tree structure; splits data on features to classify.

47

Decision Tree Algorithms

ID3 and C4.5 split using information gain (C4.5 refines it to gain ratio); CART splits using the Gini index.

48

Advantage of Decision Trees

Easy to interpret and use.

49

Disadvantage of Decision Trees

Can be sensitive to small changes, may be inaccurate or complex.

50

Overfitting in Trees

Complex trees may overfit to training data.

51

Pruning

Removes unnecessary branches to improve prediction on unseen data.

52

Information Theory

Quantifies information and measures uncertainty, crucial in machine learning.

53

Entropy (H)

Measures randomness/uncertainty or impurity in a dataset.

54

High Entropy

More uncertainty; labels are mixed.

55

Low Entropy

More certainty; labels are pure.

56

Entropy Formula

H(S) = −Σᵢ pᵢ log₂(pᵢ), where pᵢ is the proportion of class i in dataset S.
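The entropy formula can be computed directly from a list of class labels; a minimal sketch:

```python
import math
from collections import Counter

# Shannon entropy of a list of class labels, in bits:
# H(S) = -sum(p_i * log2(p_i)) over the class proportions p_i.
def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# A 50/50 two-class split is maximally impure: entropy = 1.0 bit.
# A pure set (all one class) has entropy 0.0.
mixed = entropy(["yes", "yes", "no", "no"])  # 1.0
pure = entropy(["yes", "yes", "yes"])        # 0.0
```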

57

Information Gain (IG)

How much a feature reduces entropy when splitting data.

58

Purpose of Information Gain

Select the best attribute for a decision-tree split: the feature with the highest information gain.

59

Information Gain Formula

IG(S, A) = H(S) − Σᵥ (|Sᵥ| / |S|) · H(Sᵥ), summed over the subsets Sᵥ produced by the values v of attribute A.
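Information gain is the parent set's entropy minus the size-weighted average entropy of the subsets a split produces; a minimal sketch building on the entropy definition:

```python
import math
from collections import Counter

# Shannon entropy of a list of class labels, in bits.
def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# IG(S, A) = H(S) - weighted average entropy of the subsets from splitting on A.
def information_gain(labels, subsets):
    n = len(labels)
    weighted = sum(len(s) / n * entropy(s) for s in subsets)
    return entropy(labels) - weighted

labels = ["yes", "yes", "no", "no"]
# A split that separates the classes perfectly recovers all of H(S): IG = 1.0.
perfect = information_gain(labels, [["yes", "yes"], ["no", "no"]])
# A split that leaves each subset just as mixed gains nothing: IG = 0.0.
useless = information_gain(labels, [["yes", "no"], ["yes", "no"]])
```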

60

Confusion Matrix

Shows count of actual vs. predicted classes in classification.

61

True Positive (TP)

Correctly predicted as positive.

62

True Negative (TN)

Correctly predicted as negative.

63

False Positive (FP)

Incorrectly predicted as positive.

64

False Negative (FN)

Incorrectly predicted as negative.

65

Precision

TP/(TP + FP): Fraction of predicted positives that are actual positives.

66

Recall

TP/(TP + FN): Fraction of actual positives correctly found.

67

F1 Score

Harmonic mean of Precision and Recall: 2 × (Precision × Recall) / (Precision + Recall).
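Precision, recall, and F1 all follow directly from the confusion-matrix counts; a minimal sketch (function names are illustrative):

```python
# Precision, recall, and F1 from confusion-matrix counts.
def precision(tp, fp):
    return tp / (tp + fp)   # of everything predicted positive, how much was right

def recall(tp, fn):
    return tp / (tp + fn)   # of everything actually positive, how much was found

def f1_score(tp, fp, fn):
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r)  # harmonic mean of precision and recall

# Example counts: 8 true positives, 2 false positives, 4 false negatives.
# precision = 8/10 = 0.8, recall = 8/12 = 2/3, F1 = 8/11 ~= 0.727
p = precision(8, 2)
r = recall(8, 4)
f = f1_score(8, 2, 4)
```

The harmonic mean sits below the arithmetic mean whenever precision and recall differ, so F1 rewards a balance between the two rather than excellence in just one.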