The larger the variety of data points your data set contains, the more complex a model you can use without overfitting. (T or F)
True
Binary classification is classification into two (dichotomous) classes. (T or F)
True
The _____ allows a model to make informed predictions even when faced with previously unseen data.
Underfitting
Generalization
Overfitting
Generalization
Supervised algorithms address classification problems where the output variable is categorical. (T or F)
False
The ______ refers to the error resulting from sensitivity to the noise in the training data.
variance
The more complex we allow our model to be, the better we will be able to predict on the training data. (T or F)
True
SVM is an example of a regression algorithm. (T or F)
False
In k-NN, High Model Complexity is overfitting. (T or F)
True
In k-NN, when you choose a small value of k (e.g., k=1), the model becomes less complex. (T or F)
False
The ‘k’ in k-Nearest neighbors refers to an arbitrary number of neighbors. (T or F)
True
In k-NN, voting means that for each test point we count how many neighbors belong to each class, e.g. how many belong to class 0 and how many to class 1. (T or F)
True
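The voting described above can be sketched in a few lines of plain Python; the function name and the toy labels are illustrative, not from any specific library.

```python
from collections import Counter

def knn_vote(neighbor_labels):
    """Majority vote over the class labels of a test point's k nearest
    neighbors: the most frequent label wins."""
    return Counter(neighbor_labels).most_common(1)[0][0]

# With k=5 neighbors, three belong to class 1 and two to class 0,
# so the vote predicts class 1.
print(knn_vote([1, 0, 1, 1, 0]))  # -> 1
```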
In the evaluation of a regression model, predicting worse than the average can result in negative scores (e.g. a negative R²). (T or F)
True
When comparing training set and test set scores, we find that we predict very accurately on the training set, but the R² on the test set is much worse. This is a sign of overfitting. (T or F)
True
Lasso uses L2 Regularization. (T or F)
False
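To make the L1/L2 distinction concrete: Lasso penalizes the sum of absolute coefficient values (L1), while Ridge penalizes the sum of squared coefficients (L2). A minimal sketch, with hypothetical helper names:

```python
def l1_penalty(weights, alpha=1.0):
    # Lasso: alpha times the sum of absolute coefficient values (L1 norm)
    return alpha * sum(abs(w) for w in weights)

def l2_penalty(weights, alpha=1.0):
    # Ridge: alpha times the sum of squared coefficients (squared L2 norm)
    return alpha * sum(w * w for w in weights)

w = [3.0, -4.0]
print(l1_penalty(w))  # -> 7.0
print(l2_penalty(w))  # -> 25.0
```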
What is the full form of OLS?
Ordinary Least Squares
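For a single feature, Ordinary Least Squares has a closed form: the slope and intercept that minimize the sum of squared residuals. A self-contained sketch (the function name is illustrative):

```python
def ols_fit(x, y):
    """Ordinary Least Squares for one feature: returns (slope, intercept)
    minimizing the sum of squared residuals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return slope, intercept

x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1
print(ols_fit(x, y))       # -> (2.0, 1.0)
```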
Regularization means explicitly restricting a model to avoid overfitting. (T or F)
True
Ridge is generally preferred over Lasso, but if you want a model that is easy to analyze and understand then use Ridge. (T or F)
False
In Ridge regression, if α (alpha) is larger, the penalty becomes larger. (T or F)
True
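The effect of α can be seen in the one-dimensional Ridge closed form (no intercept, for illustration only): w = Σxᵢyᵢ / (Σxᵢ² + α). As α grows, the denominator grows and the coefficient shrinks toward zero.

```python
def ridge_weight(x, y, alpha):
    """Closed-form 1-D Ridge with no intercept:
    w = sum(x*y) / (sum(x^2) + alpha).
    A larger alpha means a larger penalty, shrinking w toward zero."""
    return (sum(xi * yi for xi, yi in zip(x, y))
            / (sum(xi * xi for xi in x) + alpha))

x = [1.0, 2.0, 3.0]
y = [2.0, 4.0, 6.0]                    # noiseless y = 2x
print(ridge_weight(x, y, alpha=0.0))   # -> 2.0 (plain OLS)
print(ridge_weight(x, y, alpha=10.0))  # smaller: the penalty shrinks the slope
```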
Naive Bayes classifier assumes that the presence of a particular feature in a class is unrelated to the presence of any other feature. (T or F)
True
Naïve Bayes classifier that deals with continuous data.
All the given options
GaussianNB
MultinomialNB
BernoulliNB
GaussianNB
Its target is a categorical variable.
Correlation
Supervised Learning
Regression
Classification
Classification
Regression predicts continuous numbers. (T or F)
True
In k-NN, Low Model Complexity is underfitting. (T or F)
True
In k-NN, Low Model Complexity is overfitting. (T or F)
False
The ‘offset’ parameter is also called intercept. (T or F)
True
Types of Linear Models: Linear Regression, ____________.
Logistic Regression
The ‘slope’ parameter is also called weights or coefficients. (T or F)
True
Naïve Bayes classifier that deals with integer count data.
MultinomialNB
All the given options
BernoulliNB
GaussianNB
MultinomialNB
A model which does not capture the underlying relationship in the dataset on which it's trained.
Generalization
Overfitting
Underfitting
Underfitting
A model is able to make accurate predictions on new, unseen data.
Underfitting
Overfitting
Generalization
Generalization
When using multiple nearest neighbors, the prediction is the mean of the relevant neighbors. (T or F)
True
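In code, k-NN regression with multiple neighbors reduces to averaging the neighbors' target values (a minimal sketch with an illustrative function name):

```python
def knn_regress(neighbor_targets):
    """k-NN regression: the prediction for a test point is the mean of
    the target values of its k nearest neighbors."""
    return sum(neighbor_targets) / len(neighbor_targets)

# k=3 neighbors with targets 1.0, 2.0 and 3.0 -> prediction 2.0
print(knn_regress([1.0, 2.0, 3.0]))  # -> 2.0
```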
The ‘offset’ parameter is also called _______.
Intercept
Slope
Weights
Mean
Intercept
In Ridge regression, if α (alpha) is larger, the penalty becomes smaller. (T or F)
False
Naïve Bayes classifier that deals with binary data.
BernoulliNB
GaussianNB
All the given options
MultinomialNB
BernoulliNB
Naïve Bayes learns parameters by looking at each feature individually and collects simple per-class statistics from each feature. (T or F)
True
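Those per-class, per-feature statistics can be sketched directly; for continuous data (the GaussianNB case) they are simply a mean and variance per feature per class. The helper below is an illustration of the idea, not scikit-learn's implementation.

```python
def per_class_stats(X, y):
    """Collect per-class, per-feature mean and variance, treating each
    feature independently (the 'naive' assumption)."""
    stats = {}
    for label in set(y):
        rows = [x for x, yl in zip(X, y) if yl == label]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        variances = [sum((v - m) ** 2 for v in col) / len(rows)
                     for col, m in zip(zip(*rows), means)]
        stats[label] = (means, variances)
    return stats

X = [[1.0, 10.0], [3.0, 14.0], [5.0, 0.0], [7.0, 2.0]]
y = [0, 0, 1, 1]
print(per_class_stats(X, y))
# class 0: means [2.0, 12.0], variances [1.0, 4.0]
# class 1: means [6.0, 1.0],  variances [1.0, 1.0]
```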