Machine Learning (ML) MCQs | Page - 26

Dear candidates, you will find multiple-choice questions (MCQs) on Machine Learning (ML) here. Learn these questions and prepare yourself for upcoming examinations and interviews.

Q. Suppose there are 25 base classifiers, each with an error rate of e = 0.35. If you use averaging (majority voting) as the ensemble technique, what is the probability that the ensemble of these 25 classifiers makes a wrong prediction?
Note: all classifiers are independent of each other.

(A) 0.05
(B) 0.06
(C) 0.07
(D) 0.09
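
A quick way to check this answer: with independent classifiers and a majority vote, the ensemble errs only when at least 13 of the 25 base classifiers err, which is a binomial tail probability. A minimal sketch in Python (the majority-vote assumption is mine; only n = 25 and e = 0.35 come from the question):

```python
from math import comb

n, e = 25, 0.35  # number of independent base classifiers and their error rate

# The majority vote is wrong when at least ceil(n / 2) = 13 classifiers are wrong.
ensemble_error = sum(comb(n, k) * e**k * (1 - e)**(n - k) for k in range(13, n + 1))

print(f"ensemble error ~ {ensemble_error:.4f}")  # ~0.06, i.e. option (B)
```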

Q. In machine learning, an algorithm (or learning algorithm) is said to be unstable if a small change in the training data causes a large change in the learned classifiers. True or False: Bagging of unstable classifiers is a good idea.

(A) true
(B) false

Q. Which of the following parameters can be tuned to find a good ensemble model in bagging-based algorithms?
1. Max number of samples
2. Max features
3. Bootstrapping of samples
4. Bootstrapping of features

(A) 1 and 3
(B) 2 and 3
(C) 1 and 2
(D) all of the above
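
All four of these knobs correspond to parameters of scikit-learn's BaggingClassifier, so a hedged way to see them in one place is the sketch below (the data set and parameter values are illustrative, not from the question; scikit-learn versions before 1.2 call the first parameter base_estimator instead of estimator):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

model = BaggingClassifier(
    estimator=DecisionTreeClassifier(),
    n_estimators=50,
    max_samples=0.8,          # 1. max number of samples drawn per base learner
    max_features=0.5,         # 2. max features drawn per base learner
    bootstrap=True,           # 3. bootstrapping of samples
    bootstrap_features=True,  # 4. bootstrapping of features
    random_state=0,
)
model.fit(X, y)
print(model.score(X, y))
```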

Q. How is model capacity affected by the dropout rate (where model capacity means the ability of a neural network to approximate complex functions)?

(A) model capacity increases with an increase in dropout rate
(B) model capacity decreases with an increase in dropout rate
(C) model capacity is not affected by an increase in dropout rate
(D) none of these
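
One way to see why capacity falls as the dropout rate rises: with rate p, only about a (1 - p) fraction of a layer's units are active on any forward pass, so the effective sub-network shrinks. A minimal NumPy sketch of inverted dropout (the layer size and rates are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, rate):
    """Inverted dropout: zero units with probability `rate`, rescale the survivors."""
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

layer = np.ones(1000)  # a hidden layer with 1000 active units
for rate in (0.1, 0.5, 0.9):
    active = np.count_nonzero(dropout(layer, rate))
    print(f"dropout rate {rate}: ~{active} of 1000 units active")
# Higher dropout rate -> fewer active units per pass -> lower effective capacity.
```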

Q. True or False: Dropout is a computationally expensive technique compared with bagging.

(A) true
(B) false
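
The reason dropout is cheap relative to explicit bagging is that, at test time, a single forward pass with activations scaled by the keep probability approximates averaging the exponentially many masked sub-networks. The sketch below demonstrates this for a single linear unit, where the equivalence is exact (the weights, input, and keep probability are made-up illustrations):

```python
import itertools
import numpy as np

w = np.array([0.4, -1.2, 0.7, 0.3])  # illustrative weights of one linear unit
x = np.array([1.0, 2.0, -0.5, 0.8])  # illustrative input
keep = 0.5                           # keep probability (dropout rate = 0.5)

# Bagging-style view: average the outputs of all 2^4 masked sub-networks,
# weighting each mask by its probability under independent Bernoulli(keep) draws.
ensemble_avg = 0.0
for mask in itertools.product([0, 1], repeat=len(x)):
    m = np.array(mask)
    p = keep ** m.sum() * (1 - keep) ** (len(x) - m.sum())
    ensemble_avg += p * (w @ (m * x))

# Dropout's weight-scaling shortcut: one forward pass with inputs scaled by keep.
single_pass = w @ (keep * x)

print(ensemble_avg, single_pass)  # identical for a linear unit
```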

Q. Suppose you want to apply a stepwise forward selection method for choosing the best models for an ensemble model. Which of the following is the correct order of the steps?
Note: you have predictions from more than 1000 models.
1. Add the models' predictions (in other words, take the average) to the ensemble one by one, keeping additions that improve the metric on the validation set.
2. Start with an empty ensemble.
3. Return the ensemble from the nested set of ensembles that has the maximum performance on the validation set.

(A) 1-2-3
(B) 1-3-4
(C) 2-1-3
(D) none of the above
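
A hedged sketch of the 2-1-3 procedure: start with an empty ensemble, greedily add the model whose averaged-in predictions most improve a validation metric, and return the best ensemble seen along the way. Function names, the accuracy metric, and the toy data are illustrative, not from the question:

```python
import numpy as np

def accuracy_of_average(members, predictions, y_val):
    """Validation accuracy of the averaged (majority) prediction of the given members."""
    avg = np.mean([predictions[m] for m in members], axis=0)
    return np.mean((avg >= 0.5) == y_val)

def forward_select(predictions, y_val):
    """Stepwise forward selection over a pool of 0/1 model predictions."""
    ensemble = []                                  # step 2: start with an empty ensemble
    best_ensemble, best_score = [], -np.inf
    remaining = set(predictions)
    while remaining:
        # Step 1: add models one by one, each time picking the addition
        # that gives the best metric on the validation set.
        name = max(remaining,
                   key=lambda n: accuracy_of_average(ensemble + [n], predictions, y_val))
        ensemble.append(name)
        remaining.remove(name)
        score = accuracy_of_average(ensemble, predictions, y_val)
        if score > best_score:
            best_ensemble, best_score = list(ensemble), score
    # Step 3: return the member of the nested set of ensembles with the best score.
    return best_ensemble, best_score

preds = {"m1": np.array([1, 0, 1, 1]),
         "m2": np.array([1, 1, 0, 1]),
         "m3": np.array([0, 0, 1, 0])}
print(forward_select(preds, np.array([1, 0, 1, 1])))
```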

Q. Suppose you have 2000 different models with their predictions and want to ensemble the predictions of the best x models. Which of the following can be a possible method to select the best x models for the ensemble?

(A) stepwise forward selection
(B) stepwise backward elimination
(C) both
(D) none of the above

Q. Below are two ensemble models:
1. E1(M1, M2, M3)
2. E2(M4, M5, M6)
where each Mx is an individual base model.
Which are you more likely to choose if the following conditions hold for E1 and E2?
E1: individual model accuracies are high, but the models are of the same type (in other words, less diverse)
E2: individual model accuracies are high, and the models are of different types (in other words, more diverse)

(A) e1
(B) e2
(C) any of e1 and e2
(D) none of these

Q. True or False: In boosting, individual base learners can be trained in parallel.

(A) true
(B) false
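
The reason base learners in boosting cannot be trained in parallel is that each round's sample weights depend on the errors of the learner fitted in the previous round. A compact AdaBoost-style sketch using decision stumps (the data set and number of rounds are illustrative, not from the question):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)
y = 2 * y - 1                       # AdaBoost convention: labels in {-1, +1}
w = np.full(len(y), 1 / len(y))     # start with uniform sample weights

learners, alphas = [], []
for t in range(10):                 # the rounds are inherently sequential
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = np.sum(w[pred != y]) / np.sum(w)
    alpha = 0.5 * np.log((1 - err) / (err + 1e-12))
    # The next round's weights depend on this round's mistakes, which is why
    # learner t + 1 cannot be fitted before learner t has finished.
    w = w * np.exp(-alpha * y * pred)
    w /= w.sum()
    learners.append(stump)
    alphas.append(alpha)

ensemble_pred = np.sign(sum(a * l.predict(X) for a, l in zip(alphas, learners)))
print("training accuracy:", np.mean(ensemble_pred == y))
```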

Q. Which of the following is true about bagging?
1. Bagging can be parallelized
2. The aim of bagging is to reduce bias, not variance
3. Bagging helps in reducing overfitting

(A) 1 and 2
(B) 2 and 3
(C) 1 and 3
(D) all of these
