Which of the following parameters can be tuned to find a good ensemble model in bagging-based algorithms? 1. Max number of samples 2. Max features 3. Bootstrapping of samples 4. Bootstrapping of features

A. 1 and 3
B. 2 and 3
C. 1 and 2
D. All of the above

The correct answer is D: all of the above.

Bagging is a machine learning ensemble meta-algorithm that builds a number of different models, called “base learners”, on different subsets of the original dataset. The predictions of the base learners are then combined (for example, by voting or averaging) to improve the overall accuracy and stability of the model.

The parameters that can be tuned to find a good ensemble model in bagging-based algorithms are:

  • Max number of samples: the number of samples used to train each base learner. More samples per learner generally give each learner a more accurate picture of the data, at the cost of longer training and less diversity between learners.
  • Max features: the number of features used to train each base learner. More features can make each learner stronger on its own, but again at the cost of training time and diversity across the ensemble.
  • Bootstrapping of samples: sampling rows with replacement from the original dataset to create the training set for each base learner. This is the classic bootstrap; it reduces the variance of the combined model because the averaged base learners are trained on different resamples of the data.
  • Bootstrapping of features: sampling features with replacement to decide which columns each base learner sees. This increases diversity further by letting each learner work with a different subset of the features, which also helps reduce variance.
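All four parameters map directly onto scikit-learn's `BaggingClassifier` (`max_samples`, `max_features`, `bootstrap`, `bootstrap_features`). A minimal sketch, assuming scikit-learn is available and using its default decision-tree base learner on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier

# Illustrative synthetic dataset, not from the original question.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

model = BaggingClassifier(
    n_estimators=50,
    max_samples=0.8,         # 1. max number of samples per base learner
    max_features=0.5,        # 2. max features per base learner
    bootstrap=True,          # 3. sample rows with replacement
    bootstrap_features=True, # 4. sample features with replacement
    random_state=0,
)
model.fit(X, y)
print(model.score(X, y))
```

The fractions 0.8 and 0.5 are arbitrary example values; both parameters also accept integer counts.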

In general, it is important to experiment with different values of these parameters to find the best combination for the specific dataset and problem at hand.
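One common way to run that experiment is an exhaustive grid search over the four parameters. A small sketch with scikit-learn's `GridSearchCV` (the candidate values below are illustrative, not recommendations):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import GridSearchCV

# Illustrative synthetic dataset, not from the original question.
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

param_grid = {
    "max_samples": [0.5, 1.0],
    "max_features": [0.5, 1.0],
    "bootstrap": [True, False],
    "bootstrap_features": [True, False],
}
search = GridSearchCV(
    BaggingClassifier(n_estimators=20, random_state=0),
    param_grid,
    cv=3,  # 3-fold cross-validation scores each combination
)
search.fit(X, y)
print(search.best_params_)
```

Cross-validated search like this keeps the comparison honest: each parameter combination is scored on held-out folds rather than on the training data.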
