Impact of high variance on the training set?

A. overfitting
B. underfitting
C. both underfitting & overfitting
D. depends upon the dataset

The correct answer is: A. overfitting

High variance leads to overfitting: the model learns the training data too well, including its noise, and therefore fails to generalize to new data. This is especially likely when the training set is small or contains a lot of noise.
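As a quick illustration, here is a minimal sketch of that train/test gap, assuming scikit-learn and using an unpruned decision tree as an example of a high-variance model (the dataset and parameters are illustrative choices, not from the original question):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Small, noisy dataset: 200 samples, with 20% of the labels flipped at random.
X, y = make_classification(n_samples=200, n_features=20, flip_y=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

# An unpruned tree has high variance: it can memorize the training split, noise included.
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("train accuracy:", tree.score(X_train, y_train))  # near 1.0 (memorized)
print("test accuracy: ", tree.score(X_test, y_test))    # noticeably lower
```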

Overfitting can be reduced by using a larger training set, by applying a regularization technique, or by using a simpler (less complex) model.
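A minimal sketch of the regularization remedy, again assuming scikit-learn: an unregularized degree-15 polynomial fit chases the noise in a small sample, while Ridge's L2 penalty shrinks the coefficients and scores better on held-out data (the degree and the `alpha=1.0` value are illustrative choices):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 1))                        # small training sample
y = np.sin(3 * X).ravel() + rng.normal(scale=0.3, size=30)  # noisy target
X_test = rng.uniform(-1, 1, size=(200, 1))
y_test = np.sin(3 * X_test).ravel()                         # noise-free test target

# Same over-parameterized feature set for both models; only the penalty differs.
for name, model in [("unregularized", LinearRegression()),
                    ("ridge (L2)   ", Ridge(alpha=1.0))]:
    pipe = make_pipeline(PolynomialFeatures(degree=15), model).fit(X, y)
    print(name, "test R^2:", round(pipe.score(X_test, y_test), 3))
```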


Option B, underfitting, is when a model fails to capture the underlying pattern in the training data, so it performs poorly on both the training data and new data. This happens when the model is too simple for the problem, not when the training set is too large.

Underfitting can be avoided by using a more complex model, by adding more informative features, or by reducing the strength of regularization, as in the sketch below.
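Conversely, a minimal sketch of fixing underfitting by increasing model capacity (the quadratic ground truth and the `degree=2` feature expansion are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = X.ravel() ** 2 + rng.normal(scale=0.5, size=200)  # quadratic ground truth

# A straight line cannot represent x^2: it underfits even the training data.
linear = LinearRegression().fit(X, y)
# Adding degree-2 polynomial features gives the model enough capacity.
quadratic = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

print("linear    R^2:", round(linear.score(X, y), 3))     # low: underfitting
print("quadratic R^2:", round(quadratic.score(X, y), 3))  # near 1.0
```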


Option C, both underfitting and overfitting, is not correct. The two sit at opposite ends of the bias-variance trade-off: a high-variance model overfits, while a high-bias model underfits, so high variance does not produce both at once.


Option D, depends upon the dataset, is not correct. High variance in the training set can lead to overfitting, regardless of the dataset.
