Which of the following methods do we use to find the best fit line for data in Linear Regression?

A. Least Square Error
B. Maximum Likelihood
C. Logarithmic Loss
D. Both A and B

The correct answer is: Both A and B.

Least Square Error (LSE), also called ordinary least squares, finds the best fit line for data in Linear Regression by minimizing the sum of the squared vertical distances (residuals) between the data points and the line.
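As a minimal sketch of this idea (using made-up data and NumPy, not anything from the question itself), the least squares line can be computed directly from the design matrix:

```python
import numpy as np

# Hypothetical 1-D data: y is roughly 2x + 1 plus Gaussian noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(scale=1.5, size=50)

# Design matrix with an intercept column.
X = np.column_stack([np.ones_like(x), x])

# Least squares: minimize the sum of squared residuals ||y - X b||^2.
# lstsq solves the normal equations in a numerically stable way.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = beta
print(f"intercept = {intercept:.3f}, slope = {slope:.3f}")
```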

Maximum Likelihood (ML) is another way to fit the line in Linear Regression. It chooses the coefficients that maximize the likelihood of the observed data under an assumed noise model, typically Gaussian errors around the line.

Both LSE and ML are used to find the best fit line for data in Linear Regression, and they are closely related: under the standard assumption of independent Gaussian errors, maximizing the likelihood is mathematically equivalent to minimizing the squared error, so the two approaches produce the same line. Maximum likelihood is the more general framework, since it extends to other noise models, while least squares is the simpler computation.
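To illustrate that equivalence, the sketch below (again on hypothetical data) maximizes a Gaussian log-likelihood numerically and compares the result with the least squares solution; the `neg_log_likelihood` helper is our own construction for the example, not a library function:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(scale=1.5, size=50)
X = np.column_stack([np.ones_like(x), x])

def neg_log_likelihood(params):
    # params = [intercept, slope, log_sigma]; Gaussian noise model.
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)
    resid = y - (b0 + b1 * x)
    # Negative Gaussian log-likelihood, dropping the constant term.
    return 0.5 * np.sum(resid**2) / sigma**2 + len(y) * np.log(sigma)

mle = minimize(neg_log_likelihood, x0=[0.0, 0.0, 0.0]).x[:2]
ols, *_ = np.linalg.lstsq(X, y, rcond=None)
print("MLE:", np.round(mle, 3))
print("OLS:", np.round(ols, 3))  # the two estimates agree closely
```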

In addition to plain LSE and ML, there are regularized variants of Linear Regression that can be used to fit the line; a short sketch comparing them follows the list:

  • Ridge Regression: This method adds an L2 penalty on the size of the coefficients, shrinking them toward zero, which can help to prevent overfitting.
  • LASSO Regression: This method adds an L1 penalty, which shrinks coefficients and can set some of them exactly to zero, reducing the number of predictors that are needed.
  • Elastic Net Regression: This method combines the L2 penalty of Ridge Regression with the L1 penalty of LASSO Regression.
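As a rough illustration of how these variants behave, the sketch below fits each of them with scikit-learn on hypothetical data; the `alpha` and `l1_ratio` values are arbitrary choices for the example, not recommendations:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet

# Hypothetical data: 5 predictors, only the first two actually matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=100)

models = {
    "OLS": LinearRegression(),
    "Ridge (L2 penalty)": Ridge(alpha=1.0),
    "LASSO (L1 penalty)": Lasso(alpha=0.1),
    "Elastic Net (L1 + L2)": ElasticNet(alpha=0.1, l1_ratio=0.5),
}
for name, model in models.items():
    model.fit(X, y)
    # LASSO and Elastic Net typically zero out the irrelevant coefficients.
    print(f"{name:22s} coefficients: {np.round(model.coef_, 2)}")
```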

The choice of which method to use depends on the specific data set and the desired results.
