Which of the following quantities is minimized, directly or indirectly, during parameter estimation in a Gaussian distribution model?

A. negative log-likelihood
B. log-likelihood
C. cross entropy
D. residual sum of squares

The correct answer is A. negative log-likelihood.

The negative log-likelihood is a measure of how well a model fits the data. It is defined as follows:

$$-\log \mathcal{L}(\theta|x) = -\sum_{i=1}^n \log \phi(x_i|\theta)$$

where $\theta$ denotes the model parameters, $x = (x_1, \dots, x_n)$ is the observed data, and $\phi(x_i|\theta)$ is the Gaussian probability density function with parameters $\theta$ evaluated at $x_i$.

Minimizing the negative log-likelihood over the parameter space yields the maximum-likelihood estimate, which converges to the true parameters as the sample size grows. This makes it a natural objective function to minimize during parameter estimation in a Gaussian distribution model.
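As a minimal sketch of this idea (assuming NumPy is available; the function name `gaussian_nll` and the synthetic data are illustrative, not from the original), the Gaussian NLL can be written out directly, and its closed-form minimizers are the sample mean and the (maximum-likelihood, $1/n$) sample standard deviation:

```python
import numpy as np

def gaussian_nll(params, x):
    """Negative log-likelihood of data x under a Gaussian N(mu, sigma^2)."""
    mu, sigma = params
    n = len(x)
    # -sum_i log phi(x_i | mu, sigma) expanded in closed form
    return 0.5 * n * np.log(2 * np.pi * sigma**2) + np.sum((x - mu) ** 2) / (2 * sigma**2)

# Synthetic data from a known Gaussian (illustrative values)
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=10_000)

# Closed-form minimizers of the NLL, i.e. the maximum-likelihood estimates
mu_hat = x.mean()
sigma_hat = x.std()  # np.std uses 1/n by default, matching the MLE

# The NLL at the MLE is no larger than at a perturbed parameter value
assert gaussian_nll((mu_hat, sigma_hat), x) <= gaussian_nll((mu_hat + 0.5, sigma_hat), x)
```

With a large sample, `mu_hat` and `sigma_hat` should land close to the generating values (2.0 and 1.5 here), illustrating that minimizing the NLL recovers the true parameters.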

The other options are incorrect for the following reasons:

  • Option B, log-likelihood, is the negative of the negative log-likelihood. It is maximized, not minimized, during parameter estimation.
  • Option C, cross entropy, measures the dissimilarity between two probability distributions. It is typically used as a loss for classification models rather than as the stated objective for Gaussian parameter estimation.
  • Option D, residual sum of squares, measures the squared error between model predictions and the data. It is the objective of least-squares regression; it coincides with the Gaussian negative log-likelihood only for the mean with a fixed variance, so the negative log-likelihood is the more general objective.