
Log-likelihood of the Normal Distribution: why is the term $\frac{n}{2}\log(2\pi\sigma^2)$ not considered in the minimization of the SSE?

Cross Validated. Asked by Javier TG on November 29, 2020

Reading some books and papers, such as the great "Bundle Adjustment – A Modern Synthesis" (page 10), I found that the weighted Sum of Squared Errors (SSE) cost function:

$SSE = \frac{1}{2} \sum_i \Delta z_i(x)^T\,W_i\,\Delta z_i(x)$ $\quad$ (respecting the notation from the article linked above)

also represents the negative log-likelihood of the Normal Distribution from which the ground-truth data was obtained (considering that $W_i$ approximates the inverse of the covariance matrix). Thereby, by minimizing the $SSE$ we obtain the parameters $x$ that best fit this Normal Distribution.
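
To make this correspondence explicit, here is a short sketch assuming each residual $\Delta z_i(x)$ is zero-mean Gaussian with covariance $W_i^{-1}$ (the symbol $d_i$, denoting the dimension of $\Delta z_i$, is introduced here for illustration and does not appear in the article):

$-\log\mathcal{L}(x) = \frac{1}{2}\sum_i \Delta z_i(x)^T\,W_i\,\Delta z_i(x) + \frac{1}{2}\sum_i \log\left((2\pi)^{d_i}\det W_i^{-1}\right) = SSE + \text{const}$

so the weighted $SSE$ matches the negative log-likelihood up to the second, additive term.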

However, looking at some posts like this one from Wikipedia, they state that the log-likelihood of the Normal Distribution is given by:

$\log(\mathcal{L}(\mu,\sigma)) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^n (x_i-\mu)^2$

So, why is the term $\frac{n}{2}\log(2\pi\sigma^2)$ not considered in the previous reasoning that minimizing the $SSE$ amounts to maximizing the likelihood?

Thanks in advance!

One Answer

Because that part of the log-likelihood is constant (with respect to $\mu$). Leaving it out saves some computation, but does not affect the ML estimate.

If you are also estimating $\sigma$ then you would need to include that part as well.
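
As a minimal numerical sketch of this point (scalar data, $\sigma$ treated as fixed and known; the use of scipy.optimize.minimize_scalar and all variable names are illustrative, not taken from the original question or answer):

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=100)  # illustrative sample
sigma = 2.0  # sigma treated as known, so only mu is estimated

def full_nll(mu):
    # full negative log-likelihood, including the n/2 * log(2*pi*sigma^2) term
    n = len(x)
    return n / 2 * np.log(2 * np.pi * sigma**2) + np.sum((x - mu)**2) / (2 * sigma**2)

def sse_only(mu):
    # only the (weighted) sum-of-squares part; the constant term is dropped
    return np.sum((x - mu)**2) / (2 * sigma**2)

mu_full = minimize_scalar(full_nll).x
mu_sse = minimize_scalar(sse_only).x
print(mu_full, mu_sse, x.mean())  # all three agree up to numerical tolerance

The dropped term only shifts the objective by a constant, so the minimizer (here the sample mean) is unchanged.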

Correct answer by Greg Snow on November 29, 2020
