The probability function for a mixture of multivariate Gaussians looks like

$$p(\mathbf{x}) = \sum_{i=1}^{K} w_i \, \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_i, \boldsymbol{\Sigma}_i)$$

where both $\mathbf{x}$ and $\boldsymbol{\mu}_i$ are vectors of dimension $N$, and $i$ runs over the Gaussians in the mixture.
Everyone knows the above equation, but the problem comes when actually trying to compute it. So papers typically report the log-likelihood, which is basically $\log p(\mathbf{x})$. Consider an example of 2 Gaussians. Then you may need to evaluate something like $\log\left(e^{-1000} + e^{-1001}\right)$. Those numbers are not made up; it is quite easy to obtain exponents of magnitude 1000. Direct evaluation will usually return $-\infty$, but we clearly know that cannot be true: the answer has to be around $-999.\text{something}$.
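To see the failure concretely, here is a minimal NumPy sketch (my own illustration, not from the original post) of the naive evaluation. `exp(-1000)` underflows to exactly `0.0` in double precision, so the log collapses to `-inf`:

```python
import numpy as np

# Naive evaluation of log(exp(-1000) + exp(-1001)).
# Both exp() calls underflow to 0.0 in double precision,
# so the sum is 0.0 and the log returns -inf.
a = np.array([-1000.0, -1001.0])
with np.errstate(divide="ignore"):  # silence the log(0) RuntimeWarning
    naive = np.log(np.sum(np.exp(a)))
print(naive)  # -inf
```

The true value is finite (roughly $-999.69$), so the $-\infty$ is purely a floating-point artifact.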
The simple thing to do is rewrite the sum as

$$\log \sum_i e^{a_i} = a_{\max} + \log \sum_i e^{a_i - a_{\max}}$$

where $a_i$ are the log-terms and $a_{\max} = \max_i a_i$.
And that is the computational trick. Pull the maximum out of the sum and evaluate the rest, which is now safe: every $e^{a_i - a_{\max}}$ lies between $0$ and $1$. Just FYI, in double precision the largest representable power is roughly $e^{709}$, which lies just below numerical infinity; $e^{710}$ already overflows.
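The trick translates directly into a few lines of NumPy. This is a sketch of the standard log-sum-exp implementation (the function name `log_sum_exp` is my own; SciPy ships an equivalent as `scipy.special.logsumexp`):

```python
import numpy as np

def log_sum_exp(a):
    """Numerically stable log(sum(exp(a))): pull the max out before exponentiating."""
    a = np.asarray(a, dtype=float)
    m = np.max(a)
    # Every exponent a - m is <= 0, so exp() cannot overflow,
    # and the largest term contributes exactly exp(0) = 1.
    return m + np.log(np.sum(np.exp(a - m)))

print(log_sum_exp([-1000.0, -1001.0]))  # ~ -999.6867, instead of -inf
```

The result matches the analytic value $-1000 + \log(1 + e^{-1}) \approx -999.6867$, which is exactly the "$-999.\text{something}$" the naive evaluation failed to produce.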