I am using the Gaussian Mixture Model from the Python scikit-learn package to train on my dataset. However, I found that when I run
-- G=mixture.GMM(...)
-- G.fit(...)
-- G.score(some features)
the resulting log probability is a positive real number. Why is that? Isn't log probability guaranteed to be negative?
I get it now: what the Gaussian Mixture Model returns is the log probability "density" instead of a probability "mass", so a positive value is perfectly reasonable.
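Here is a minimal sketch of that point, using the current GaussianMixture class (the old mixture.GMM was removed in newer scikit-learn releases). When the data are tightly clustered, the density peak of a 1-D Gaussian is 1/(sigma*sqrt(2*pi)), which exceeds 1 for small sigma, so its log is positive:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.RandomState(0)

    # 1-D data with a very small spread: the peak density is much larger
    # than 1, so the log-density of most samples is positive.
    X = rng.normal(loc=0.0, scale=0.01, size=(1000, 1))

    gmm = GaussianMixture(n_components=1, random_state=0).fit(X)

    print(gmm.score(X))           # average log-density per sample -> positive here
    print(gmm.score_samples(X[:5]))  # per-sample log-densities, also mostly positive

A density integrates to 1 over the whole space, but its value at any single point can be arbitrarily large, which is why the log can end up positive.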
If the covariance matrix is close to singular, the GMM will not perform well, and generally it means the data are not well suited to such a generative task.
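As a rough sanity check for that situation, you can look at the condition numbers of the fitted component covariances; this sketch assumes a model fitted with covariance_type='full' (the default), where gmm.covariances_ holds one matrix per component:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Hypothetical example data just to illustrate the check.
    X = np.random.RandomState(0).normal(size=(500, 3))
    gmm = GaussianMixture(n_components=2, covariance_type='full', random_state=0).fit(X)

    for k, cov in enumerate(gmm.covariances_):
        # A huge condition number suggests this component's covariance
        # is close to singular.
        print(f"component {k}: condition number = {np.linalg.cond(cov):.2e}")

If the condition numbers blow up, increasing the reg_covar parameter (which adds a small value to the covariance diagonal during fitting) can help stabilize the estimate.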