I am curious about the entropy of a multivariate Gaussian, as derived here: https://gregorygundersen.com/blog/2020/09/01/gaussian-entropy/
$$ H(\mathbf{x}) = \frac{D}{2}(1+\log(2\pi))+\frac{1}{2}\log |\Sigma| $$
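For concreteness, here is a quick numerical sanity check of that formula (a minimal sketch assuming NumPy and SciPy; the covariance matrix is just an arbitrary example):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Arbitrary 2-D covariance matrix (symmetric positive definite).
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
D = Sigma.shape[0]

# Closed-form entropy in nats: D/2 * (1 + log(2*pi)) + 1/2 * log|Sigma|.
H_formula = 0.5 * D * (1 + np.log(2 * np.pi)) + 0.5 * np.log(np.linalg.det(Sigma))

# SciPy's differential entropy for the same distribution.
H_scipy = multivariate_normal(mean=np.zeros(D), cov=Sigma).entropy()

print(H_formula, H_scipy)  # both ≈ 3.16
```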
Maybe you can clear up my confusion. I'm trying to build intuition about entropy. I learned about entropy using discrete distributions, so I'm scratching my head over the continuous case. Suppose I have a single-variable discrete probability distribution, say a coin flip. If the coin is unfair and always comes up heads, it makes sense that the entropy is zero: no information is needed, since the outcome is always heads. If it's fair, the entropy is 1 bit. In fact, the intuition I'm building is that the more spread out a discrete distribution is, the more entropy it has.
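In code, that discrete intuition looks like this (a small sketch using `scipy.stats.entropy` with base 2, so the results are in bits):

```python
from scipy.stats import entropy

# Unfair coin (always heads): zero entropy, no information needed.
print(entropy([1.0, 0.0], base=2))  # 0.0

# Fair coin: maximally spread out, 1 bit.
print(entropy([0.5, 0.5], base=2))  # 1.0

# A 90/10 coin sits in between: less spread out, less entropy.
print(entropy([0.9, 0.1], base=2))  # ≈ 0.47
```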
Enter the multivariate Gaussian. It's pretty easy to contrive a covariance matrix whose determinant is substantially smaller than one. Wouldn't that mean the entropy can go negative? What does that even mean?
$$ 0 = \frac{D}{2}(1+\log 2\pi) + \frac{1}{2} \log |\Sigma_0| $$
$$ -\frac{D}{2}(1+\log 2\pi) = \frac{1}{2} \log |\Sigma_0| $$
$$ \exp(-D (1+\log 2\pi)) = |\Sigma_0| $$
Equivalently, $|\Sigma_0| = (2\pi e)^{-D}$. That's the determinant that gives me zero entropy: any smaller determinant puts me into negative entropy, and any larger one makes the entropy positive and increasing. Or is there no intuition here, and I just have to accept that continuous distributions can have negative entropy? That is, should I just think of tighter covariances as "more ordered" and ignore the significance of "zero"?
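To see where that boundary lands numerically, here is a quick check (again assuming NumPy/SciPy; I use an isotropic covariance $\sigma^2 I$ scaled so its determinant equals $|\Sigma_0|$ exactly):

```python
import numpy as np
from scipy.stats import multivariate_normal

D = 3
# Isotropic covariance sigma^2 * I with sigma^2 = 1/(2*pi*e),
# so |Sigma| = (2*pi*e)^(-D): exactly the zero-entropy boundary derived above.
Sigma0 = (1.0 / (2 * np.pi * np.e)) * np.eye(D)

print(multivariate_normal(cov=Sigma0).entropy())        # ≈ 0.0
print(multivariate_normal(cov=0.5 * Sigma0).entropy())  # ≈ -1.04 (negative)
print(multivariate_normal(cov=2.0 * Sigma0).entropy())  # ≈ +1.04 (positive)
```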
"Like its discrete counterpart, differential entropy measures the randomness of a random variable and the number of bits required to describe it. The difference is that the description is not exact. Rather, it can be thought of as describing the variable to within an interval of length one. For example, pinning a uniform [0, a] random variable down to an interval of length one requires log a bits. In particular, when a < 1, a “negative” number of bits is required, explaining why differential entropy can be negative."
– Andreas Lenz Apr 10 '23 at 20:23
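That uniform example from the quote is easy to verify (a short sketch using `scipy.stats.uniform`; note SciPy reports entropy in nats while the quote counts bits, but the sign behavior is the same either way):

```python
from scipy.stats import uniform

# Uniform on [0, a]: differential entropy is log(a) (in nats here).
for a in [4.0, 1.0, 0.25]:
    print(a, uniform(loc=0, scale=a).entropy())
# 4.0  ->  log 4   ≈  1.386 (wider than a unit interval: positive)
# 1.0  ->  log 1   =  0.0   (exactly a unit interval)
# 0.25 ->  log 1/4 ≈ -1.386 (narrower than a unit interval: negative)
```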