Using the standard definition for $H[p]$, you can show that the entropy of a Normally distributed variable is $$ H(\mu, \sigma^2) = \frac{1}{2} \ln\left(2\pi\sigma^2\right) + \frac{1}{2} $$ in units of 'nats'. To convert this to bits, you divide by $\ln(2)$.
Interestingly, when $\sigma^2 = 1$, this works out to roughly 2 bits (about 2.05). But what exactly does this mean? How could you compress the value of a real number into 2 bits? Worse yet, what does it mean that $H$ can be negative for very small values of $\sigma$?
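As a quick numerical sanity check of the numbers above, here is a small sketch (my own, not part of the original question) comparing the closed-form expression with `scipy.stats.norm.entropy()`, which reports differential entropy in nats:

```python
import numpy as np
from scipy.stats import norm

# Closed-form differential entropy of N(mu, sigma^2) vs. scipy's norm.entropy(),
# both in nats, then converted to bits by dividing by ln(2).
for sigma in [1.0, 0.5, 0.1]:
    h_closed = 0.5 * np.log(2 * np.pi * sigma**2) + 0.5
    h_scipy = float(norm(scale=sigma).entropy())
    print(f"sigma={sigma}: {h_closed:.4f} nats (scipy: {h_scipy:.4f}), "
          f"{h_closed / np.log(2):.4f} bits")
# sigma=1.0 gives about 2.05 bits; sigma=0.1 is already negative (about -1.27 bits).
```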
No, that's the differential entropy. The (Shannon) entropy of a normal variable is infinite.
https://math.stackexchange.com/questions/2880612/comparing-differential-entropy-of-normal-distribution-to-shannon-entropy
https://math.stackexchange.com/questions/4904468/shannon-source-coding-theorem-and-differential-entropy/4905407
https://math.stackexchange.com/questions/1398438/differential-entropy/1398471
– leonbloy Aug 24 '24 at 03:28
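To make the comment's point concrete, here is a minimal sketch (my own illustration, not taken from the linked answers): quantize $X \sim N(0,1)$ into bins of width $\Delta$. The Shannon entropy of the discretized variable is approximately $h(X) + \log_2(1/\Delta)$ bits, which grows without bound as $\Delta \to 0$.

```python
import numpy as np
from scipy.stats import norm

def quantized_entropy_bits(delta, support=10.0):
    """Shannon entropy (bits) of a standard normal quantized into bins of width delta."""
    edges = np.arange(-support, support + delta, delta)
    p = np.diff(norm.cdf(edges))   # probability mass in each bin
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Differential entropy of N(0,1) in bits, about 2.047.
h_bits = 0.5 * np.log2(2 * np.pi * np.e)

for delta in [1.0, 0.1, 0.01, 0.001]:
    print(f"delta={delta:<6}: H = {quantized_entropy_bits(delta):7.3f} bits, "
          f"h + log2(1/delta) = {h_bits + np.log2(1 / delta):7.3f} bits")
```

The discrete entropy tracks $h + \log_2(1/\Delta)$ and diverges as the quantization gets finer. That is the sense in which the Shannon entropy of a continuous variable is infinite, while the differential entropy $h$ is the finite part left over after subtracting $\log_2(1/\Delta)$.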