Entropy is defined as $$H(X) = E_{x\sim p(x)}[-\log p(x)],$$ and $-\log p(x) \geq 0$ (since $p(x) \leq 1$), so it makes sense that the expectation is always non-negative; here is a proof.
However, Wikipedia says the entropy of a normal distribution is $\tfrac{1}{2} \ln(2\pi e \sigma^2)$, which means the entropy can be negative for some values, e.g. $\sigma = 0.01$. How can this be the case, and how should it be interpreted?
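
Just to make the numbers concrete: for $\sigma = 0.01$ the formula gives roughly $-3.19$ nats. A minimal check in Python (a sketch assuming SciPy is available; `scipy.stats.norm.entropy` returns the differential entropy in nats):

```python
import numpy as np
from scipy.stats import norm

sigma = 0.01

# Closed-form differential entropy of N(mu, sigma^2), in nats
closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# SciPy's built-in differential entropy for the same distribution
scipy_value = norm(scale=sigma).entropy()

print(closed_form)  # about -3.19, i.e. negative
print(scipy_value)  # agrees with the closed form
```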