I am trying to derive the entropy of the normal distribution. Let $p(x)$ be the probability density function of a zero-mean normal distribution, \begin{equation} p(x) = \frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{x^2}{2\sigma^2}}. \end{equation} Hence, by using integration by parts, I get \begin{equation} \int^{\infty}_{-\infty} x^2p(x)\,dx = x^2 \int^{\infty}_{-\infty} p(x)\,dx - \int^{\infty}_{-\infty} 2x \left(\int^{\infty}_{-\infty} p(x)\,dx\right) dx. \end{equation} Because \begin{equation} \int^{\infty}_{-\infty} p(x)\,dx = 1, \end{equation} this gives \begin{equation} \int^{\infty}_{-\infty} x^2p(x)\,dx = x^2 - x^2 + C = C. \end{equation} However, many proofs online say that \begin{equation} \int^{\infty}_{-\infty} x^2p(x)\,dx = \sigma^2. \end{equation} Does anyone know the reason?
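For what it's worth, a quick numerical check does seem to agree with the $\sigma^2$ claim rather than with my result above. This is just a sketch using scipy.integrate.quad, with an arbitrary choice of $\sigma = 2$:

```python
# Numerical sanity check of  int x^2 p(x) dx  for the zero-mean normal
# density; sigma = 2 is an arbitrary choice for illustration.
import numpy as np
from scipy.integrate import quad

sigma = 2.0

def p(x):
    """Zero-mean normal density with standard deviation sigma."""
    return np.exp(-x**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

value, abs_err = quad(lambda x: x**2 * p(x), -np.inf, np.inf)
print(value, sigma**2)  # both are 4.0 (up to quadrature error)
```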
-
The way you use integration by parts for definite integrals is not permissible. For example, the first line after integration by parts makes no sense because the first term on the right is a function of $x$ and everything else is a constant. This method cannot be salvaged. – Kavi Rama Murthy Sep 05 '19 at 10:09
-
Thanks, what should I do to solve this integral? – Stephen Ge Sep 05 '19 at 10:20
1 Answer
By definition, if $X$ has density $p(x)$, then $EX^2=\int_{-\infty}^\infty x^2p(x)\,dx$. Since $EX=0$ here, we have $$\int_{-\infty}^\infty x^2p(x)\,dx=EX^2=\operatorname{Var}(X)+(EX)^2=\operatorname{Var}(X)=\sigma^2.$$
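As a quick sanity check of this identity (just a sketch, not needed for the proof; $\sigma = 2$ and the sample size are arbitrary choices), you can estimate $EX^2$ by simulation:

```python
# Monte Carlo check that E[X^2] = Var(X) = sigma^2 for a zero-mean normal.
import numpy as np

rng = np.random.default_rng(0)   # fixed seed for reproducibility
sigma = 2.0
samples = rng.normal(loc=0.0, scale=sigma, size=1_000_000)

print(np.mean(samples**2))  # approximately 4.0 = sigma^2
```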
If you really need an integration method, here is one. \begin{align*} \int_{-\infty}^\infty x^2p(x)\,dx&=\frac1{\sqrt{2\pi}\sigma}\int_{-\infty}^\infty x^2e^{-\frac{x^2}{2\sigma^2}}\,dx\\ &=\frac1{\sqrt{2\pi}\sigma}\int_{-\infty}^\infty -\sigma^2x\,d\left(e^{-\frac{x^2}{2\sigma^2}}\right)\\ &=\frac1{\sqrt{2\pi}\sigma}\left(-\sigma^2xe^{-\frac{x^2}{2\sigma^2}}\Big|_{-\infty}^\infty+\sigma^2 \int_{-\infty}^\infty e^{-\frac{x^2}{2\sigma^2}}\,dx\right)\\ &=\sigma^2 \int_{-\infty}^\infty \frac1{\sqrt{2\pi}\sigma}e^{-\frac{x^2}{2\sigma^2}}\,dx\\ &=\sigma^2 \int_{-\infty}^\infty p(x)\,dx\\ &=\sigma^2. \end{align*} The boundary term $-\sigma^2xe^{-\frac{x^2}{2\sigma^2}}\Big|_{-\infty}^\infty$ vanishes because the exponential decays to $0$ faster than $x$ grows.
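If you prefer to let a computer algebra system confirm the same integral, here is a sketch assuming SymPy is available (the symbol names are mine):

```python
# Symbolic check that  int x^2 p(x) dx = sigma^2  using SymPy.
import sympy as sp

x = sp.symbols('x', real=True)
sigma = sp.symbols('sigma', positive=True)

# Zero-mean normal density with standard deviation sigma.
p = sp.exp(-x**2 / (2 * sigma**2)) / (sp.sqrt(2 * sp.pi) * sigma)
second_moment = sp.integrate(x**2 * p, (x, -sp.oo, sp.oo))

print(sp.simplify(second_moment))  # sigma**2
```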
Finally, the entropy (with the natural logarithm). Since $-\log p(x)=\frac{x^2}{2\sigma^2}+\log(\sqrt{2\pi}\sigma)$, \begin{align*} H&=-\int_{-\infty}^\infty p(x)\log p(x)\,dx\\ &=\frac1{2\sigma^2}\int_{-\infty}^\infty x^2p(x)\,dx+\log(\sqrt{2\pi}\sigma)\int_{-\infty}^\infty p(x)\,dx\\ &=\frac12+\log(\sqrt{2\pi}\sigma)\\ &=\log(\sqrt{2\pi e}\,\sigma). \end{align*}
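A final numerical check of the entropy formula, again only a sketch with the arbitrary choice $\sigma = 2$: compare a direct quadrature of $-\int p\log p\,dx$ with the closed form $\log(\sqrt{2\pi e}\,\sigma)$ and with SciPy's built-in value.

```python
# Numerical check that  -int p(x) log p(x) dx = log(sqrt(2*pi*e)*sigma).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

sigma = 2.0

def p(x):
    """Zero-mean normal density with standard deviation sigma."""
    return np.exp(-x**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

# Integrate over [-20*sigma, 20*sigma]; the tails beyond are negligible,
# and a finite range avoids log(0) from floating-point underflow far out.
h_quad, _ = quad(lambda x: -p(x) * np.log(p(x)), -20 * sigma, 20 * sigma)
h_closed = np.log(np.sqrt(2 * np.pi * np.e) * sigma)
h_scipy = norm(scale=sigma).entropy()

print(h_quad, h_closed, float(h_scipy))  # all approximately 2.112
```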
-
Many thanks, Feng! Would you mind having a look at the multivariate version of this problem? https://math.stackexchange.com/questions/3346990/entropy-maximization-in-the-multivariate-normal-distribution – Stephen Ge Sep 07 '19 at 07:02
-
@StephenGe I'm sorry, but that question had already been closed when I looked into it. I can only find a proof for that one using multivariable calculus. If you really want to know, you can leave an e-mail here. – Feng Sep 07 '19 at 13:19
-
Thank you very much, Feng! That will be very helpful! stephen.ge@hotmail.com – Stephen Ge Sep 07 '19 at 16:34