
I need to prove that $ H\left(X|Y\right)=H\left(X\right)\Rightarrow X\perp Y $ where $X\perp Y $ means they are independent. $$ H\left(X|Y\right)=-\sum_{x}\sum_{y}P\left(x,y\right)\log_{2}P\left(x|y\right) $$ $$ H\left(X\right)=-\sum_{x}P\left(x\right)\log_{2}P\left(x\right) $$

How do I prove it? The other direction is easy, but I can't seem to find how to prove this one.

I know that $$ H\left(X|Y\right)=H\left(X,Y\right)-H\left(Y\right) $$ but it does not seem to help me make progress.

1 Answer


This depends on what you admit as known. A standard path is:

  1. Prove the log sum inequality (using Jensen's inequality):

$$\sum_i a_{i}\log\frac{a_{i}}{b_{i}} \ge A \log \frac{A}{B}$$

for any $a_i, b_i > 0$, where $A = \sum_i a_i$ and $B = \sum_i b_i$. Equality holds iff $a_i/b_i$ is the same for all $i$, i.e. iff $a_i = b_i \, A/B$.

  2. Using the above, show that the KL divergence between two distributions, $D(p\|q) = \sum_i p_i \log(p_i/q_i)$, is non-negative, and zero iff the distributions are identical.

  3. Show that the mutual information equals the KL divergence between the joint distribution and the product of the marginals: $$I(X;Y) = H(X)-H(X|Y) = D\big( p(X,Y)\,\|\, p(X)\, p(Y)\big)$$

  4. Show that $I(X;Y) \ge 0$, with equality iff $p(X,Y)=p(X)\,p(Y)$ (i.e., $X$ and $Y$ are independent).

  5. Finally, conclude that $H(X) \ge H(X|Y)$, with equality iff $X$ and $Y$ are independent.
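As a sketch of step 3, the identity follows by expanding the definitions and marginalizing ($p(x) = \sum_y p(x,y)$, and $p(x|y)/p(x) = p(x,y)/\big(p(x)\,p(y)\big)$):

$$
\begin{aligned}
H(X) - H(X|Y)
&= -\sum_{x,y} p(x,y)\log_2 p(x) + \sum_{x,y} p(x,y)\log_2 p(x|y) \\
&= \sum_{x,y} p(x,y)\log_2 \frac{p(x|y)}{p(x)} \\
&= \sum_{x,y} p(x,y)\log_2 \frac{p(x,y)}{p(x)\,p(y)} \\
&= D\big(p(X,Y)\,\|\,p(X)\,p(Y)\big).
\end{aligned}
$$

So $H(X|Y) = H(X)$ forces $D\big(p(X,Y)\,\|\,p(X)\,p(Y)\big) = 0$, and by the equality case in step 2 this holds iff $p(x,y) = p(x)\,p(y)$ for all $x,y$, which is exactly independence.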

leonbloy