
For $1/2<\alpha\le 1$ show that $$\sum_{i=\lceil\alpha n\rceil}^n {n\choose i}\le 2^{nH(\alpha)}$$ where $H(\alpha)=-\alpha\log_{2}\alpha - (1-\alpha)\log_2 (1-\alpha)$ is the entropy.

I'm at a loss for this one. I can see that equality occurs when $\alpha=1$ (both sides equal $1$). I also see that both sides increase as $\alpha \downarrow 1/2$, and I think the point is that the RHS increases at a faster rate. The LHS changes discontinuously, increasing only when $\alpha n$ crosses an integer, so if I can show that the RHS is greater than the LHS at those integer points, then the inequality follows. But I have no idea how I would end up with an expression like $H(\alpha)$.

Also, as $\alpha \downarrow 1/2$ the RHS converges to $2^n$, which equals the sum of all the binomial coefficients, so in some sense this upper bound is not very sharp.
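For what it's worth, a quick brute-force check in Python (a minimal sketch; `binary_entropy` is just a helper I defined, and `math.comb` needs Python 3.8+) confirms the bound for small $n$:

```python
import math

def binary_entropy(a):
    """Binary entropy H(a) in bits, with H(1) = 0 by convention."""
    if a == 1.0:
        return 0.0
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

# Brute-force check of  sum_{i = ceil(a*n)}^{n} C(n, i) <= 2^(n H(a))
# for small n and a grid of alpha in (1/2, 1].
for n in range(1, 41):
    for k in range(1, 101):
        a = (100 + k) / 200          # alpha ranges over (0.5, 1.0]
        lhs = sum(math.comb(n, i) for i in range(math.ceil(a * n), n + 1))
        rhs = 2 ** (n * binary_entropy(a))
        assert lhs <= rhs + 1e-9, (n, a, lhs, rhs)

print("bound holds on all tested (n, alpha) pairs")
```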

Any help is appreciated!

user160738
  • In https://en.wikipedia.org/wiki/Binomial_distribution#Tail_Bounds you will find, for this type of bound, a reference to R. Arratia and L. Gordon, "Tutorial on large deviations for the binomial distribution", Bulletin of Mathematical Biology 51(1) (1989), 125–131. – Lutz Lehmann Jan 20 '16 at 09:42
  • Related? https://math.stackexchange.com/questions/1548940/sharper-lower-bounds-for-binomial-chernoff-tails – Lutz Lehmann Jan 20 '16 at 09:44

1 Answer


I will use the properties of entropy to address this question.

Let's take the logarithm (base $2$) of both sides:

$$\log_2 \sum_{i=\lceil \alpha n \rceil}^n \binom{n} i \leq nH(\alpha).$$

Let's give a meaning to the term on the left-hand side of the inequality above. Let $A$ denote the set of all subsets of $\left\{ 1, \ldots, n \right\}$ of cardinality at least $\lceil \alpha n \rceil$, and let $X$ be a random variable distributed uniformly over $A$; then the term on the left is just the entropy $H(X)$.
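Explicitly, the entropy of a uniform distribution over a finite set is the logarithm of its size:

$$H(X)=\sum_{a\in A}\frac{1}{|A|}\log_2|A|=\log_2|A|=\log_2\sum_{i=\lceil\alpha n\rceil}^n\binom{n}{i}.$$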

Hence, I have reduced the problem to showing that $H(X) \leq nH(\alpha)$, where $X$ is defined in the paragraph above.

Next, let's view $X$ as a joint distribution $(X_1,X_2,\ldots, X_n)$, where $X_i=1$ if element $i$ is in the selected subset and $X_i=0$ otherwise. By the subadditivity property of entropy, we know that $H(X_1,\ldots, X_n) \leq \sum_{i=1}^n H(X_i)$.
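For completeness, subadditivity follows from the chain rule for entropy together with the fact that conditioning cannot increase entropy:

$$H(X_1,\ldots,X_n)=\sum_{i=1}^n H(X_i\mid X_1,\ldots,X_{i-1})\leq\sum_{i=1}^n H(X_i).$$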

Hence, it suffices to prove that $$\sum_{i=1}^n H(X_i) \leq nH(\alpha).$$

Now, by symmetry (any permutation of $\{1,\ldots,n\}$ maps $A$ to itself), the $X_i$ are identically distributed. The inequality thus further reduces to $H(X_i) \leq H(\alpha)$, and at this point we are just dealing with binary entropy.

Let $p=\mathrm{Pr}(X_i=1)$. From the definition of $A$, we know that $p \geq \alpha$ (see the computation below). Since we are given that $\alpha > \frac{1}{2}$, and the binary entropy $H$ is decreasing on $[\frac{1}{2},1]$, $$H(X_i)=H(p) \leq H(\alpha),$$ which completes the proof.
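Both facts used here are quick to verify. Every set in $A$ has at least $\lceil \alpha n \rceil$ elements, so by linearity of expectation and symmetry,

$$np=\sum_{i=1}^n\mathrm{Pr}(X_i=1)=\mathbb{E}\left[\sum_{i=1}^n X_i\right]\geq\lceil\alpha n\rceil\geq\alpha n,$$

hence $p \geq \alpha$. And $H$ is decreasing on $[\frac{1}{2},1]$ because $H'(p)=\log_2\frac{1-p}{p}\leq 0$ for $p\geq\frac{1}{2}$.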

Siong Thye Goh