
The method of types tells us that for i.i.d. $0/1$ random variables $X_1, \dots, X_n$ with $\text{Pr}[X_i = 1] = p$ and for every $\epsilon > 0$, $$\text{Pr}\left[\sum_{i=1}^n X_i \ge (p + \epsilon) n\right] \ge \frac{2^{-D(p + \epsilon \| p)n}}{n+1},$$ where $D(x \| y)$ denotes the KL divergence between Bernoulli distributions with parameters $x$ and $y$.

Is there a tighter bound that gets rid of the $n+1$ in the denominator? That is, can one prove the following? $$\text{Pr}[\sum_{i=1}^n X_i \ge (p + \epsilon) n] \ge 2^{-D(p + \epsilon \| p)n}$$

I am particularly interested in the case of $p = 1/2$.
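As a quick numerical sanity check for $p = 1/2$ (a Python sketch; the helper names `kl` and `tail` are my own), one can compare the exact binomial tail against both the $2^{-D(p+\epsilon\|p)n}/(n+1)$ bound and the hoped-for $2^{-D(p+\epsilon\|p)n}$:

```python
from math import ceil, comb, log2

def kl(x, y):
    """Binary KL divergence D(x || y), in bits."""
    return x * log2(x / y) + (1 - x) * log2((1 - x) / (1 - y))

def tail(n, k):
    """Exact Pr[Binomial(n, 1/2) >= k]."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

p, eps = 0.5, 0.1
for n in (20, 100, 500):
    k = ceil((p + eps) * n)
    d = kl(p + eps, p)
    # Columns: exact tail, types bound 2^{-Dn}/(n+1), conjectured bound 2^{-Dn}
    print(n, tail(n, k), 2 ** (-d * n) / (n + 1), 2 ** (-d * n))
```

In these runs the exact tail sits strictly between the two quantities, which already suggests the $n+1$ cannot simply be dropped without some polynomial correction.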

  • You might want to have a look at https://math.stackexchange.com/questions/1548940/sharper-lower-bounds-for-binomial-chernoff-tails. Getting matching upper and lower bounds is not an easy task, but basically when $p$ and $\epsilon$ are bounded away from $0$ and $1$, the correct order is $2^{-D(p+\epsilon\|p)n}/\sqrt{n}$. – md5 Jun 02 '21 at 13:59
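A sketch of where the $\sqrt{n}$ order comes from, via Stirling's refinement of the type-counting estimate (see, e.g., Cover and Thomas, *Elements of Information Theory*): for $1 \le k \le n-1$, $$\binom{n}{k} \ge \frac{2^{n H(k/n)}}{\sqrt{8 k (1 - k/n)}},$$ where $H$ is the binary entropy in bits. Taking $k = \lceil (p+\epsilon) n \rceil$, writing $\alpha = k/n$, and keeping only the single term $\sum_i X_i = k$ of the tail, $$\text{Pr}\Big[\sum_{i=1}^n X_i \ge (p+\epsilon)n\Big] \ge \binom{n}{k} p^k (1-p)^{n-k} \ge \frac{2^{-D(\alpha \| p) n}}{\sqrt{8 n \alpha (1-\alpha)}},$$ using the identity $H(\alpha) + \alpha \log_2 p + (1-\alpha)\log_2 (1-p) = -D(\alpha \| p)$. Up to the rounding of $(p+\epsilon)n$ to an integer, this replaces the $n+1$ of the method of types by a factor of order $\sqrt{n}$, matching the order stated in the comment.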

0 Answers