
This is a problem I encountered while studying high-dimensional statistics; it concerns a concentration inequality for bounded random variables.

The question is: Consider a sequence of independent random variables $(X_1,\cdots,X_n)$ that satisfy $0\leq X_k\leq b$ for each $k$. Let $Z=\sum_{k=1}^n X_k$. Show that \begin{align*} \mathbb{P}[Z\geq (1+t)\mathbb{E}[Z]] &\leq \left(\frac{e^t}{(1+t)^{(1+t)}}\right)^{\mathbb{E}[Z]/b} \ \text{for}\ t>0,\\ \mathbb{P}[Z\leq (1-t)\mathbb{E}[Z]] &\leq \left(\frac{e^t}{(1-t)^{(1-t)}}\right)^{\mathbb{E}[Z]/b} \ \text{for}\ t\in (0,1). \end{align*}

I tried to use Hoeffding's inequality to prove it. Since $0\leq X_k\leq b$ for each $k$, the sum $Z$ is $\frac{nb^2}{4}$-sub-Gaussian, so Hoeffding's inequality gives $\mathbb{P}[Z\geq (1+t)\mathbb{E}[Z]]=\mathbb{P}[Z-\mathbb{E}[Z]\geq t\,\mathbb{E}[Z]]\leq \exp\left(-\frac{2t^2 \mathbb{E}[Z]^2}{nb^2}\right)$.

The right-hand side of the desired inequality, however, is independent of $n$, and it contains the term $(1+t)^{1+t}$; I don't see how to obtain a bound of that form.
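To see concretely how much weaker the Hoeffding bound is, here is a quick numerical comparison of the two bounds (the parameter values are illustrative, my own choice, not part of the problem):

```python
# Compare the Hoeffding bound exp(-2 t^2 E[Z]^2 / (n b^2)) with the
# multiplicative Chernoff-type bound (e^t / (1+t)^(1+t))^(E[Z]/b).
# Illustrative parameters: many summands but a small mean, e.g.
# X_k Bernoulli with a small success probability.
import math

n, b, t = 1000, 1.0, 1.0
mean_Z = 10.0  # E[Z]

hoeffding = math.exp(-2 * (t * mean_Z) ** 2 / (n * b**2))
chernoff = (math.exp(t) / (1 + t) ** (1 + t)) ** (mean_Z / b)

print(f"Hoeffding: {hoeffding:.3f}")  # ≈ 0.819 (nearly vacuous)
print(f"Chernoff:  {chernoff:.3f}")   # ≈ 0.021
```

When $\mathbb{E}[Z]\ll nb$, the Hoeffding exponent shrinks with $n$, while the target bound does not involve $n$ at all.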

  • What inequalities do you know? Like, Azuma's inequality? Hoeffding's inequality? Chernoff? I don't think the inequality you have used (bounded case concentration) is good enough. – Sarvesh Ravichandran Iyer Apr 02 '24 at 06:22
  • I'm sorry I didn't make it clear which inequalities I know. In fact, I have tried to use Markov's inequality, Hoeffding's inequality, and inequalities about bounded random variables. However, I am not quite sure about the derivation of the term $(1+t)^{1+t}$. About the Chernoff inequality, since the problem did not specify that $X_1,\dots,X_n$ are independent and identically distributed, I did not consider using it. – Joker Apr 02 '24 at 06:43
  • The answer here is kind of all over the place, but see what you can make of it? It does have the same question but I have to see if that's enough for you to get the answer. – Sarvesh Ravichandran Iyer Apr 02 '24 at 06:57
  • Here are some attempts I made: since $0\leq X_k\leq b$ and $Z=\sum_{k=1}^n X_k$, $Z$ is $\frac{nb^2}{4}$-sub-Gaussian. By Hoeffding's inequality, $\mathbb{P}[Z\geq (1+t)\mathbb{E}[Z]]\leq \exp \left(-\frac{2t^2\mathbb{E}[Z]^2}{nb^2}\right)$. But I'm a bit lost on what to do next. – Joker Apr 02 '24 at 07:18
  • Welcome to MSE @Joker, please consider adding your attempts to the body of the question so that it avoids getting closed for lack of context (although, in my opinion, the question doesn't deserve to be closed as-is). – Stratos supports the strike Apr 02 '24 at 08:06
  • Thank you! I've changed it. – Joker Apr 02 '24 at 10:56
  • Joker, please have a look at the edit and the complete solution of questions 1 and 2. – Letac Gérard Apr 03 '24 at 19:34

1 Answer


Without loss of generality take $b=1$. Let $X_1,\ldots,X_n$ be independent random variables valued in $[0,1]$, let $Z=X_1+\cdots+X_n$, $m=E(Z)$, and $a>1$. We show that $$\Pr(Z\geq am)\leq \left (\frac{e^{a-1}}{a^a}\right)^m\ \ (*).$$

To prove it, for fixed $x\in[0,1]$ and $s>0$ we observe that $f_x(s)=(1-x)+xe^s-e^{sx}\geq 0.$ This comes from the fact that $f_x(0)=0$ and $f_x'(s)=x(e^s-e^{sx})\geq 0$, since $sx\leq s$. As a consequence, if $m_i=E(X_i)$, we have for $s>0$ $$E(e^{sX_i})\leq E(1-X_i+X_i e^s)=1+m_i(e^s-1)\leq \exp (m_i(e^s-1)).$$ Markov's inequality then says that for all $s>0$ $$\Pr(Z\geq am)=\Pr(e^{sZ}\geq e^{ams})\leq e^{-ams}E(e^{sZ})\leq \prod_{i=1}^n\exp[ -am_i s+m_i(e^s-1)],$$ and choosing $s=\log a$ (which is positive since $a>1$) gives (*). With $a=1+t$, this is exactly the first stated bound for $b=1$.
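The reduction behind "without loss of generality $b=1$" is just a rescaling; spelled out:

```latex
% Reduction to b = 1: set Y_k = X_k / b, so that 0 <= Y_k <= 1 and
% E[Y_1 + ... + Y_n] = m/b.  Applying (*) to the Y_k with a = 1 + t:
\Pr\!\left[Z \ge (1+t)m\right]
  = \Pr\!\Big[\sum_{k=1}^{n} Y_k \ge (1+t)\,\frac{m}{b}\Big]
  \le \left(\frac{e^{t}}{(1+t)^{1+t}}\right)^{m/b},
% which is the first inequality in the question, with m = E[Z].
```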

For the second part, the bound cannot be right as stated: for $t\in(0,1)$ we have $e^t>1$ and $(1-t)^{1-t}<1$, so the right-hand side exceeds $1$ and the bound is vacuous. Presumably the numerator should be $e^{-t}$. The corrected bound $\Pr(Z\leq (1-t)m)\leq \left(\frac{e^{-t}}{(1-t)^{1-t}}\right)^m$ follows from the same argument: the inequality $E(e^{sX_i})\leq \exp(m_i(e^s-1))$ holds for $s<0$ as well (the pointwise bound $e^{sx}\leq (1-x)+xe^s$ for $x\in[0,1]$ is just convexity of $x\mapsto e^{sx}$), so for $s<0$ $$\Pr(Z\leq (1-t)m)=\Pr(e^{sZ}\geq e^{(1-t)ms})\leq e^{-(1-t)ms}\prod_{i=1}^n e^{m_i(e^s-1)},$$ and choosing $s=\log(1-t)<0$ gives the bound.
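The upper-tail bound (*) can also be sanity-checked numerically; a minimal Monte Carlo sketch with $b=1$ and illustrative Bernoulli summands (the parameters are my own choice):

```python
# Monte Carlo check of (*): Pr(Z >= a*m) <= (e^(a-1) / a^a)^m, with b = 1.
# Z is a sum of n Bernoulli(p) variables, so m = E[Z] = n*p.
import math
import random

random.seed(0)
n, p = 200, 0.05          # m = 10
m = n * p
a = 2.0                   # i.e. t = 1 in the question's notation

bound = (math.exp(a - 1) / a**a) ** m   # (e/4)^10 ≈ 0.021

trials = 20_000
hits = 0
for _ in range(trials):
    z = sum(1 for _ in range(n) if random.random() < p)
    if z >= a * m:
        hits += 1
empirical = hits / trials

print(f"empirical tail ≈ {empirical:.4f}, Chernoff bound ≈ {bound:.4f}")
```

As expected, the empirical tail probability comes out well below the bound, which is not tight for these parameters.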