
In another post an inequality referred to as "Etemadi's Inequality" is mentioned twice, once in the original post and once in the answer. However, the contexts of usage raise the question of whether the inequality the users (ziT and saz, respectively) had in mind is the inequality of the same name as featured on Wikipedia.

More specifically, according to the Wikipedia entry mentioned above, Etemadi's inequality is the following statement (proved as Theorem 22.5 on p. 288 of Billingsley's "Probability and Measure", 3rd ed. (John Wiley & Sons, 1995)).

If $X_1, X_2, \dots, X_n$ are independent real-valued random variables defined on some common probability space, then, setting $S_k := \sum_{i=1}^k X_i$ ($k \in \{1, 2, \dots, n\}$), the following holds for every $\alpha \geq 0$: $$ \mathbb{P}\left(\max_{k \in \{1, 2, \dots, n\}}\left|S_k\right| \geq 3\alpha\right) \leq 3 \max_{k \in \{1, 2, \dots, n\}} \mathbb{P}\left(\left|S_k\right|\geq \alpha\right) $$
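As a purely illustrative sanity check (not part of the question), both sides of this inequality can be estimated by Monte Carlo for i.i.d. standard normal summands; all parameter choices below are arbitrary.

```python
# Hypothetical numerical check of Etemadi's inequality for i.i.d. N(0,1)
# summands. The parameters n, trials, alpha are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n, trials, alpha = 10, 100_000, 2.0

# X: trials x n independent N(0,1) variables; S holds the partial sums S_k.
X = rng.standard_normal((trials, n))
S = np.cumsum(X, axis=1)

# Empirical P(max_k |S_k| >= 3*alpha) vs. 3 * max_k P(|S_k| >= alpha).
lhs = np.mean(np.max(np.abs(S), axis=1) >= 3 * alpha)
rhs = 3 * np.max(np.mean(np.abs(S) >= alpha, axis=0))

print(f"lhs = {lhs:.4f} <= rhs = {rhs:.4f}")
```

Since the factor 3 makes the bound quite loose, the empirical left-hand side typically sits far below the right-hand side.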

According to ziT,

Let $(X_t)_{t \in [0,\infty)}$ be a Lévy process in $\mathbb{R}^d$ ($d \in \{1, 2, \dots\}$) and set $X^*_t := \sup_{s\in [0,t]}\left|X_s\right|$. If $b>0$ is such that $P[X_{t}^{*}\leq b/2]>0$, then by Etemadi's inequality, for every $a > 0$: $$ P[X_{t}^{*}>a+b]\leq \frac{P[|X_{t}|>a]}{P[X_{t}^{*}\leq b/2]} $$

According to saz,

Let $(X_t)_{t \in [0,\infty)}$ be a Lévy process in $\mathbb{R}^d$ ($d \in \{1, 2, \dots\}$) and set $X^*_t := \sup_{s\in [0,t]}\left|X_s\right|$. If $b>0$ is such that $\mathbb{P}(X_t^* \leq b/2)>0$, then by Etemadi's inequality, applied with $a=(k-1)b$, $k \in \{1, 2, \dots\}$: $$ \mathbb{P}(X_t^* > k b) \leq c \mathbb{P}(|X_t| > (k-1)b) $$ with $c:= 1/\mathbb{P}(X_t^* \leq b/2)$.

My question is:

Can the Wikipedia version of Etemadi's inequality be used to derive the conclusions mentioned by ziT and by saz? If not, what might be the proposition referred to as "Etemadi's inequality" by ziT and by saz?

Evan Aad

1 Answer


The inequality which both ziT and I used is a direct consequence of the following inequality.

Lemma 1: Let $X_1,\ldots,X_n$ be independent random variables and $S_k := \sum_{j=1}^k X_j$, $k=1,\ldots,n$. Then for any $a,b \geq 0$ $$\mathbb{P} \left( \max_{1 \leq j \leq n} |S_j|>a+b \right) \leq \frac{\mathbb{P}(|S_n|>a)}{\mathbb{P} \left( \max_{1 \leq j \leq n} |S_j| \leq b/2 \right)}. \tag{1}$$ Here (and throughout this answer) we use the convention $1/0=\infty$.

The proof is very similar to the proof of Etemadi's inequality, but as far as I can see Lemma 1 is not a direct consequence of Etemadi's inequality (see the remark below).

Proof: Fix $a,b \geq 0$. For the disjoint sets $$A_j := \left\{ \max_{1 \leq k <j} |S_k| \leq a+b, |S_j| > a+b \right\}, \qquad j=1,\ldots,n$$ we have $$ \left\{ \max_{1 \leq j \leq n} |S_j| > a+b \right\} = \bigcup_{j=1}^n A_j.$$ On $A_j \cap \{|S_n| \leq a\}$ we have $|S_n - S_j| \geq |S_j| - |S_n| > b$; consequently, since $A_j$ and $S_n-S_j$ are independent, $$\begin{align*} \mathbb{P}\left( \max_{1 \leq j \leq n} |S_j| > a+b \right) &\leq \mathbb{P}(|S_n| > a) + \sum_{j=1}^{n-1} \mathbb{P}(A_j \cap \{|S_n| \leq a\}) \\ &\leq \mathbb{P}(|S_n| > a) + \sum_{j=1}^{n-1} \mathbb{P}(A_j) \mathbb{P}(|S_n-S_j|>b) \\ &\leq \mathbb{P}(|S_n| > a) + \mathbb{P}\left( \max_{1 \leq j \leq n} |S_j| > a+b \right) \max_{1 \leq j \leq n} \mathbb{P}(|S_n-S_j|>b) \tag{2} \end{align*}$$ Hence,

$$ \mathbb{P}\left( \max_{1 \leq j \leq n} |S_j| > a+b \right) \left(1-\max_{1 \leq j \leq n} \mathbb{P}(|S_n-S_j|>b) \right) \leq \mathbb{P}(|S_n| > a). \tag{3}$$

As

$$\max_{1 \leq j \leq n} \mathbb{P}(|S_n-S_j|>b) \leq \mathbb{P} \left( \max_{1 \leq j \leq n} |S_j| > \frac{b}{2} \right)$$

we have

$$1- \max_{1 \leq j \leq n} \mathbb{P}(|S_n-S_j|>b) \geq \mathbb{P} \left( \max_{1 \leq j \leq n} |S_j| \leq \frac{b}{2} \right).$$

Using this estimate and $(3)$, the claim follows.
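A hypothetical Monte Carlo check of Lemma 1 (a sketch, not a proof) can be run for i.i.d. standard normal summands; the parameters $a$, $b$, $n$ below are illustrative and chosen so that the denominator is bounded away from zero.

```python
# Illustrative numerical check of inequality (1) in Lemma 1 for i.i.d.
# N(0,1) summands; a, b, n, trials are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
n, trials = 10, 100_000
a, b = 2.0, 4.0

S = np.cumsum(rng.standard_normal((trials, n)), axis=1)
running_max = np.max(np.abs(S), axis=1)

lhs = np.mean(running_max > a + b)      # P(max_j |S_j| > a+b)
num = np.mean(np.abs(S[:, -1]) > a)     # P(|S_n| > a)
den = np.mean(running_max <= b / 2)     # P(max_j |S_j| <= b/2)

print(f"lhs = {lhs:.4f} <= num/den = {num / den:.4f}")
```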

Remark: If we use in $(2)$ the estimate

$$\max_{1 \leq j \leq n} \mathbb{P}(|S_n-S_j| >b) \leq 2 \max_{1 \leq j \leq n} \mathbb{P}(|S_j|>b/2)$$ and the fact that $$\mathbb{P} \left( \max_{1 \leq j \leq n} |S_j|>a+b \right) \leq 1,$$ then we get Etemadi's inequality (choose $a= \alpha$, $b=2\alpha$): indeed, $(2)$ then yields $$\mathbb{P} \left( \max_{1 \leq j \leq n} |S_j|>3\alpha \right) \leq \mathbb{P}(|S_n|>\alpha) + 2 \max_{1 \leq j \leq n} \mathbb{P}(|S_j|>\alpha),$$ and $\mathbb{P}(|S_n|>\alpha) \leq \max_{1 \leq j \leq n} \mathbb{P}(|S_j|>\alpha)$, so that

$$\mathbb{P} \left( \max_{1 \leq j \leq n} |S_j|>3\alpha \right) \leq 3 \max_{1 \leq j \leq n} \mathbb{P}(|S_j| > \alpha).$$

Lemma 2: Let $(X_t)_{t \geq 0}$ be a Lévy process and $X_t^* := \sup_{s \leq t} |X_s|$ the running maximum. Then for any $a,b>0$ and $t>0$

$$\mathbb{P}(X_t^* > a+b) \leq \frac{\mathbb{P}(|X_t|>a)}{\mathbb{P}(X_t^* \leq b/2)}.$$

Proof: For fixed $n \in \mathbb{N}$ and $t>0$ set $t_j := t \frac{j}{2^n}$, $j \in \{0,\ldots,2^n\}$. The random variables $(X_{t_j}-X_{t_{j-1}})_{j=1,\ldots,2^n}$ are independent, and therefore it follows from Lemma 1 that

$$\mathbb{P}(X_{(n)}^* > a+b) \leq \frac{\mathbb{P}(|X_t|>a)}{\mathbb{P}(X_{(n)}^* \leq b/2)}$$

where $X_{(n)}^* := \max_{1 \leq j \leq 2^n} |X_{t_j}|$. Since $(X_t)_{t \geq 0}$ has (almost surely) càdlàg sample paths, the claim follows by letting $n \to \infty$.
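As a hypothetical sanity check of Lemma 2, one can simulate Brownian motion (a Lévy process) on a fine grid, mirroring the discretisation used in the proof; the grid approximation of $X_t^*$ and all parameters below are illustrative.

```python
# Illustrative check of Lemma 2 for Brownian motion on [0, t], using a
# fine grid to approximate the running supremum X_t^*. The parameters
# t, steps, trials, a, b are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(2)
t, steps, trials = 1.0, 512, 20_000
a, b = 1.0, 2.0

# Independent Gaussian increments with variance t/steps; X holds the path.
dX = rng.standard_normal((trials, steps)) * np.sqrt(t / steps)
X = np.cumsum(dX, axis=1)
X_star = np.max(np.abs(X), axis=1)   # grid approximation of sup_{s<=t} |X_s|

lhs = np.mean(X_star > a + b)
rhs = np.mean(np.abs(X[:, -1]) > a) / np.mean(X_star <= b / 2)

print(f"lhs = {lhs:.4f} <= rhs = {rhs:.4f}")
```

Refining the grid (larger `steps`) corresponds to letting $n \to \infty$ in the dyadic argument above.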

Lemma 3: Let $(X_t)_{t \geq 0}$ be a Lévy process and $X_t^* := \sup_{s \leq t} |X_s|$. Then $$\mathbb{P}(X_t^* > kb) \leq c \mathbb{P}(|X_t|>(k-1)b)$$ for any $b > 0$ and $k \in \{2,3,\ldots\}$, where $c := 1/\mathbb{P}(X_t^* \leq b/2)$.

Proof: This follows directly from Lemma 2 applied with $a:= (k-1)b$.

saz
  • 123,507
  • Thank you. I apologize for my delayed reply; I've only gotten to read it now. The proofs are very understandable. Just a couple comments. (1) The use of lemma 2 in the proof of lemma 3 is only justified as long as $(k-1)b>0$. In order for this to hold, $k$ must be stipulated to be $\geq 2$. One way to avoid this caveat is to allow $a$ to be zero in the conditions of lemmas 1 and 2. In fact, there's no harm in allowing $a$, $b$ and $k$ to be any real number without any constraints: all three statements as well as proofs will remain valid. – Evan Aad Jul 16 '16 at 15:30
  • (2) The conditions of lemmas 1 and 2 should stipulate that the denominator on the right hand side not be zero. Likewise the denominator in the definition of $c$ in lemma 3. – Evan Aad Jul 16 '16 at 15:30
  • @EvanAad Thanks a lot for your feedback :). I'm going to incorporate your comments in the next days. – saz Jul 16 '16 at 17:22
  • What is the argument for the "Remark"? I've basically asked that question here: https://math.stackexchange.com/questions/3704769/proving-etemadis-inequality – dmh Jun 04 '20 at 02:47