4

Given an Itô SDE $$dX_t=a(t,X_t)\,dt+b(t,X_t)\,dW_t$$ whose coefficients satisfy the Lipschitz condition and the linear growth condition, I want to prove $$E\left[\sup_{t_0\le s \le T} |X_s|^{2n}\right]\le D \cdot \left(E[|X_{t_0}|^{2n}]+(1+E[|X_{t_0}|^{2n}]) \cdot (T-t_0)^n \exp(C(T-t_0))\right)$$ for some constant $D$. My idea is to use Doob's inequality. However, the Doob inequality is only valid for the time range $[0,T]$, and here we have $[t_0,T]$.

Maybe the following theorem will be helpful:

$$E[|X_t|^{2n}] \le (1+E[|X_{t_0}|^{2n}]) \exp(C(t-t_0))$$ and $$E[|X_t-X_{t_0}|^{2n}]\le D \cdot (1+E[|X_{t_0}|^{2n}]) \cdot (t-t_0)^n \cdot \exp(C(t-t_0))$$

saz
  • 123,507
  • Since $(X_t)_{t \geq 0}$ is, in general, not a (sub)martingale you cannot apply Doob's inequality. – saz Feb 27 '17 at 15:12
  • @saz But I can decompose the $|X_s|$ term somehow so that I have a deterministic term plus a stochastic integral; then I can apply Doob's inequality to the stochastic integral. I just don't know how to decompose it – quallenjäger Feb 27 '17 at 15:20
  • Note that the drift term $a(t,X_t)\,dt$ is not deterministic. So in general there is no such decomposition. – saz Feb 27 '17 at 15:21
  • @saz How am I supposed to do it? My thought was to bring it into the form $blablabla \cdot dt + \sup blablabla \, dW_t$, where I can eliminate the sup in the $dt$ term and leave it in the $dW_t$ term – quallenjäger Feb 27 '17 at 15:25
  • Do you have a reference for the second theorem, i.e., moment estimate of $|X_t - X_{t_0}|$? – xan Apr 09 '21 at 09:59
  • @quallenjäger Could you elaborate on where you took those results from? – Akira Apr 14 '23 at 08:35

1 Answer

5

Let us first assume that $X_{t_0}=0$. By Itô's formula, we have

$$\begin{align*} &|X_t|^{2n} \\ &= 2n \int_{t_0}^t |X_s|^{2n-2} X_s \, b(s,X_s) \, dW_s +n\int_{t_0}^t |X_s|^{2n-2} \left( (2n-1) b(s,X_s)^2 + 2 a(s,X_s) X_s \right) \, ds \\ &=: M_t + N_t.\end{align*}$$
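In detail: since the equation is scalar, $|x|^{2n}=x^{2n}$, and Itô's formula applied to $f(x)=x^{2n}$, with $f'(x)=2n\,x^{2n-1}$ and $f''(x)=2n(2n-1)\,x^{2n-2}$, gives

$$d\,|X_t|^{2n} = 2n\, X_t^{2n-1}\, b(t,X_t)\, dW_t + \left( 2n\, X_t^{2n-1}\, a(t,X_t) + n(2n-1)\, X_t^{2n-2}\, b(t,X_t)^2 \right) dt;$$

the initial term $|X_{t_0}|^{2n}$ vanishes because we assumed $X_{t_0}=0$.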

Clearly,

$$\mathbb{E} \left( \sup_{t_0 \leq t \leq T} |X_t|^{2n} \right) \leq \mathbb{E} \left( \sup_{t_0 \leq t \leq T} |M_t| \right) + \mathbb{E} \left( \sup_{t_0 \leq t \leq T} |N_t| \right). \tag{1}$$

We estimate both terms separately. For the first term, since $(M_t)_{t_0 \leq t \leq T}$ is a martingale, Doob's $L^2$ maximal inequality and Itô's isometry yield

$$\mathbb{E} \left( \sup_{t_0 \leq t \leq T} |M_t| \right) \leq \sqrt{\mathbb{E} \left( \sup_{t_0 \leq t \leq T} M_t^2 \right)} \leq 2\sqrt{\mathbb{E}(M_T^2)} = 4n \sqrt{\int_{t_0}^T \mathbb{E}(|X_s|^{4n-2} b(s,X_s)^2) \, ds}.$$
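Here the first inequality is Cauchy-Schwarz (or Jensen), the second is Doob's $L^2$ maximal inequality $\mathbb{E}(\sup_{t_0 \leq t \leq T} M_t^2) \leq 4\,\mathbb{E}(M_T^2)$, and the equality is Itô's isometry,

$$\mathbb{E}(M_T^2) = 4n^2 \int_{t_0}^T \mathbb{E}\left(|X_s|^{4n-2}\, b(s,X_s)^2\right) ds,$$

which accounts for the factor $4n$.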

Since $b$ is of at most linear growth, we find that there exists a constant $C_1>0$ such that

$$\mathbb{E} \left( \sup_{t_0 \leq t \leq T} |M_t| \right) \leq C_1 \sqrt{\int_{t_0}^T \mathbb{E}(1+|X_s|^{4n}) \, ds}.$$
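In more detail: writing the linear growth condition as, say, $b(s,x)^2 \leq K\,(1+|x|^2)$ with some constant $K>0$ (the exact form of the constant does not matter), we get

$$|X_s|^{4n-2}\, b(s,X_s)^2 \leq K \left(|X_s|^{4n-2} + |X_s|^{4n}\right) \leq 2K \left(1 + |X_s|^{4n}\right),$$

where the last step uses the elementary estimate $|x|^{k} \leq 1 + |x|^{4n}$ for $0 \leq k \leq 4n$; this is exactly the point discussed in the comments below.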

Now you can use the upper bound for $\mathbb{E}(|X_s-X_{t_0}|^{4n})= \mathbb{E}(|X_s|^{4n})$ (the second estimate quoted at the end of your question, applied with $2n$ replaced by $4n$; recall that $X_{t_0}=0$) to obtain a suitable estimate for the first term on the right-hand side of $(1)$. For the second term, we note that by the linear growth condition

$$\mathbb{E} \left( \sup_{t_0 \leq t \leq T} |N_t| \right) \leq C_2 \mathbb{E} \left[ \int_{t_0}^T (1+|X_s|^{2n}) \, ds \right].$$
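In more detail: writing, say, $|a(s,x)| \leq K\,(1+|x|)$ and $b(s,x)^2 \leq K\,(1+|x|^2)$ as before, the integrand of $N_t$ satisfies

$$n\,|X_s|^{2n-2}\left((2n-1)\, b(s,X_s)^2 + 2\, |a(s,X_s)|\, |X_s|\right) \leq n K \left((2n-1)\left(|X_s|^{2n-2}+|X_s|^{2n}\right) + 2\left(|X_s|^{2n-1}+|X_s|^{2n}\right)\right) \leq C_2 \left(1+|X_s|^{2n}\right)$$

by the same elementary estimate as above; since $\sup_{t_0 \leq t \leq T} |N_t|$ is bounded by the integral of the absolute value of the integrand over $[t_0,T]$, taking expectations gives the displayed bound.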

Again, the upper bound for $\mathbb{E}(|X_s-X_{t_0}|^{2n}) = \mathbb{E}(|X_s|^{2n})$ provides a suitable estimate for this term, and this proves the desired inequality in the case $X_{t_0}=0$.
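For instance, a sketch of this substitution (using $X_{t_0}=0$, and assuming the quoted moment estimate also holds for the $4n$-th moment, with constants that I will simply call $D'$ and $C'$): since $\mathbb{E}(|X_s|^{4n}) \leq D'\,(s-t_0)^{2n} e^{C'(s-t_0)}$ and $\mathbb{E}(|X_s|^{2n}) \leq D'\,(s-t_0)^{n} e^{C'(s-t_0)}$, a crude integration gives

$$\mathbb{E}\left(\sup_{t_0\leq t\leq T}|M_t|\right) \leq C_1\sqrt{(T-t_0) + D'\,(T-t_0)^{2n+1}\, e^{C'(T-t_0)}}, \qquad \mathbb{E}\left(\sup_{t_0\leq t\leq T}|N_t|\right) \leq C_2\left((T-t_0) + D'\,(T-t_0)^{n+1}\, e^{C'(T-t_0)}\right),$$

and $(1)$ then bounds $\mathbb{E}(\sup_{t_0\leq t\leq T}|X_t|^{2n})$ by the sum of these two expressions.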

For the general case, i.e. if $X_{t_0} \neq 0$, just note that, by Hölder's inequality,

$$|X_t|^{2n} = |X_t-X_{t_0} + X_{t_0}|^{2n} \leq c |X_t-X_{t_0}|^{2n} + c |X_{t_0}|^{2n}$$

for a suitable constant $c>0$. Consequently,

$$\mathbb{E} \left( \sup_{t_0 \leq t \leq T}|X_t|^{2n} \right) \leq c \mathbb{E}(|X_{t_0}|^{2n}) + c \mathbb{E} \left( \sup_{t_0 \leq t \leq T} |X_t-X_{t_0}|^{2n} \right)$$

and now the claim follows from the first part of the proof.
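Regarding the constant $c$ above: one admissible choice is $c = 2^{2n-1}$, since by convexity of $x \mapsto x^{2n}$,

$$|x+y|^{2n} = 2^{2n}\left|\frac{x+y}{2}\right|^{2n} \leq 2^{2n} \cdot \frac{1}{2}\left(|x|^{2n}+|y|^{2n}\right) = 2^{2n-1}\left(|x|^{2n}+|y|^{2n}\right).$$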

saz
  • 123,507
  • Thanks! What did you do with $\sup |X_t-X_{t_0}|$? Is it a martingale? How did you get the sup away in the general case you mentioned at the end? – quallenjäger Mar 01 '17 at 20:02
  • @quallenjäger No, it's not a martingale. For the second term (in the general case) we can use the first part of the proof: if we set $Y_t := X_t-X_{t_0}$, then the first part of the proof gives the upper bound for $$\mathbb{E}(\sup_t |X_t-X_{t_0}|^{2n}) = \mathbb{E}(\sup_t |Y_t|^{2n}).$$ – saz Mar 01 '17 at 20:17
  • And if I use the linear growth condition: in the $M_t$ term, $|X_s|^{4n-2}b^2$ becomes $|X_s|^{4n-2}(1+|X_s|^2)$ inside the expectation. Then I will have $|X_s|^{4n-2}+|X_s|^{4n}$. How do you get $1+|X_s|^{4n}$? – quallenjäger Mar 01 '17 at 21:16
  • @quallenjäger If $|X_s(\omega)| \leq 1$ we have $$|X_s(\omega)|^{4n-2} + |X_s(\omega)|^{4n} \leq 1+ |X_s(\omega)|^{4n};$$ if $|X_s(\omega)|>1$, then $|X_s(\omega)|^{4n-2} \leq |X_s(\omega)|^{4n}$ and therefore $$|X_s(\omega)|^{4n-2} + |X_s(\omega)|^{4n} \leq 2 |X_s(\omega)|^{4n} \leq 2 (|X_s(\omega)|^{4n}+1).$$ This shows $$|X_s(\omega)|^{4n-2} + |X_s(\omega)|^{4n} \leq 2 (|X_s(\omega)|^{4n}+1)$$ for all $\omega$. – saz Mar 02 '17 at 06:52