
In the book "Ruin Probabilities" by Asmussen and Albrecher, Chapter II, Thm. 1.2, the following statement is made:

Let $\{X_t\}$ be a Lévy process and $\alpha\in\mathbb{R}$. If $\mathbb{E}[e^{\alpha X_t}]$ is finite for all $t>0$, then it holds $\mathbb{E}[e^{\alpha X_t}]=e^{t\kappa(\alpha)}$ for some $\kappa(\alpha)\in\mathbb{R}$ and the process $e^{\alpha X_t-t\kappa(\alpha)}$ is a martingale.

In the proof they mention you have to choose $\kappa(\alpha)=\log\mathbb{E}[e^{\alpha(X_1-X_0)}]$.

Now I see why the mentioned process is a martingale, but somehow I do not see why $\mathbb{E}[e^{\alpha X_t}]=e^{t\kappa(\alpha)}$ should hold for that specific choice of $\kappa$. For $t\in\mathbb{N}$ it's easy, but what about a general real $t>0$?
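To make the question concrete, here is a quick numerical sanity check for one particular Lévy process, a Brownian motion with drift (the parameter values are my own choice), showing the identity at a non-integer $t$:

```python
import numpy as np

# Sanity check for a concrete Lévy process (Brownian motion with drift):
# X_t = mu*t + sigma*W_t has E[exp(a*X_t)] = exp(t*(a*mu + a^2*sigma^2/2)),
# so kappa(a) = log E[exp(a*X_1)] = a*mu + a^2*sigma^2/2.
# We check E[exp(a*X_t)] ~ exp(t*kappa(a)) at a non-integer t by Monte Carlo.

rng = np.random.default_rng(0)
mu, sigma, a = 0.3, 0.8, 1.1
kappa = a * mu + 0.5 * a**2 * sigma**2

t = 0.37                                  # deliberately not an integer
n = 2_000_000
X_t = mu * t + sigma * np.sqrt(t) * rng.standard_normal(n)
mc = np.exp(a * X_t).mean()               # Monte Carlo estimate of E[exp(a*X_t)]
exact = np.exp(t * kappa)

print(f"Monte Carlo: {mc:.4f}, exact exp(t*kappa): {exact:.4f}")
```

The two values agree up to Monte Carlo error, but of course this only illustrates the claim for one process; the question is why it holds in general.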

Any hints are much appreciated!

    There is something odd about the initial condition, e.g. if, say $X_0=1$ and $t=0$ then $\mathbb{E}e^{\alpha X_t} \neq e^{t \kappa(\alpha)}$. Either you have to assume $X_0 =0$ or you need to replace $X_t$ by $X_t-X_0$. – saz Jun 19 '19 at 17:15
  • yeah I think so too. From the context, I'd assume it should read $X_t-X_0$. But still, how does the argument go for arbitrary $t$? – Barkas Jun 19 '19 at 17:20

1 Answer


Fix $\alpha>0$. Define, for all $y\geq0$: $$ f(y)=E[e^{\alpha (X_y-X_0)}].$$ Then from the Lévy properties (stationary, independent increments) we can prove:

Property 1: $$ f(yn)=f(y)^n \quad \forall y\geq0,n\in\{1,2,3,...\}$$
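To spell Property 1 out (a sketch, not in the original answer): split $[0,yn]$ into $n$ increments of length $y$, then use independence and stationarity of the increments:

$$\begin{aligned}
f(yn) &= E\Big[\prod_{k=1}^{n} e^{\alpha(X_{ky}-X_{(k-1)y})}\Big] \\
      &= \prod_{k=1}^{n} E\big[e^{\alpha(X_{ky}-X_{(k-1)y})}\big] && \text{(independent increments)} \\
      &= E\big[e^{\alpha(X_{y}-X_{0})}\big]^{n} = f(y)^{n}. && \text{(stationary increments)}
\end{aligned}$$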

From Property 1 alone we can prove:

Property 2: $$f(y/m)=f(y)^{1/m} \quad \forall y\geq0,m\in \{1,2,3,...\}$$
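Indeed (filling in the one-line step), apply Property 1 with $y/m$ in place of $y$:

$$ f(y)=f\Big(\frac{y}{m}\cdot m\Big)=f\Big(\frac{y}{m}\Big)^{m},$$

and take $m$-th roots, which is legitimate because $f>0$ (it is the expectation of a strictly positive random variable).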

Now from Properties 1 and 2 we get, for any rational number $n/m>0$ (with $n,m$ positive integers): $$ f\Big(\frac{n}{m}\Big)=f\Big(\frac{1}{m}\Big)^{n}=\big(f(1)^{1/m}\big)^{n}=f(1)^{n/m},$$ using Property 1 (with $y=1/m$) for the first equality and Property 2 (with $y=1$) for the second. However, properties of Lévy processes imply $f$ is continuous, and so for any $y\geq0$ we can approximate $y$ with rational numbers to conclude $$f(y)=f(1)^y.$$

  • As in the saz comment, if $X_0=0$ then $f(y)=E[e^{\alpha X_y}]$, which gives the book result. – Michael Jun 19 '19 at 21:14
  • Thank you very much! That's exactly the answer I was looking for! – Barkas Jun 20 '19 at 07:41
  • Do you have an elementary proof that $f$ is continuous? ("elementary" = without using deep results on Lévy processes) – saz Jun 21 '19 at 08:02
  • @saz : That question is hard. In my above answer I really just assumed $f$ was continuous. It suffices to prove $f$ is right-continuous at $0$. After some effort, I can prove it if either (i) $X_t\geq 0$ with probability 1 for all $t\geq 0$, or (ii) $E[\sup_{t \in [0,1]} e^{\alpha X_t}]<\infty$. I don't know how to prove it in general. However, defining $h(x)=\log f(x)$, we get $h(x+y)=h(x)+h(y)$, and it is known that such functions $h$ are either continuous at $x=0$ or unbounded on a sequence of positive numbers $x_i$ that approach $0$. – Michael Jun 21 '19 at 17:24
  • I posted a related question here: https://math.stackexchange.com/questions/3269965/are-nonnegative-levy-proceesses-almost-always-nondecreasing – Michael Jun 21 '19 at 17:49
  • @Michael I also thought about it for a while without finding a nice proof. Under certain additional assumptions (e.g. $E(\sup_{t \in [0,1]} e^{\alpha X_t})<\infty$ or $E(e^{\beta X_t})< \infty$ for some $\beta>\alpha$) it's relatively simple, but the general case is tricky. – saz Jun 21 '19 at 18:04
  • @saz : In your second case, do you really mean $E[e^{\beta X_t}]\leq c$ for all $t \in [0,1]$ for some $c<\infty$ and some $\beta > \alpha$? The weaker case $E[e^{\beta X_t}]<\infty$ for some $\beta>\alpha$ is not obvious to me. – Michael Jun 21 '19 at 18:38