
I quote Øksendal (2003).

Let us consider a probability space $\left(\Omega,\mathcal{A},\mathbb{P}\right)$ and a class of functions $f:\left[0,\infty\right)\times\Omega\to\mathbb{R}$.
For $0\le S<T$, $\left(B(t)\right)_{t\ge0}$ a Brownian motion and $f(t,\omega)$ given, we want to define: $$\int_S^T f(t,\omega)dB(t)(\omega)$$ It is reasonable to start with a definition for a simple class of functions $f$ and then extend by some approximation procedure. First assume that $f$ has the form: $$\phi(t,\omega)=\sum_{j\ge0}e_j(\omega)\cdot1_{[j\cdot2^{-n}, (j+1)2^{-n})}(t)$$ where $1$ denotes the indicator function and $n$ is a natural number.
For such functions, it is reasonable to define: $$\int_S^T\phi(t,\omega)dB_t(\omega)=\sum_{j\ge0}e_j(\omega)\left[B_{t_{j+1}}-B_{t_j}\right](\omega)\tag{1}$$ where: $$t_k=t_k^{(n)}=\begin{cases}k\cdot 2^{-n}\hspace{0.3cm}\text{if } S\le k\cdot 2^{-n}\le T\tag{2}\\ S\hspace{0.3cm}\text{if } k\cdot 2^{-n}<S\\ T\hspace{0.3cm}\text{if } k\cdot 2^{-n}>T \end{cases}$$
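To make definitions $(1)$ and $(2)$ concrete, here is a small numerical sketch. The parameters $S=0.3$, $T=1.1$, $n=4$ are made up for illustration, and the Brownian path is simulated on a fine auxiliary grid; with $e_j\equiv 1$ the sum $(1)$ must telescope to $B_T-B_S$:

```python
import numpy as np

rng = np.random.default_rng(0)

S, T, n = 0.3, 1.1, 4          # illustrative bounds and dyadic level
dt = 2.0 ** -n

def t_k(k):
    """Truncated dyadic point t_k^{(n)} from definition (2)."""
    x = k * dt
    if x < S:
        return S
    if x > T:
        return T
    return x

# simulate a Brownian path on a fine grid; evaluate B at any t by interpolation
grid = np.linspace(0.0, 2.0, 20001)
B_path = np.concatenate([[0.0], np.cumsum(rng.normal(0, np.sqrt(np.diff(grid))))])
B = lambda t: np.interp(t, grid, B_path)

# elementary integral (1) with e_j = 1 for every j
K = int(np.ceil(T / dt)) + 2   # enough indices so that t_k = T for the last ones
total = sum(1.0 * (B(t_k(j + 1)) - B(t_k(j))) for j in range(K))

print(np.isclose(total, B(T) - B(S)))   # True: the sum telescopes to B_T - B_S
```

Note how the truncation in $(2)$ makes the terms with $k\cdot2^{-n}$ outside $[S,T]$ contribute zero increments, rather than increments taken outside the bounds of integration.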



My doubts concern the part in italics. Namely:

Questions

  1. Why, according to $(1)$: \begin{align}\int_S^T\phi(t,\omega)dB_t(\omega)&=\int_S^T\sum_{j\ge0}e_j(\omega)\cdot1_{[j\cdot2^{-n}, (j+1)2^{-n})}(t)dB_t(\omega)\\&=\sum_{j\ge0}e_j(\omega)\left[B_{t_{j+1}}-B_{t_j}\right](\omega)\end{align}?

    Is that due to the fact that $\sum_{j\ge0}e_j(\omega)\cdot1_{[j\cdot2^{-n}, (j+1)2^{-n})}(t)$ does not depend on the integrator $B_{t}(\omega)$, so it can be taken outside the integral sign, and one has: \begin{align}\int_S^T\phi(t,\omega)dB_t(\omega)&=\int_S^T\sum_{j\ge0}e_j(\omega)\cdot1_{[j\cdot2^{-n}, (j+1)2^{-n})}(t)dB_t(\omega)\\&=\sum_{j\ge0}e_j(\omega)\left[B_{t_{j+1}}-B_{t_j}\right](\omega)\end{align} with $t_k$ as specified in $(2)$ and $\sum_{j\ge0}\left[B_{t_{j+1}}-B_{t_j}\right]=B_{T}-B_{S}$?
  2. Besides, could you please detail the reason why $(2)$ is defined that way? In particular, is the choice of the left end-point of every time interval at play here? Why does the value of $t_k$ depend on where $k\cdot2^{-n}$ is positioned? What I would expect instead is something like: $$t_k=t_k^{(n)}=\begin{cases}t_k\hspace{0.4cm}\text{if } k\cdot 2^{-n}\le t_k \le (k+1)\cdot2^{-n}\tag{2.bis}\\ 0\hspace{0.5cm}\text{otherwise} \end{cases}$$
  • I am writing an answer, but prior to that, I just want to say that if you read the rest of that same chapter, you could get some insight into question $2$. The intention of stochastic calculus may not always coincide with the intention of usual calculus, so that particular observation of yours is important. – Sarvesh Ravichandran Iyer Nov 03 '20 at 05:10
  • (1) It is the definition of a stochastic integral for elementary functions w.r.t. a BM. (2) Your "definition" of $t_k$ doesn't make sense (it's circular). –  Nov 03 '20 at 11:23
  • OK, could you please just explain why $(2)$ is originally defined that way? I cannot understand the underlying "logic" @d.k.o. – Strictly_increasing Nov 03 '20 at 11:24
  • OK, I will try to get such an insight. I am sure your answer will definitely clarify my doubts, thank you @TeresaLisbon – Strictly_increasing Nov 07 '20 at 10:58
  • @Strictly_increasing While I was halfway to writing an answer to this question, the other answer came up, and to be honest I think it is fantastic, so I will not answer the question. Thanks for giving me the opportunity, though. I would still like to provide you with references: "Brownian motion and stochastic calculus" by Schilling and Partzsch is an excellent starting book: it covers both the Ito and Stratonovich integral in separate chapters. Then the introduction to Chapter $3$ of Karatzas-Shreve talks in detail about why this definition is reasonable. – Sarvesh Ravichandran Iyer Nov 08 '20 at 08:34
  • Øksendal is the book to read if you want a quick introduction to the concept of stochastic calculus. But it is not comprehensive, and does not indulge in discussion beyond a certain point. Better books (i.e. harder to read, but more discussion-based and with better exercises) are the two books I referenced above. They will give you a solid foundation if you want to get into research in the subject, as I have done. – Sarvesh Ravichandran Iyer Nov 08 '20 at 08:36

1 Answer


(1) It is the definition of a stochastic integral for elementary functions w.r.t. a BM. (See the beginning of the next section.) Why is it reasonable? Consider a discrete-time analogue. Let $\{X_n\}$ be a martingale adapted to $\{\mathcal{F}_n\}$ and let $\{H_n\}$ be a bounded, previsible process, i.e., $H_n\in\mathcal{F}_{n-1}$. Then we define $$ (H\cdot X)_n:=\sum_{i=1}^n H_i \Delta X_i,\quad (H\cdot X)_0=0 $$
as our discrete time stochastic integral (in fact, it is called the martingale transform of $X$). The standard example is that if you bet \$1 each time (i.e., $H_n=1$), your total gain/loss at time $n$ is exactly $(H\cdot X)_n$. A nice property of this process is that it is a martingale (it is crucial that $H$ be predictable; take, for example, the non-predictable $H_n=\operatorname{sgn}(\Delta X_n)$, which peeks at the future and makes every increment $H_n\Delta X_n=|\Delta X_n|\ge 0$). The corresponding processes in your case are $H_n=e_{n-1}$ and $X_n=B_{t_n}$ (setting $S=0$).
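As a sanity check on that martingale property, here is a small simulation. The "double after a loss" strategy and all parameters are made up for illustration; the point is that a previsible bet $H$ keeps the transform centered at $0$, while the future-peeking bet $\operatorname{sgn}(\Delta X_n)$ does not:

```python
import numpy as np

rng = np.random.default_rng(1)

# Delta X_i = ±1 fair coin flips, so X_n (their cumulative sum) is a martingale
dX = rng.choice([-1, 1], size=(100_000, 50))

# a previsible bet: H_1 = 1, and H_i depends only on Delta X_{i-1}
# (here: double the stake after a losing step -- an arbitrary example)
H = np.ones_like(dX)
H[:, 1:] = np.where(dX[:, :-1] == -1, 2, 1)

gain = (H * dX).sum(axis=1)          # (H.X)_n = sum_i H_i * Delta X_i
print(abs(gain.mean()))              # close to 0: no previsible bet beats a martingale

# the non-previsible bet H_i = sgn(Delta X_i) "looks into the future":
cheat = (np.sign(dX) * dX).sum(axis=1)
print(cheat.mean())                  # exactly 50: it wins on every single step
```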

(2) The "logic" behind the definition of $t_k$ is related to the definition of elementary functions. For each $k$, such a function is constant on $[k 2^{-n},(k+1)2^{-n})$ and $(B_t)$ is "sampled" at the corresponding end-points.

  • My question $(1)$ specifically refers to the "passage" $\int_S^T\sum_{j\ge0}e_j(\omega)\cdot1_{[j\cdot2^{-n}, (j+1)2^{-n})}(t)dB_t(\omega)=\sum_{j\ge0}e_j(\omega)\left[B_{t_{j+1}}-B_{t_j}\right]$. Why does that hold true, mathematically speaking? $$$$ As to my question $(2)$, what I mean is: why $t_k=S$ if $S>k2^{-n}$ and $t_k=T$ if $T<k2^{-n}$? That sounds "counterintuitive" to me (if I try, for example, to visually imagine the situation) – Strictly_increasing Nov 03 '20 at 22:54
  • (1) It holds by definition. (2) Why is it "counterintuitive"? To visually imagine the situation look at the graph of $t\mapsto 2^{-n}\lfloor t 2^n\rfloor$ between $S$ and $T$. –  Nov 03 '20 at 23:43
  • yeah, you said "Look at the graph of .... $\color{red}{\text{between }S \text{ and }T}$" and that sounds good to me. But the point is exactly that I cannot understand why one has to focus on the points $k2^{-n}$ outside the interval $[S,T]$. $$$$ What's the point in having a focus on $S>k2^{-n}$ (for which $t_k=S$) and on $T<k2^{-n}$ (for which $t_k=T$)? – Strictly_increasing Nov 07 '20 at 11:51
  • I see. The sequence $\{k2^{-n}:k\ge 0\}$ creates a partition of $\mathbb{R}_{\ge 0}$. For small and large $k$, most of these points lie outside of $[S,T]$ and need to be truncated. This partition could be equivalently defined as follows: $\{S, k_{min}2^{-n}, \ldots, k_{max}2^{-n},T\}$, where $k_{min}$ ($k_{max}$) is the minimal (maximal) $k$ s.t. $k2^{-n}>S$ ($k2^{-n}<T$). –  Nov 07 '20 at 13:04
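The truncated partition described in that comment can be listed explicitly; with the made-up values $S=0.3$, $T=1.1$, $n=4$:

```python
import math

S, T, n = 0.3, 1.1, 4            # illustrative values only
dt = 2.0 ** -n

k_min = math.floor(S / dt) + 1   # smallest k with k * 2^{-n} > S
k_max = math.ceil(T / dt) - 1    # largest  k with k * 2^{-n} < T

# the partition {S, k_min*2^{-n}, ..., k_max*2^{-n}, T} of [S, T]
partition = [S] + [k * dt for k in range(k_min, k_max + 1)] + [T]
print(partition[:3], partition[-3:])   # starts at S = 0.3, ends at T = 1.1
```

Every dyadic point outside $[S,T]$ has been collapsed onto $S$ or $T$, so the partition covers exactly $[S,T]$ and nothing more.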
  • I see. But doesn't this mean that for the integral $(1)$ you will consider a lot of points which are outside the bounds of integration $[S, T]$, hence "overestimating" the value of the integral itself? For that reason, I was sure that outside $[S,T]$ one should consider $t_k=0$, in order not to have "contributions" different from $0$ coming from outside the bounds of integration $[S,T]$ – Strictly_increasing Nov 07 '20 at 13:58
  • Just to give an example: I imagine that situation as if I had to compute: $$\int_{-1}^{0} \lfloor x \rfloor\, dx \tag{1.bis}$$ (which clearly corresponds to $-1$) as if it corresponded to: $$\int_{-1}^{0} \lfloor x \rfloor\, dx =\int_{\mathbb{R}}\lfloor x \rfloor\, dx=\int_{-1}^{0} \lfloor x \rfloor\, dx + L+R=-1+L+R$$ with $L$ corresponding to the value of the integral of $\lfloor x\rfloor$ as if $x=-1$ constantly and $R$ corresponding to the value of the integral of $\lfloor x\rfloor$ as if $x=0$ constantly. But what would be the sense of doing that? Sorry, I just want to fully grasp the point. Thank you. – Strictly_increasing Nov 07 '20 at 14:12
  • No. Suppose that $t_k\le S$ for $k=0, \ldots, K$. Then $B_{t_{k+1}}-B_{t_{k}}=B_S-B_S=0$ for all $k=0,\ldots, K-1$ and so $\sum_{j=0}^{K-1}e_j(B_{t_{j+1}}-B_{t_{j}})=0$. –  Nov 07 '20 at 15:30