
I have been reading Nualart's notes on Malliavin calculus and I am aware of his derivation of the integration by parts formula. We consider a Hilbert space $(H,\langle \cdot,\cdot\rangle )$. Let $(B_{t})_{t\in [0,\infty)}$ be a $d$-dimensional Brownian motion and, for $h\in H$, set $W(h)=\sum_{i=1}^{d}\int_{0}^{\infty}h_{i}(s)\text{d}B_{s}^{i}$. Then we have $$ \mathbb{E}\left[ \langle DF, h\rangle \right] = \mathbb{E}\left[ FW(h) \right] $$ where $DF$ is the weak (Malliavin) derivative of $F$. The proof of this relies on choosing a suitable orthonormal system and using that the relevant random variables are normally distributed. However, the Wikipedia page (https://en.wikipedia.org/wiki/Malliavin_calculus) states that the same identity can be derived (perhaps in a less general setting) from Girsanov's theorem.
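For instance, in the simplest case $F=W(g)$ with $g\in H$ the identity is immediate: here $DF=g$, and by the Itô isometry $$ \mathbb{E}\left[ \langle DF, h\rangle \right] = \langle g, h\rangle = \mathbb{E}\left[ W(g)W(h) \right] = \mathbb{E}\left[ FW(h) \right]. $$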

The details are omitted, so I have tried to fill them in myself, but I am not quite sure that I obtain the same formula as Nualart. The problem, as I see it, is that Girsanov's theorem concerns a change of measure, whereas the identity above seems to hold under a single probability measure $\mathbb{P}$. As on Wikipedia, we set $\varphi_{t} = \int_{0}^{t}h_{s}\text{d}\langle B\rangle_{s} = \int_{0}^{t}h_{s}\text{d}s$ for a suitable predictable process $h$.

I then perform a Girsanov change of measure using the Doléans-Dade exponential $$ \mathcal{E}(\varepsilon h \star B)_{t} = e^{\varepsilon\int_{0}^{t}h_{s}\text{d}B_{s}-\varepsilon^{2}\frac{1}{2}\int_{0}^{t}h_{s}^{2}\text{d}s}. $$ If we assume that the underlying filtration is the one generated by $B$, we then set (ignoring that the measures should be restricted to finite intervals) $$ \frac{\text{d}\mathbb{Q}}{\text{d}\mathbb{P}} = \mathcal{E}(\varepsilon h \star B)_{t}. $$ Then, by Girsanov's theorem, $$ \text{d}B_{t} = \varepsilon h_{t}\text{d}t + \text{d}B^{\mathbb{Q}}_{t}, $$ where $B^{\mathbb{Q}}$ is a Brownian motion under the (equivalent) measure $\mathbb{Q}$. We would then have \begin{align*} \mathbb{E^P}\left[ F(B)\mathcal{E}(\varepsilon h \star B) \right] &= \int_{\Omega}F(B)\mathcal{E}(\varepsilon h \star B)\text{d}\mathbb{P} \\ &= \int_{\Omega}F(B)\text{d}\mathbb{Q} \\ &= \int_{\Omega}F(B^{\mathbb{Q}} + \varepsilon\varphi)\text{d}\mathbb{Q} \\ &= \mathbb{E^Q}\left[ F(B^{\mathbb{Q}} + \varepsilon\varphi) \right]. \end{align*}

Differentiating this with respect to $\varepsilon$ and evaluating at $\varepsilon = 0$, we obtain the same kind of formula as in Nualart, but it seems to me that we are still taking expectations under different measures. Am I missing something, or is this even a problem?
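As a rough numerical sanity check of the chain of equalities (only a sketch; the choices $h_{s}=\cos s$ on $[0,1]$, $F(B)=\sin B_{1}$, $\varepsilon = 0.3$ and the Euler discretisation are purely illustrative and not part of the setup above), one can compare $\mathbb{E^P}\left[ F(B)\mathcal{E}(\varepsilon h \star B)_{1} \right]$ with $\mathbb{E^Q}\left[ F(B^{\mathbb{Q}} + \varepsilon\varphi) \right]$, the latter estimated by simulating a fresh standard Brownian motion in place of $B^{\mathbb{Q}}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Purely illustrative choices (not from the question): h_s = cos(s) on [0, 1],
# F(B) = sin(B_1), eps = 0.3, and an Euler discretisation of the integrals.
T, n_steps, n_paths, eps = 1.0, 200, 100_000, 0.3
dt = T / n_steps
t = np.linspace(0.0, T, n_steps, endpoint=False)     # left endpoints of the grid
h = np.cos(t)
phi_T = h.sum() * dt                                 # varphi_1 = int_0^1 h_s ds

# Left-hand side: E^P[ F(B) * E(eps h . B)_1 ], simulated under P.
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B_T = dB.sum(axis=1)
I_h = dB @ h                                         # int_0^1 h_s dB_s (Ito sum)
density = np.exp(eps * I_h - 0.5 * eps**2 * (h**2).sum() * dt)
lhs = np.mean(np.sin(B_T) * density)

# Right-hand side: E^Q[ F(B^Q + eps * varphi) ]; since B^Q is a Q-Brownian motion,
# this equals E[ F(tilde B + eps * varphi) ] for a fresh standard Brownian motion.
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
rhs = np.mean(np.sin(dB.sum(axis=1) + eps * phi_T))

print(lhs, rhs)   # the two estimates agree up to Monte Carlo error
```

Both estimates agree up to Monte Carlo error, which is consistent with the chain of equalities above.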

  • I think you should not consider a new Brownian motion, just write $E^P(F(B)\mathcal E)=E^P(F(B+\epsilon \varphi))$ – Chaos Apr 05 '21 at 07:05
  • But how is $F(B)\mathcal{E} = F(B+\varepsilon \varphi)$? And how does that follow from the Girsanov theorem, as stated on Wikipedia? – user7924249 Apr 05 '21 at 07:48
  • The equality holds in law, not pointwise; you are missing the expected value! – Chaos Apr 05 '21 at 07:56
  • The Girsanov theorem says that $B+\epsilon \varphi$ is a Brownian motion under a suitable measure $Q$, and gives you the Radon-Nikodym derivative, i.e. $\mathcal E$. Then you know $E^Q(F(B)):=E^P(F(B)\mathcal E)= E^P(F(B+\epsilon\varphi))$ – Chaos Apr 05 '21 at 08:00
  • This is just another way of seeing it. $B^P$ is a Brownian motion under the measure $P$ and $B^Q$ is a Brownian motion under the measure $Q$; assume that instead of subtracting something we are adding something (let $\epsilon$ be negative), which is the same. Then you have $B^P+\epsilon\varphi=B^Q$, which basically tells you that $B^P+\epsilon\varphi$ is a Brownian motion under the measure $Q$. Then, since expectations only depend on the law of the object, we have $E^P(F(B+\epsilon \varphi))=E^Q(F(B))$, and the latter is by definition $E^P(F(B)\mathcal E)$ – Chaos Apr 05 '21 at 08:57
  • You can have a look at https://arxiv.org/abs/1003.1649 – Chaos Apr 05 '21 at 08:59
  • Woops, I accidentally deleted my previous comment. But thanks for the clarification, I think I see it now! I guess I was thinking too much about things pointwise, as you pointed out in a previous comment. Thanks for the reference! – user7924249 Apr 05 '21 at 09:01
  • I always get confused when applying Girsanov's theorem (we are not alone: a very prominent figure in the field of stochastic analysis once told us during a lecture that he always gets lost when applying it). The key is to think in terms of the law of the object, rather than the object itself – Chaos Apr 05 '21 at 09:04

1 Answer


Consider the Hilbert space $H$, and let
$$F=\varphi(I(h_1),...,I(h_n))$$ be a smooth Brownian functional (here $I(\cdot)$ denotes the Wiener integral, or the isonormal Gaussian process in Nualart's terminology).

Then perturb the Brownian trajectories in the direction of an element $h$ of the Cameron-Martin space to obtain

$$F_{\epsilon}=\varphi(I(h_1)+\epsilon\langle h_1,h\rangle_H ,...,I(h_n)+\epsilon\langle h_n,h\rangle_H ).$$

Taking the derivative w.r.t. $\epsilon$ and evaluating the expression at $\epsilon=0$ yields the following equality

$$\frac{dF_{\epsilon}}{d\epsilon}\bigg|_{\epsilon=0}=\langle DF,h\rangle.$$
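For instance, in the case $n=1$: if $F=\varphi(I(h_1))$, then $F_{\epsilon}=\varphi(I(h_1)+\epsilon\langle h_1,h\rangle_H)$, so

$$\frac{dF_{\epsilon}}{d\epsilon}\bigg|_{\epsilon=0}=\varphi'(I(h_1))\,\langle h_1,h\rangle_H=\langle \varphi'(I(h_1))h_1,h\rangle_H=\langle DF,h\rangle,$$

since $DF=\varphi'(I(h_1))h_1$ for such an $F$.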

Then we have the following

$$\mathbb E(\langle DF,h\rangle)=\mathbb E\left[\frac{dF_{\epsilon}}{d\epsilon}\bigg|_{\epsilon=0}\right]=\lim_{\epsilon\to 0}\frac{\mathbb E(F_{\epsilon})-\mathbb E(F)}{\epsilon}$$

We can exchange the limit and the expectation here because $\varphi$ is bounded with bounded derivatives, so the difference quotients $(F_{\epsilon}-F)/\epsilon$ are uniformly bounded and dominated convergence applies.

At this point use the Girsanov theorem, which in our case tells us that

$$\mathbb E(F_{\epsilon})=\mathbb E\left[F\exp\bigg\{\epsilon I(h)-\frac{\epsilon^2}{2}\|h\|^2_H\bigg\}\right].$$ Plugging this into the expression above, using the linearity of the expectation and bringing the limit inside the expectation yields

$$\mathbb E(\langle DF,h\rangle)=\mathbb E(F I(h)).$$
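In more detail, the last step is

$$\lim_{\epsilon\to 0}\frac{\mathbb E(F_{\epsilon})-\mathbb E(F)}{\epsilon}=\lim_{\epsilon\to 0}\mathbb E\left[F\,\frac{e^{\epsilon I(h)-\frac{\epsilon^2}{2}\|h\|^2_H}-1}{\epsilon}\right]=\mathbb E\left[F\,I(h)\right],$$

where the quotient converges to $I(h)$ almost surely and, for $|\epsilon|\le 1$, is dominated by $(|I(h)|+\|h\|^2_H)e^{|I(h)|}$, which is integrable because $I(h)$ is Gaussian; since $F$ is bounded, dominated convergence applies.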


The Girsanov theorem tells us that $B_t+\epsilon \int_0^th(s)ds$ under the measure $P$ has the same law as $B_t$ under an absolutely continuous measure $Q$ with Radon-Nikodym derivative given by $\exp\bigg\{\epsilon I(h)-\frac{\epsilon^2}{2}\|h\|^2_H\bigg\}$. Notice that there are many equivalent formulations of Girsanov's theorem: sometimes the translation and the density appear on the same side, sometimes the translation is on one side and the density on the other (as in our case).
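One can also sanity-check the final identity $\mathbb E(\langle DF,h\rangle)=\mathbb E(F I(h))$ numerically (again only a sketch; the choices $H=L^2([0,1])$, $h_1\equiv 1$, $h(s)=\cos s$ and $\varphi=\tanh$ are purely illustrative and not part of the answer):

```python
import numpy as np

rng = np.random.default_rng(1)

# Purely illustrative choices (not from the answer): H = L^2([0,1]), h1(s) = 1,
# h(s) = cos(s), phi = tanh, so F = tanh(I(h1)) and DF = (1 - tanh(I(h1))^2) h1.
T, n_steps, n_paths = 1.0, 200, 100_000
dt = T / n_steps
t = np.linspace(0.0, T, n_steps, endpoint=False)
h1 = np.ones(n_steps)
h = np.cos(t)
inner_h1_h = (h1 * h).sum() * dt                      # <h1, h>_{L^2([0,1])}

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
I_h1 = dB @ h1                                        # Wiener integral I(h1) = B_1
I_h = dB @ h                                          # Wiener integral I(h)

lhs = np.mean(1.0 - np.tanh(I_h1) ** 2) * inner_h1_h  # E[ <DF, h> ]
rhs = np.mean(np.tanh(I_h1) * I_h)                    # E[ F I(h) ]
print(lhs, rhs)                                       # agree up to Monte Carlo error
```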

Chaos