
I am familiar with the following interpolation property of Brownian Motion (this is essentially a theorem about a Brownian Bridge):

Theorem. Let $W$ be a standard Brownian Motion, and $0<t_1 < t < t_2$. Then the conditional distribution of $W_t$ given $W_{t_1}=x_1$ and $W_{t_2}=x_2$ is $N(\mu, \sigma^2)$ with $\mu = x_1 + \frac{t-t_1}{t_2-t_1}(x_2-x_1)$ and $\sigma^2 = \frac{(t_2-t)(t-t_1)}{t_2-t_1}$.

I have a good intuition for the mean: $\mu$ is the linear interpolation of the values $(t_1, x_1), (t_2, x_2)$ at time $t$.

However, I have no intuition for where this $\sigma^2$ comes from, other than it just pops out of the proof of the theorem. I've noticed $\sigma^2$ can be written as $(t_2-t_1)\alpha(1-\alpha)$ if we write $t$ as a convex combination $t = \alpha t_1+(1-\alpha)t_2$ if that helps.
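A quick numeric check of that identity (the values of $t_1$, $t_2$, $\alpha$ below are arbitrary, just for illustration):

```python
# Check that (t2 - t)(t - t1)/(t2 - t1) == (t2 - t1) * alpha * (1 - alpha)
# when t = alpha*t1 + (1 - alpha)*t2.  Sample values are arbitrary.
t1, t2 = 0.5, 3.0
alpha = 0.3
t = alpha * t1 + (1 - alpha) * t2

lhs = (t2 - t) * (t - t1) / (t2 - t1)   # variance from the theorem
rhs = (t2 - t1) * alpha * (1 - alpha)   # rewritten form
print(lhs, rhs)  # both 0.525
```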

My question: Is there an intuitive reason for this formula beyond "it just falls out in the proof"?

nullUser

1 Answer


To save on subscripts, I'll write $u$ for $t_1$ and $v$ for $t_2$.

You were right about the relevance of convex combinations. Define
$$ Z:={v-t\over v-u}W_u+{t-u\over v-u}W_v. $$
It's straightforward, if tedious, to check that the covariance of $W_t-Z$ and $Z$ is $0$; since they are jointly Gaussian, $W_t-Z$ and $Z$ are therefore independent. Moreover,
$$ \operatorname{var}(Z)=u+{(t-u)^2\over v-u}, $$
and so (by independence)
$$ \operatorname{var}(W_t-Z)=t-\operatorname{var}(Z)={(t-u)(v-t)\over v-u}. $$
The upshot is that $W_t$ decomposes as a (linear) function of $(W_u,W_v)$ plus an independent Gaussian random variable:
$$ W_t=Z+X, $$
in which $X:=W_t-Z$ is independent of $Z$, and normal with mean $0$ and variance $(t-u)(v-t)/(v-u)$. It follows that the conditional distribution of $W_t$, given that $W_u=x_1$ and $W_v=x_2$, is normal with mean
$$ {v-t\over v-u}x_1+{t-u\over v-u}x_2 $$
and variance $(t-u)(v-t)/(v-u)$.
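A Monte Carlo sanity check of this decomposition (the times $u < t < v$ below are arbitrary illustrative values, not from the original post): sample $(W_u, W_t, W_v)$ from their joint Gaussian law, form $Z$ and $X = W_t - Z$, and verify that they are uncorrelated and that $\operatorname{var}(X)$ matches $(t-u)(v-t)/(v-u)$.

```python
import numpy as np

rng = np.random.default_rng(0)
u, t, v = 1.0, 2.5, 4.0   # arbitrary times with u < t < v
n = 200_000

# Joint law of (W_u, W_t, W_v): zero-mean Gaussian with Cov(W_s, W_r) = min(s, r)
cov = np.array([[u, u, u],
                [u, t, t],
                [u, t, v]])
Wu, Wt, Wv = rng.multivariate_normal(np.zeros(3), cov, size=n).T

Z = (v - t) / (v - u) * Wu + (t - u) / (v - u) * Wv
X = Wt - Z

print(np.cov(X, Z)[0, 1])                    # ~ 0: X and Z are uncorrelated
print(X.var(), (t - u) * (v - t) / (v - u))  # both ~ 0.75
```

With these times, $(t-u)(v-t)/(v-u) = 1.5 \cdot 1.5 / 3 = 0.75$, and one can also check $\operatorname{var}(Z) = u + (t-u)^2/(v-u) = 1.75$, so the two variances add up to $\operatorname{var}(W_t) = t = 2.5$ as independence requires.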

John Dawkins
  • Can this be done without this construction? Reading it doesn't help me build intuition for why you took this approach. – 28ADY0901 May 03 '24 at 18:18
  • My approach was motivated by trying to make use of 3 things: (1) linear combinations of jointly normal random variables are normal; (2) correlation 0 implies independence for jointly normal random variables; (3) variances add for independent random variables. – John Dawkins May 03 '24 at 22:10
  • Given that we know the values at the end-points of the interval $[u, v]$, can we do the following? $Pr[W_t \le x \mid W_u = U, W_v = V] = Pr[W_u + (W_t - W_u) \le x \mid W_u = U, W_v = V] = Pr[W_t - W_u \le x-U, W_v - W_u \ge V - x]$; if we differentiate with respect to $x$, we should retrieve the conditional density? – 28ADY0901 May 03 '24 at 22:23