
Let $X_t$ be an $F_t$-measurable random variable, and let $\epsilon_t$ be a standard normal random variable independent of $F_t$ (with $t \in \mathbb{R}$).

Express $\log(E[\exp(xX_t\epsilon_t) \mid F_t])$ as a function of $X_t$ and $x$.

My thought was to rewrite it using the moment generating function of the normal distribution, i.e.,

$\log(E[\exp(xX_t\epsilon_t) \mid F_t]) = \log(E[\exp(xX_t\epsilon_t)]) = \log(M_Z(xX_t))$

And this is equal to $0.5(xX_t)^2$.

But how can I go from the conditional expectation to the unconditional one? I don't really know if I'm allowed to do that. Is there a better way to do this?
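(Not part of the original thread: as a quick sanity check, the MGF formula being used here, $M_Z(s) = E[\exp(sZ)] = \exp(s^2/2)$ for $Z$ standard normal, can be verified by Monte Carlo with only the Python standard library. The function name and sample size are arbitrary choices for illustration.)

```python
import math
import random

random.seed(0)

def mgf_estimate(s, n=200_000):
    """Monte Carlo estimate of M_Z(s) = E[exp(s*Z)] for Z ~ N(0, 1)."""
    total = 0.0
    for _ in range(n):
        total += math.exp(s * random.gauss(0.0, 1.0))
    return total / n

# Compare the estimate against the closed form exp(s^2 / 2).
for s in (0.0, 0.5, 1.0):
    print(f"s={s}: MC={mgf_estimate(s):.4f}, exact={math.exp(0.5 * s * s):.4f}")
```

The two columns should agree to within Monte Carlo error (a few parts in a thousand at this sample size).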

Naxx17
  • Exercise: If $X$ is $G$-measurable and $Y$ is independent of $G$, then, for every measurable function $u$ such that $u(X,Y)$ is integrable, $$E(u(X,Y)\mid G)=v(X)\qquad v(x)=E(u(x,Y)).$$ – Did Nov 20 '15 at 15:29
  • I've just read your answer here: http://math.stackexchange.com/questions/73353/conditional-expectation-of-function-of-two-rvs-one-measurable-one-independent

    However, I can't understand why (2) must hold. Could you quickly elaborate? Thanks

    – Naxx17 Nov 20 '15 at 19:46
  • "Quickly"? No (and why?). "Elaborate"? How to elaborate on a definition? Anyway, for questions about 73353, please post comments on 73353's page, not here. – Did Nov 20 '15 at 19:56

1 Answer


Use that $$ \exp(xX_t\epsilon_t) = \sum_k \frac 1{k!} x^kX_t^k \epsilon_t^k. $$ Now $X_t^k$ is $F_t$-measurable, hence $\def\E{\mathbf E}\E[X_t^k\epsilon_t^k \mid F_t] = X_t^k\,\E[\epsilon_t^k\mid F_t]$, and $\epsilon_t$ is independent of $F_t$, hence $\E[\epsilon_t^k\mid F_t] = \E[\epsilon_t^k]$. Therefore (interchanging sum and conditional expectation, which is justified since the series converges absolutely in $L^1$) $$ \E[\exp(xX_t\epsilon_t)\mid F_t] = \sum_{k} \frac {x^k}{k!}X_t^k\, \E[\epsilon_t^k] = M_{Z}(xX_t) = \exp\left(\frac 12 x^2X_t^2\right),$$ so the logarithm is $\frac 12 x^2X_t^2$, as you guessed.
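(Not part of the original answer: the freezing argument says that, conditionally on $F_t$, the variable $X_t$ behaves like a constant. A minimal numerical sketch of the resulting identity $\log E[\exp(xX\epsilon)] = \frac12 x^2 X^2$ for a frozen constant $X$, with arbitrary illustrative values of $x$ and $X$:)

```python
import math
import random

random.seed(1)

def cond_expectation(x, X, n=200_000):
    """Monte Carlo estimate of E[exp(x * X * eps)] with X frozen as a
    constant and eps ~ N(0, 1) -- the conditional expectation given F_t."""
    total = 0.0
    for _ in range(n):
        total += math.exp(x * X * random.gauss(0.0, 1.0))
    return total / n

x, X = 0.7, 1.3
mc = math.log(cond_expectation(x, X))
exact = 0.5 * (x * X) ** 2
print(f"log MC = {mc:.4f}, 0.5*(xX)^2 = {exact:.4f}")
```

The logged estimate should match $\frac12(xX)^2$ up to Monte Carlo noise, confirming the closed form above.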

martini