
The characteristic function of a random vector $\boldsymbol{X}$ is

$\varphi_{\boldsymbol{X}}(\boldsymbol{t}) =E[e^{i\boldsymbol{t}'\boldsymbol{X}}] $

Now suppose that $\boldsymbol{X} \in N(\boldsymbol{\mu},\boldsymbol{\Lambda})$. We observe that $Z = \boldsymbol{t}'\boldsymbol{X}$ has a one-dimensional normal distribution with parameters $m = E[Z] = \boldsymbol{t}'\boldsymbol{\mu}$ and $\sigma^2 = \mathrm{Var}[Z] = \boldsymbol{t}'\boldsymbol{\Lambda}\boldsymbol{t}$ (where $\boldsymbol{\Lambda}$ is the covariance matrix). Hence

$\textbf{(1)}\enspace \varphi_{\boldsymbol{X}}(\boldsymbol{t})=\varphi_{Z}(1) = \exp\{im - \tfrac{1}{2}\sigma^2\}$

My question is: how does the equality in $\textbf{(1)}$ end up being

$\exp\{im - \frac{1}{2}\sigma^2\}$

I guess it's a matter of algebra. Maybe someone could show me.

**Clarification:** $\boldsymbol{t}$ is a vector, so the multiplication above is $\boldsymbol{t}'\boldsymbol{X}= (t_1,t_2,\dots,t_n) \begin{pmatrix} X_1\\ X_2\\ \vdots \\X_n \end{pmatrix}$, where the $t_i$, $i = 1,2,\dots,n$, are real numbers,

and $\boldsymbol{t}'$ denotes the transpose of the vector $\boldsymbol{t}$.

Davide Giraudo
Danny

1 Answer


Since you seem to be turning around this question and some of its variants again and again, let us try to answer it (almost) completely.

First, as mentioned partially by the text you are reading, to know the characteristic function of every normal random vector, it is enough to know the characteristic function of a standard one-dimensional normal random variable.

To wit, if $X$ is normal $(\mu,\Lambda)$, then for each $t$, the random variable $t'X$ is normal $(m,\sigma^2)$ with $m=t'\mu$ and $\sigma^2=t'\Lambda t$ hence $t'X=m+\sigma U$ where $U$ is standard normal. A consequence is that $$\varphi_X(t)=E(\mathrm e^{\mathrm it'X})=\mathrm e^{\mathrm im}E(\mathrm e^{\mathrm i\sigma U})=\mathrm e^{\mathrm im}\varphi_U(\sigma),$$ where $\varphi_U$ denotes the characteristic function of $U$. Thus, it is enough to compute $\varphi_U$ to know $\varphi_X$ for every normal random vector $X$.
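This reduction can be checked numerically. The sketch below (my own illustration, not part of the answer; the particular $\mu$, $\Lambda$ and $t$ are made up) compares a Monte Carlo estimate of $E(\mathrm e^{\mathrm it'X})$ with $\mathrm e^{\mathrm im}E(\mathrm e^{\mathrm i\sigma U})$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up 2-dimensional example (mu, Lam, t are illustrative, not from the text).
mu = np.array([1.0, -0.5])
Lam = np.array([[2.0, 0.3],
                [0.3, 1.0]])
t = np.array([0.7, -0.2])

m = t @ mu                     # m = t' mu
sigma = np.sqrt(t @ Lam @ t)   # sigma^2 = t' Lam t

n = 400_000
X = rng.multivariate_normal(mu, Lam, size=n)  # X ~ N(mu, Lam)
U = rng.standard_normal(n)                    # U standard normal

phi_X = np.mean(np.exp(1j * (X @ t)))                   # estimate of E[exp(i t'X)]
rhs = np.exp(1j * m) * np.mean(np.exp(1j * sigma * U))  # e^{im} E[exp(i sigma U)]

# The two estimates should agree up to Monte Carlo error.
assert abs(phi_X - rhs) < 0.01
```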

Second, to compute $\varphi_U$, one uses the definition, that is, for every real number $s$, $$\varphi_U(s)=E(\mathrm e^{\mathrm isU})=\int_\mathbb R\mathrm e^{\mathrm isu}\mathrm e ^{-u^2/2}\frac{\mathrm du}{\sqrt{2\pi}}.$$ Note that $$\mathrm e^{\mathrm isu}\mathrm e ^{-u^2/2}=\mathrm e^{-s^2/2}\mathrm e ^{-(u-\mathrm is)^2/2},$$ hence $$\varphi_U(s)=\mathrm e^{-s^2/2}L(s),$$ where $$L(s)=\int_\mathbb R\mathrm e ^{-(u-\mathrm is)^2/2}\frac{\mathrm du}{\sqrt{2\pi}}.$$ Now, $L(0)=1$ since $L(0)$ is the integral of the standard normal PDF and it happens that $L(s)=L(0)$ for every real number $s$, thus, finally, $$\varphi_U(s)=\mathrm e^{-s^2/2},$$ and, going back to $X$, $$\varphi_X(t)=\mathrm e^{\mathrm im}\varphi_U(\sigma)=\mathrm e^{\mathrm im-\sigma^2/2}=\mathrm e^{\mathrm it'\mu-t'\Lambda t/2},$$ as desired.
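As a concrete check of the identity $\varphi_U(s)=\mathrm e^{-s^2/2}$ (again my own addition; the grid size and cutoff $|u|\leqslant 12$ are arbitrary choices under which the Gaussian tails are negligible), one can approximate the defining integral on a fine grid:

```python
import numpy as np

# Grid quadrature of phi_U(s) = ∫ exp(isu) exp(-u^2/2) du / sqrt(2*pi);
# the integrand is negligible beyond |u| = 12.
u = np.linspace(-12.0, 12.0, 200_001)
du = u[1] - u[0]

def phi_U(s):
    integrand = np.exp(1j * s * u) * np.exp(-u**2 / 2) / np.sqrt(2 * np.pi)
    return np.sum(integrand) * du  # simple Riemann-sum approximation

for s in [0.0, 0.5, 1.3, 2.0]:
    assert abs(phi_U(s) - np.exp(-s**2 / 2)) < 1e-6
```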

To show that $L(s)=L(0)$ for every real number $s$, several methods are available, one of them, based on complex analysis, uses that the integral of the function $z\mapsto\mathrm e^{-z^2/2}$ over every closed path in the complex plane is zero.
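For completeness, another of those methods, which stays within real analysis, is to differentiate $L$ under the integral sign (a sketch; the dominated-convergence justification of the exchange of derivative and integral is omitted):

```latex
L'(s)=\int_\mathbb R \mathrm i(u-\mathrm is)\,
      \mathrm e^{-(u-\mathrm is)^2/2}\frac{\mathrm du}{\sqrt{2\pi}}
     =-\mathrm i\int_\mathbb R \frac{\partial}{\partial u}\,
      \mathrm e^{-(u-\mathrm is)^2/2}\frac{\mathrm du}{\sqrt{2\pi}}
     =-\frac{\mathrm i}{\sqrt{2\pi}}
      \Big[\mathrm e^{-(u-\mathrm is)^2/2}\Big]_{u=-\infty}^{u=+\infty}=0,
```

since $|\mathrm e^{-(u-\mathrm is)^2/2}|=\mathrm e^{-(u^2-s^2)/2}\to0$ as $u\to\pm\infty$. Hence $L$ is constant and $L(s)=L(0)=1$ for every real $s$.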

Did