The characteristic function of a random variable $X$ is defined as $\hat{X}(\theta)=\mathbb{E}(e^{i\theta X})$. If $X$ is normally distributed with mean $\mu$ and standard deviation $\sigma>0$, then its characteristic function can be computed as follows:
$$\hat{X}(\theta)=\mathbb{E}(e^{i\theta X}) =\int_{-\infty}^{\infty}\frac{e^{i\theta x-\frac{(x-\mu)^2}{2\sigma^2}}}{\sigma\sqrt{2\pi}}dx=\ldots=e^{i\mu\theta-\frac{\sigma^2\theta^2}{2}}$$
(To be honest, I have no idea what to put in place of the "$\ldots$"; I've looked here, but that only covers the standard case. Anyway, this is not really my question, even if it is interesting and possibly relevant.)
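Not part of the question proper, but here is a quick Monte Carlo sanity check I tried of the one-dimensional closed form (the concrete values of $\mu$, $\sigma$, $\theta$ are arbitrary choices of mine); the empirical average agrees with the claimed expression to within sampling error:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, theta = 1.5, 2.0, 0.7

# Monte Carlo estimate of E[exp(i*theta*X)] for X ~ N(mu, sigma^2)
x = rng.normal(mu, sigma, size=1_000_000)
estimate = np.exp(1j * theta * x).mean()

# The closed form claimed above: exp(i*mu*theta - sigma^2*theta^2/2)
closed_form = np.exp(1j * mu * theta - sigma**2 * theta**2 / 2)

print(abs(estimate - closed_form))  # should be small (Monte Carlo error)
```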
Now, if I understand correctly, a Gaussian random vector $X$ (of dimension $n$) is a vector of the form $X=AY+M$, where $A$ is an arbitrary real $n\times n$ matrix, $Y$ is a vector of size $n$ whose coordinates are independent standard normal random variables, and $M$ is some constant vector of size $n$.
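To make the definition concrete, here is a small sampling sketch (assuming numpy; the matrix $A$ and shift $M$ below are arbitrary choices of mine). It also checks empirically the standard facts that such an $X$ has mean $M$ and covariance matrix $AA^T$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, num_samples = 3, 200_000
A = rng.standard_normal((n, n))      # an arbitrary real n x n matrix
M = np.array([1.0, -2.0, 0.5])       # a constant shift vector

# Each row of Y is a vector of i.i.d. standard normals;
# each row of X is one sample of X = A Y + M.
Y = rng.standard_normal((num_samples, n))
X = Y @ A.T + M

print(np.abs(X.mean(axis=0) - M).max())       # empirical mean is close to M
print(np.abs(np.cov(X.T) - A @ A.T).max())    # empirical covariance is close to A A^T
```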
I am trying to find the characteristic function of such an $X$. The generalization of the characteristic function to higher dimensions is straightforward:
$$\hat{X}(\theta)=\mathbb{E}(e^{i<\theta,X>}),$$ where $<\cdot,\cdot>$ is the standard inner product on $\mathbb{R}^n$. So I can start with the following:
$$\hat{X}(\theta) = \mathbb{E}(e^{i<\theta,X>}) = \mathbb{E}(e^{i<\theta,AY>}\cdot e^{i<\theta,M>})\\ =e^{i<\theta,M>}\cdot \mathbb{E}(e^{i<\theta,AY>}) $$
And I'm left with the expectation of a complex exponential whose exponent is a linear combination of Gaussian random variables. I notice that $<\theta,AY>=<A^T\theta,Y>$, so this is the characteristic function of $Y$ evaluated at $A^T\theta$, but I don't know how to finish from there. Presumably the covariance matrix of $X$ should be involved somehow, but that touches the limits of my knowledge of probability.
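Not an answer, but for what it's worth, a Monte Carlo experiment (my own sketch, naively extrapolating the one-dimensional formula) suggests that the remaining expectation equals $e^{-\frac{1}{2}<\theta,AA^T\theta>}$, i.e. that the covariance matrix $AA^T$ is exactly what shows up:

```python
import numpy as np

rng = np.random.default_rng(0)
n, num_samples = 3, 500_000
A = rng.standard_normal((n, n))     # an arbitrary real n x n matrix
theta = np.array([0.3, -0.2, 0.5])  # an arbitrary test point

# Monte Carlo estimate of E[exp(i <theta, A Y>)], Y a vector of i.i.d. standard normals
Y = rng.standard_normal((num_samples, n))
estimate = np.exp(1j * (Y @ A.T) @ theta).mean()

# Conjectured value: exp(-theta^T (A A^T) theta / 2)
guess = np.exp(-theta @ (A @ A.T) @ theta / 2)

print(abs(estimate - guess))  # should be small (Monte Carlo error)
```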