
This question is just a small variant of this post, but I thought it best to be careful given the minor differences.

Let $A$ be an $n\times n$ random matrix with independent and identically distributed entries sampled from $N(\mu,\sigma)$, and let $x$ be a (deterministic) vector in $\mathbb{R}^n$; here $\mu,\sigma \in \mathbb{R}$ with $\sigma>0$. What is the distribution of the random vector $$ Y = Ax\,? $$ From the linked post, I know that it is Gaussian, say $N(\mu',\Sigma')$, since ultimately $Y$ is an affine transformation of a normal random vector (when mapping $A$ into a vector in $\mathbb{R}^{n^2}$). However, what are the actual quantities $\mu'$ and $\Sigma'$?
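To make that affine-map claim concrete (this is just the standard vectorization identity $\operatorname{vec}(MXN)=(N^{\top}\otimes M)\operatorname{vec}(X)$, not something taken from the linked post): writing $\operatorname{vec}(A)\in\mathbb{R}^{n^2}$ for the column-stacking of $A$, $$ Y = Ax = \left(x^{\top}\otimes I_n\right)\operatorname{vec}(A), $$ so $Y$ is a linear image of the Gaussian vector $\operatorname{vec}(A)$ and hence Gaussian.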

I think it should be $Y\sim \mathcal{N}\left(\boldsymbol{0}_n , x^{\top}x\right)$; however, I'm not sure whether I've made a mistake...

AB_IM

1 Answer


Just use the definition.

For each entry $Y_i$ of $Y$ we have $$ Y_i = \sum_j A_{i,j} x_j $$ and, since expectation is linear, $$ \mu'_i = \mathbb E[Y_i] = \sum_j \mathbb E[A_{i,j}] x_j = \mu\sum_j x_j. $$
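As a quick sanity check on a made-up small case (not from the question): take $n = 2$ and $x = (1,2)^{\top}$; then $\mathbb E[Y_1] = \mathbb E[Y_2] = \mu(1+2) = 3\mu$.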

For the variance, notice that $Y_i$ and $Y_j$ are independent whenever $i\ne j$, because they are built from disjoint rows of $A$; hence the covariance matrix is diagonal. If you're not convinced, recall that the covariance is bilinear, so $$ i\ne j\implies \Sigma_{i,j}' = Cov(Y_i,Y_j) = \sum_{s,k} Cov(A_{i,s},A_{j,k}) x_sx_k = 0. $$

On the diagonal, we have (since the entries of $A$ are independent) $$ \Sigma_{i,i}' = Cov(Y_i,Y_i) = Var(Y_i) = \sum_j Var(A_{i,j}) x_j^2 = \sigma^2\sum_j x_j^2, $$ so in the end you'd have $$ Y\sim \mathcal{N}\left(\mu\sum_j x_j \cdot \boldsymbol{1}_n ,\ \sigma^2\sum_j x_j^2\cdot I_n\right). $$
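If you want to double-check this numerically, here is a minimal Monte Carlo sketch (the values of $n$, $\mu$, $\sigma$ and $x$ are arbitrary choices, and $\sigma$ is treated as the standard deviation of the entries, consistent with $Var(A_{i,j})=\sigma^2$ above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example parameters (hypothetical, not from the question)
n, mu, sigma = 3, 1.5, 2.0
x = np.array([1.0, -2.0, 0.5])
n_samples = 200_000

# Draw many matrices A with i.i.d. N(mu, sigma^2) entries and form Y = A x
A = rng.normal(mu, sigma, size=(n_samples, n, n))
Y = A @ x  # shape (n_samples, n)

# Empirical mean and covariance of Y, to compare with the closed form above
print(Y.mean(axis=0))           # ~ mu * x.sum() * ones(n)           = -0.75 * 1_3
print(np.cov(Y, rowvar=False))  # ~ sigma**2 * (x**2).sum() * eye(n) =  21   * I_3
```

Both estimates should agree with $\mu\sum_j x_j\cdot\boldsymbol{1}_n$ and $\sigma^2\sum_j x_j^2\cdot I_n$ up to Monte Carlo error.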

Exodd
  • Ah this is very interesting. So $Y$ can be sampled component-wise by drawing from $\mathcal{N}(\mu\sum_j x_j,\sigma^2\sum_j x_j^2)$...very interesting! Thank you Exodd – AB_IM Nov 11 '20 at 12:22