I am trying to understand the solution to the following problem:
Let $\mathbf{x}, \mathbf{y}$ be two independent, uniformly distributed random unit vectors. Show that $\mathbb{E}\langle\mathbf{x}, \mathbf{y} \rangle^2 = \Theta(\frac{1}{n})$
The solution goes as follows:
Let $\mathbf{x}$ and $\mathbf{y}$ be two random unit vectors in $\mathbb{R}^n$. By rotational invariance, all random variables of the form $\langle \mathbf{x}, y\rangle^2$ have the same distribution, where $y$ is a unit vector $\mathbb{E}\langle \mathbf{x}, e_1 \rangle^2$ (i.e. possible realizations of $\mathbf{y})$. If follows that $\mathbb{E}\langle\mathbf{x}, \mathbf{y} \rangle^2 = \mathbb{E}\langle\mathbf{x}, e_1 \rangle^2$ where $e_1$ is the first coordinate vector[*].
Knowing that we can represent $\mathbf{x}$ in terms of a standard Gaussian vector, we write $\mathbf{x} = \frac{\mathbf{u}}{\|\mathbf{u}\|}$ where $\mathbf{u} \sim N(0, \mathrm{Id})$. Multiplying a Gaussian $N(\mu, \Sigma)$ by a matrix $A$ gives another Gaussian $N(A\mu, A\Sigma A^\top)$, so $\langle\mathbf{u}, e_1 \rangle = \mathbf{u}_1 \sim N(0, 1)$. As a result, we want to compute the expectation of the random variable $\mathbf{X} = \frac{\mathbf{u}_1^2}{\mathbf{u}_1^2+\mathbf{u}_2^2+\dots+\mathbf{u}_n^2}$, with $\mathbf{u}_i \stackrel{iid}{\sim} N(0,1)$.
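A small numerical sketch of this representation (my own check, using numpy): normalizing a standard Gaussian vector gives a uniform unit vector, and $\langle \mathbf{x}, e_1\rangle^2$ is exactly $\mathbf{u}_1^2 / (\mathbf{u}_1^2 + \dots + \mathbf{u}_n^2)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Draw a standard Gaussian vector and normalize it: x = u / ||u||
# is uniformly distributed on the unit sphere.
u = rng.standard_normal(n)
x = u / np.linalg.norm(u)

# <x, e_1>^2 is the squared first coordinate of x,
# which equals u_1^2 / (u_1^2 + ... + u_n^2).
lhs = x[0] ** 2
rhs = u[0] ** 2 / np.sum(u ** 2)
print(np.isclose(lhs, rhs))  # True
```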
The random variables $\mathbf{X}_i=\frac{\mathbf{u}_i^2}{\mathbf{u}_1^2+\mathbf{u}_2^2+\dots+\mathbf{u}_n^2}$ for $i \in [n]$ have the same distribution and therefore the same expectation. We have that $\sum_i \mathbf{X_i} = \frac{\mathbf{u}_1^2+\mathbf{u}_2^2+\dots+\mathbf{u}_n^2}{\mathbf{u}_1^2+\mathbf{u}_2^2+\dots+\mathbf{u}_n^2} = 1$. Taking expectations and using linearity, $1 = \sum_i \mathbb{E}\mathbf{X}_i = n\,\mathbb{E}\mathbf{X}_1$, so we conclude that $\mathbb{E}\mathbf{X}_1 = \frac{1}{n}$.
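The conclusion $\mathbb{E}\langle\mathbf{x},\mathbf{y}\rangle^2 = \frac{1}{n}$ can be sanity-checked by Monte Carlo (a sketch I wrote with numpy; the dimension and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 20, 200_000

# Sample pairs of independent uniform unit vectors via normalized Gaussians.
u = rng.standard_normal((trials, n))
v = rng.standard_normal((trials, n))
x = u / np.linalg.norm(u, axis=1, keepdims=True)
y = v / np.linalg.norm(v, axis=1, keepdims=True)

# The empirical mean of <x, y>^2 should be close to 1/n.
est = np.mean(np.sum(x * y, axis=1) ** 2)
print(est, 1 / n)
```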
The part that I don't understand is the [*] one. How can we restrict $\mathbf{y}$ to be just some coordinate vector? Can anyone give a more intuitive explanation, maybe with some calculations that use the rotational invariance?
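For what it's worth, here is a numerical check I put together myself (it assumes a Householder reflection $Q$ mapping a fixed $y$ to $e_1$): since $Q$ is orthogonal, $\langle \mathbf{x}, y\rangle = \langle Q\mathbf{x}, Qy\rangle = \langle Q\mathbf{x}, e_1\rangle$ holds for every $\mathbf{x}$, and $Q\mathbf{x}$ has the same distribution as $\mathbf{x}$. The identity itself I can verify:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10

# Fix an arbitrary unit vector y and build a Householder reflection Q
# (an orthogonal matrix) that maps y onto the first coordinate vector e_1.
y = rng.standard_normal(n)
y /= np.linalg.norm(y)
e1 = np.zeros(n)
e1[0] = 1.0
w = y - e1
Q = np.eye(n) - 2 * np.outer(w, w) / (w @ w)

# Because Q is orthogonal, <x, y> = <Qx, Qy> = <Qx, e_1> for every x.
u = rng.standard_normal(n)
x = u / np.linalg.norm(u)
print(np.isclose(x @ y, (Q @ x) @ e1))  # True
```

But I still don't see, conceptually, why $Q\mathbf{x}$ having the same distribution as $\mathbf{x}$ licenses replacing $\mathbf{y}$ by $e_1$ inside the expectation.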