Let $X \in \mathbb{R}^{k}$ be a $k$-dimensional random variable, such that $XX' \in \mathbb{R}^{k \times k}$ is invertible.
Does it always follow that $\mathbb{E}[XX']$ is invertible?
Let's assume that $\mathbb{E}[X^2_i] < \infty$ for each $i=1,\dots,k$.
I am then thinking of $\mathbb{E}[XX']$ as a $k \times k$ matrix with entries like $$ \begin{pmatrix} \mathbb{E}[X_1^2] & \mathbb{E}[X_1 X_2] &\dots &\mathbb{E}[X_1X_k] \\ \mathbb{E}[X_2 X_1] &&& \\ \vdots &&&\\ \mathbb{E}[X_kX_1] & \dots & & \mathbb{E}[X^2_k] \end{pmatrix} $$ However, it's not clear to me that this matrix is always invertible, especially since we don't know the marginal and joint distributions, so the entries could be almost anything.
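To make my concern concrete, here is a quick numerical sketch (my own illustration, not from any textbook) of the degenerate case I have in mind: if the components of $X$ are linearly dependent, say $X_2 = 2X_1$, then $\mathbb{E}[XX']$ appears to be singular no matter what the distribution of $X_1$ is.

```python
import numpy as np

# Illustration: take X = (X1, 2*X1), so the components are linearly
# dependent. Then E[XX'] = [[E X1^2, 2 E X1^2], [2 E X1^2, 4 E X1^2]],
# which has determinant zero. We estimate E[XX'] by Monte Carlo.
rng = np.random.default_rng(0)
x1 = rng.standard_normal(100_000)
X = np.stack([x1, 2.0 * x1])     # shape (2, n): each column is one draw of X

M = (X @ X.T) / X.shape[1]       # sample estimate of E[XX']
print(np.linalg.det(M))          # essentially 0 (up to floating-point error)
print(np.linalg.matrix_rank(M))  # rank 1, so M is singular
```

So without some extra assumption ruling out such linear dependence among the $X_i$, invertibility of $\mathbb{E}[XX']$ seems far from automatic.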
This comes up in statistics in the context of linear regression, where some authors assume $XX'$ is invertible but then later simply write $\mathbb{E}[XX']^{-1}$, where it's implied that this inverse exists.