In general, the eigenvalues of $A$ are insufficient to determine the eigenvalues of $L$. As an example, consider the cospectral matrices
$$
A = \left(\begin{array}{cc} 1 & 0\\ 0 & 2 \end{array}\right), \quad
B = \left(\begin{array}{cc} 1&1\\0&2 \end{array}\right).
$$
We find that the eigenvalues of $X \mapsto AX^TA$ are $\pm 2, 1, 4$, whereas the eigenvalues of $X \mapsto BX^TB$ are $\pm 2, 3 \pm \sqrt{5}$. The corresponding eigenvectors (i.e. eigen-matrices) of $X \mapsto BX^TB$ are
$$
-2:\ \left(\begin{array}{cc} 1 & -3\\ 3 & 1 \end{array}\right), \qquad
2:\ \left(\begin{array}{cc} -1 & -1\\ -1 & 1 \end{array}\right),
$$
$$
3 - \sqrt{5}:\ \left(\begin{array}{cc} 3\sqrt{5}+7 & -\sqrt{5}-1\\ -2\sqrt{5}-4 & 2 \end{array}\right), \qquad
3 + \sqrt{5}:\ \left(\begin{array}{cc} 7-3\sqrt{5} & \sqrt{5}-1\\ 2\sqrt{5}-4 & 2 \end{array}\right).
$$
It is interesting that the eigenvectors associated with $3 \pm \sqrt{5}$ are non-invertible.
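Here is a quick numerical sanity check of these computations (just a sketch using `numpy`; the helper `L_matrix` builds the $4 \times 4$ matrix of $X \mapsto AX^TA$ relative to the standard basis $E_{11}, E_{12}, E_{21}, E_{22}$):

```python
import numpy as np

def L_matrix(A):
    # Matrix of X -> A X^T A relative to the (row-major) basis E11, E12, E21, E22.
    n = A.shape[0]
    cols = []
    for j in range(n * n):
        E = np.zeros((n, n))
        E.flat[j] = 1.0                       # j-th standard basis matrix
        cols.append((A @ E.T @ A).flatten())  # image of that basis matrix, flattened
    return np.column_stack(cols)

B = np.array([[1.0, 1.0],
              [0.0, 2.0]])
vals, vecs = np.linalg.eig(L_matrix(B))
print(np.sort(vals.real))  # approx [-2, 0.764, 2, 5.236], i.e. -2, 3 - sqrt(5), 2, 3 + sqrt(5)

# The eigenvectors for 3 +/- sqrt(5), reshaped into 2x2 matrices, have determinant zero:
for lam, v in zip(vals.real, vecs.T):
    print(round(lam, 3), round(np.linalg.det(v.real.reshape(2, 2)), 10))
```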
If $A$ is real and symmetric, then the eigenvalues of $L$ can be deduced from those of $A$. In particular, $L$ has eigenvalue $\lambda_j^2$ with associated eigenvector $xx^T$ (where $Ax = \lambda_j x$) and eigenvalues $\pm \lambda_j\lambda_k$ with associated eigenvectors $xy^T \pm yx^T$ (where $Ax = \lambda_j x$, $Ay = \lambda_k y$, and $j \neq k$).
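For the record, the verification is a one-liner: if $A = A^T$, $Ax = \lambda_j x$, and $Ay = \lambda_k y$, then
$$
L(xy^T \pm yx^T) = A\left(xy^T \pm yx^T\right)^T A = (Ay)(Ax)^T \pm (Ax)(Ay)^T = \lambda_j\lambda_k\left(yx^T \pm xy^T\right) = \pm\,\lambda_j\lambda_k\left(xy^T \pm yx^T\right),
$$
and taking $y = x$ (with $j = k$) in the $+$ case gives $L(xx^T) = \lambda_j^2\, xx^T$.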
Note 1: From experiment, it seems that both $\det(A)$ and $-\det(A)$ are eigenvalues of $L$ whenever $A$ has size $2 \times 2$; I'm not sure why this would be the case.
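The experiment amounts to something like the following sketch (again with `numpy`; `L_matrix` is the same helper as in the snippet above):

```python
import numpy as np

def L_matrix(A):
    # Same helper as above: matrix of X -> A X^T A in the standard basis of 2x2 matrices.
    n = A.shape[0]
    cols = []
    for j in range(n * n):
        E = np.zeros((n, n))
        E.flat[j] = 1.0
        cols.append((A @ E.T @ A).flatten())
    return np.column_stack(cols)

rng = np.random.default_rng(0)
worst = 0.0
for _ in range(1000):
    A = rng.normal(size=(2, 2))
    vals = np.linalg.eigvals(L_matrix(A))
    d = np.linalg.det(A)
    # distance from det(A) and -det(A) to the nearest eigenvalue of X -> A X^T A
    worst = max(worst, min(abs(vals - d)), min(abs(vals + d)))
print(worst)  # stays at roundoff level, i.e. +/- det(A) always appear among the eigenvalues
```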
Note 2: By the result described in this post, a sum of eigenspaces that contains no invertible elements can have dimension at most $n^2 - n$. On the other hand, by user1551's comment, if $X$ is an invertible eigenvector of $L$ with eigenvalue $\lambda$, then it must be that $\lambda^{n} = \det(A)^2$. I suspect that these ideas can be combined in an interesting way.
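(Presumably the identity $\lambda^n = \det(A)^2$ simply comes from taking determinants on both sides of $AX^TA = \lambda X$:
$$
\det(A)\det(X^T)\det(A) = \lambda^n \det(X),
$$
and cancelling $\det(X) \neq 0$.)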