Let $X_1, \ldots, X_n$ be independent real-valued random variables, with $n > 1$. Define the sample mean by
\begin{equation*} \overline{X} = \frac{1}{n} \sum_{i=1}^n X_i \end{equation*}
Define the sample variance by
\begin{equation*} S^2 = \frac{1}{n-1} \sum_{i=1}^n(X_i-\overline{X})^2 \end{equation*}
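For quick reference (a standard algebraic identity, not part of the question itself), expanding the square gives the equivalent computational form:
\begin{equation*} S^2 = \frac{1}{n-1}\left( \sum_{i=1}^n X_i^2 - n\,\overline{X}^{\,2} \right) \end{equation*}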
If, for each $i$, $X_i$ is normal with mean $\mu_i$ and common variance $\sigma^2$, then $\overline{X}$ and $S^2$ are independent; in the i.i.d. normal case this is the classical consequence of Cochran's theorem.
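As a brief sketch of why the equal-variance assumption suffices for this forward direction: $\overline{X}$ is uncorrelated with each deviation $X_i - \overline{X}$, since
\begin{equation*} \operatorname{Cov}\bigl(\overline{X},\, X_i - \overline{X}\bigr) = \operatorname{Cov}(\overline{X}, X_i) - \operatorname{Var}(\overline{X}) = \frac{\sigma^2}{n} - \frac{\sigma^2}{n} = 0, \end{equation*}
and by joint Gaussianity $\overline{X}$ is then independent of the whole vector $(X_1 - \overline{X}, \ldots, X_n - \overline{X})$, hence of $S^2$, which is a function of that vector.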
What about the converse? If $\overline{X}$ and $S^2$ are independent, does it follow that each $X_i$ must be normal (with some mean and variance)? Does the answer change if we additionally assume that each $X_i$ is integrable or bounded, or even that the $X_i$ are identically distributed?