I read in my textbook that:
"The distribution of $X$ is concentrated on the points $0$ and $1$ if $E(X)=E(X^2)$."
Does this statement imply that $X$ is always a Bernoulli random variable? And if so, why?
No, it is not necessarily a Bernoulli distribution. For example, let $X \sim N\left(\frac{1}{2}, \frac{1}{4}\right)$. Then $E[X] = E\left[X^2\right] = \frac{1}{2}$. In fact, the equality $E[X]=E[X^2]$ holds for any distribution with mean $0 < \mu < 1$ and $\operatorname{Var}(X) = \mu-\mu^2 = \mu(1-\mu)$.
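A quick Monte Carlo sanity check of this counterexample (my own illustration, not part of the textbook's claim):

```python
import numpy as np

# X ~ N(1/2, 1/4): mean 1/2, variance 1/4 (so standard deviation 1/2).
rng = np.random.default_rng(0)
x = rng.normal(loc=0.5, scale=0.5, size=1_000_000)

print(np.mean(x))      # ≈ 0.5 = E[X]
print(np.mean(x**2))   # ≈ 0.5 = E[X^2], since Var(X) + E[X]^2 = 1/4 + 1/4
print(np.mean(x < 0))  # ≈ Φ(-1) ≈ 0.159: X is negative with positive probability,
                       # so X is certainly not Bernoulli
```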
As pointed out by other answers/comments, the claim is not true without extra assumptions. Here are some comments:
Given that the counterexamples mentioned involve both positive and negative values, one might suspect that adding the condition $X\geq 0$ might salvage the claim. However, this is still not enough since we have the following example:
If $X \sim \operatorname{Uniform}([\frac{1}{2},\frac{1+\sqrt{3}}{2}])$, then $ \mathbb{E}[X] = \mathbb{E}[X^2] = \frac{2+\sqrt{3}}{4}$.
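One can verify this uniform example exactly with the standard moment formulas for $\operatorname{Uniform}([a,b])$, namely $\mathbb{E}[X]=\frac{a+b}{2}$ and $\mathbb{E}[X^2]=\frac{a^2+ab+b^2}{3}$ (a small check of my own):

```python
import numpy as np

# X ~ Uniform([1/2, (1 + sqrt(3))/2]); note a > 0, so X >= 0 surely.
a, b = 0.5, (1 + np.sqrt(3)) / 2
mean = (a + b) / 2                 # E[X]
second = (a**2 + a*b + b**2) / 3   # E[X^2] = (b^3 - a^3) / (3(b - a))

print(mean)    # equals (2 + sqrt(3))/4
print(second)  # same value: E[X] = E[X^2], yet X is not Bernoulli
```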
If we impose the condition that $0 \leq X \leq 1$, then the claim is true: $X(1-X)$ is non-negative and $\mathbb{E}[X(1-X)] = \mathbb{E}[X] - \mathbb{E}[X^2] = 0$. This implies that $X(1-X) = 0$ a.s., and hence $X$ is Bernoulli.
In this posting, it is shown that the condition $\mathbb{E}[X^2]=\mathbb{E}[X^3]=\mathbb{E}[X^4]<\infty$ is enough to guarantee that $X$ is Bernoulli.
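For completeness, my sketch of why that moment condition suffices: if $\mathbb{E}[X^2]=\mathbb{E}[X^3]=\mathbb{E}[X^4]=m<\infty$, then expanding $X^2(1-X)^2 = X^2 - 2X^3 + X^4$ gives
$$\mathbb{E}\!\left[X^2(1-X)^2\right] = \mathbb{E}[X^2] - 2\,\mathbb{E}[X^3] + \mathbb{E}[X^4] = m - 2m + m = 0.$$
Since $X^2(1-X)^2 \geq 0$, this forces $X^2(1-X)^2 = 0$ a.s., i.e. $X \in \{0,1\}$ a.s.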