$\newcommand{\var}{\operatorname{var}}$ $\newcommand{\E}{\mathbb E}$
I will consider the geometric distribution supported on the set $\{0,1,2,3,\ldots\}$. This is the distribution of the number $X$ of failures before the first success in a sequence of independent Bernoulli trials. Call the probability of success on each trial $p$.
Then
$$
X = \begin{cases} 0 & \text{with probability }p \\
1 & \text{with probability }p(1-p) \\
2 & \text{with probability }p(1-p)^2 \\
3 & \text{with probability }p(1-p)^3 \\
\vdots & {}\qquad \vdots \end{cases}
$$
Memorylessness of this distribution means that $\Pr(X\ge w+x\mid X\ge w)=\Pr(X\ge x)$, i.e. the probability distribution of the number of remaining trials, given the number of failures so far, does not depend on the number of failures so far.
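Not part of the argument, but memorylessness is easy to check numerically. A minimal Monte Carlo sketch (the `geom_failures` helper is just an illustration, not a standard library function):

```python
import random

random.seed(0)
p = 0.3

def geom_failures():
    """Failures before the first success in independent Bernoulli(p) trials."""
    n = 0
    while random.random() >= p:
        n += 1
    return n

samples = [geom_failures() for _ in range(200_000)]

# Memorylessness: Pr(X >= w + x | X >= w) should equal Pr(X >= x) = (1-p)^x.
w, x = 2, 3
given_w = [s for s in samples if s >= w]
lhs = sum(s >= w + x for s in given_w) / len(given_w)
rhs = sum(s >= x for s in samples) / len(samples)
print(lhs, rhs, (1 - p) ** x)  # all three should be close to 0.343
```

Both empirical probabilities should agree with each other and with $(1-p)^x$ up to sampling noise.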
Let $\displaystyle A=\begin{cases} 1, & \text{if }X\ge 1 \\[6pt] 0, & \text{if }X=0. \end{cases}$
Then
$$
\E(X) = \E(\E(X\mid A)) = \E\left.\begin{cases} 0 & \text{if }A=0 \\ 1+\E(X) & \text{if }A=1 \end{cases}\right\} = p\cdot0+(1-p)(1+\E(X)).
$$
Thus we have
$$
\E(X) = 1-p+(1-p)\E(X).
$$
Therefore
$$
\E(X) = \frac{1-p}{p}.
$$
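As a quick sanity check of $\E(X)=\frac{1-p}{p}$ (again just a simulation sketch, with an assumed helper `geom_failures`):

```python
import random

random.seed(1)
p = 0.25  # so (1-p)/p = 3 exactly

def geom_failures(p):
    """Failures before the first success in independent Bernoulli(p) trials."""
    n = 0
    while random.random() >= p:
        n += 1
    return n

n_samples = 500_000
mean_est = sum(geom_failures(p) for _ in range(n_samples)) / n_samples
print(mean_est, (1 - p) / p)  # empirical mean should be close to 3.0
```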
Now the variance:
$$
\var(X) = \var(\E(X\mid A)) + \E(\var(X\mid A))
$$
$$
= \var\left.\begin{cases} 0 & \text{if }A=0 \\ 1 + \E(X) & \text{if }A=1 \end{cases}\right\} + \E\left.\begin{cases} 0 & \text{if }A=0 \\ \var(X) & \text{if }A=1 \end{cases}\right\}
$$
Since $1+\E(X) = 1+\frac{1-p}{p} = \frac1p$, and $X$ is constant (hence has variance $0$) given $A=0$, this is
$$
= \var\left.\begin{cases} 0 & \text{if }A=0 \\ 1/p & \text{if }A=1 \end{cases}\right\} + p\cdot0 + (1-p)\var(X)
$$
The first term is the variance of a random variable equal to $0$ with probability $p$ and to $1/p$ with probability $1-p$, i.e. a Bernoulli variable scaled by $1/p$, so it equals $(1/p)^2\cdot p(1-p) = \frac{1-p}{p}$. Hence the whole expression is
$$
= \frac{1-p}{p} + p\cdot0 + (1-p)\var(X) = (1-p)\left(\frac1p+\var(X)\right).
$$
So we get
$$
\var(X) = (1-p)\left(\frac1p+\var(X)\right).
$$
Therefore
$$
\var(X) = \frac{1-p}{p^2}.
$$
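The same kind of simulation confirms $\var(X)=\frac{1-p}{p^2}$ (a sketch only, reusing the assumed `geom_failures` helper):

```python
import random

random.seed(2)
p = 0.5  # so (1-p)/p = 1 and (1-p)/p**2 = 2 exactly

def geom_failures(p):
    """Failures before the first success in independent Bernoulli(p) trials."""
    n = 0
    while random.random() >= p:
        n += 1
    return n

n = 500_000
xs = [geom_failures(p) for _ in range(n)]
mean = sum(xs) / n
var_est = sum((x - mean) ** 2 for x in xs) / n
print(mean, (1 - p) / p)        # empirical mean, should be close to 1.0
print(var_est, (1 - p) / p**2)  # empirical variance, should be close to 2.0
```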