
From Mathematische statistiek by A. van der Vaart:

"Let $X_1,X_2,\ldots,X_n$ be a random sample from the distribution function $x\rightarrow p\Phi(x-\mu)+(1-p)\Phi\bigl( (x-\nu)/\sigma\bigl)$. The parameters $p\in [0,1],\mu,\nu\in\mathbb{R}$ and $\sigma\in (0,\infty)$ are unknown. Construct a moment estimator for $(p,\mu,\nu,\sigma)$ and show that it is asymptotically normal."

We need an expectation, so what I did was take the derivative of this distribution function to get a density. Then $E(X)$ equals $\int_{-\infty}^{\infty} x f(x)\,dx$ with $f$ the density found. It turns out this equals $\mu p + \frac{(1-p)\nu}{\sigma^2}$. But then I get stuck, because this function of $(p,\mu,\nu,\sigma)$ is not injective, so it is not possible to get a moment estimator from it alone. Then I tried $E(X^2)$, but that gives nothing better. I also don't see a link between $E(X)$ and $E(X^2)$, so interpreting them as a system of equations and eliminating also doesn't work.

Can anyone see a solution? Thanks in advance, since I cannot comment to say thanks.

1 Answer


$$\newcommand{\e}{\operatorname{E}}$$ Suppose $\Pr(Y=1) = p$ and $\Pr(Y=0) = 1-p.$

Suppose $X\mid (Y=1) \sim N(\mu,1)$ and $X\mid (Y=0) \sim N(\nu,\sigma^2).$

Then $X$ has just the distribution that was given.

$$ \e(X^n) = \e(X^n\mid Y=1)\Pr(Y=1) + \e(X^n\mid Y=0)\Pr(Y=0). $$

Use that to find the first four moments.

The first moment should be $p\mu+(1-p)\nu.$

If the whole system of four functions of those four parameters is injective, then the estimator exists. You shouldn't expect any one of those four to be injective; only the whole system.
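A quick simulation makes the first of these equations concrete (a sketch in Python; the parameter values are arbitrary illustrations, not from the exercise): the sample mean of the mixture tracks $p\mu+(1-p)\nu.$

```python
import random

# Arbitrary illustrative parameters (not from the exercise).
p, mu, nu, sigma = 0.3, 1.0, -2.0, 1.5

random.seed(0)
n = 200_000

# Draw from the mixture: with probability p from N(mu, 1)
# (unit variance, matching Phi(x - mu)), otherwise from N(nu, sigma^2).
sample = [random.gauss(mu, 1.0) if random.random() < p else random.gauss(nu, sigma)
          for _ in range(n)]

sample_mean = sum(sample) / n
theoretical_mean = p * mu + (1 - p) * nu
print(sample_mean, theoretical_mean)  # the two agree to about two decimals
```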

Postscript:

Let's look at $\e(X^n\mid Y=0).$ \begin{align} \e(X^n\mid Y=0) & = \e( (\nu + \sigma Z)^n) \text{ where } Z\sim N(0,1). \end{align} One can expand $(\nu+\sigma Z)^n$ by the binomial theorem. If $n=4,$ this is $\nu^4 + 4\nu^3\sigma Z + 6\nu^2\sigma^2 Z^2 + 4\nu\sigma^3 Z^3 + \sigma^4 Z^4.$

So the problem is reduced to that of finding $\e(Z^k).$ $$ \e(Z^k) = \frac1 {\sqrt{2\pi}} \int_{-\infty}^\infty z^k e^{-z^2/2} \, dz = 0 \text{ if $k$ is odd, by symmetry.} $$ So now assume $k$ is even. We will only need $k=2$ and $k=4.$ \begin{align} \e(Z^k) & = \frac1 {\sqrt{2\pi}} \int_{-\infty}^\infty z^k e^{-z^2/2} \, dz \\[10pt] & = \sqrt{\frac 2 \pi} \int_0^\infty z^{k-1} e^{-z^2/2} (z\,dz) = \sqrt{\frac 2 \pi} \int_0^\infty \left(\sqrt{2u} \, \right)^{k-1} e^{-u} \, du \\[10pt] & = \frac{2^{k/2}}{\sqrt{\pi}} \int_0^\infty u^{((k+1)/2) \, - \, 1} e^{-u} \, du = \frac{2^{k/2}}{\sqrt{\pi}} \Gamma\left( \frac{k+1} 2 \right). \end{align} If $k=4$ then we have $$ \frac 4 {\sqrt \pi} \Gamma\left( \frac 5 2 \right) = \frac 4 {\sqrt \pi} \cdot \frac 1 2 \cdot \frac 3 2 \Gamma\left( \frac 1 2 \right) = 3 \qquad \text{ since } \Gamma\left( \frac 1 2 \right) = \sqrt \pi. $$ Thus we have $\e(Z^4) = 3.$
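The closed form $\e(Z^k) = \frac{2^{k/2}}{\sqrt\pi}\Gamma\bigl(\frac{k+1}2\bigr)$ for even $k$ is easy to check with Python's standard-library `math.gamma` (a sketch; the function name `even_normal_moment` is mine):

```python
import math

def even_normal_moment(k):
    """E(Z^k) for even k >= 0, Z ~ N(0,1), via the Gamma-function formula."""
    assert k % 2 == 0 and k >= 0
    return 2 ** (k / 2) / math.sqrt(math.pi) * math.gamma((k + 1) / 2)

print(even_normal_moment(2))  # close to 1
print(even_normal_moment(4))  # close to 3
```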

It is widely known that $\e(Z^2) = 1.$ So $$ \e((\nu+\sigma Z)^4) = \nu^4 + 6 \nu^2 \sigma^2 \e(Z^2) + \sigma^4 \e(Z^4) = \nu^4 + 6\nu^2\sigma^2 + 3\sigma^4. $$
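The identity $\e((\nu+\sigma Z)^4) = \nu^4 + 6\nu^2\sigma^2 + 3\sigma^4$ can also be verified numerically with a plain midpoint Riemann sum against the standard normal density (a sketch in Python; parameter values and function names are mine):

```python
import math

def normal_pdf(z):
    """Standard normal density."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def fourth_moment_numeric(nu, sigma, lo=-12.0, hi=12.0, steps=200_000):
    """Midpoint Riemann sum for E[(nu + sigma*Z)^4], Z ~ N(0,1).
    The tails beyond |z| = 12 are negligible."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        z = lo + (i + 0.5) * h
        total += (nu + sigma * z) ** 4 * normal_pdf(z)
    return total * h

nu, sigma = -2.0, 1.5
closed_form = nu**4 + 6 * nu**2 * sigma**2 + 3 * sigma**4
print(fourth_moment_numeric(nu, sigma), closed_form)  # both close to 85.1875
```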

  • Then I think it's necessary to evaluate integrals of the form $\int_{-\infty}^{\infty} x^n \exp(-x^2/2)\,dx$, isn't it? Or is there a simpler way? – Rocco van Vreumingen Aug 19 '17 at 20:49
  • 1
    @RoccovanVreumingen : I've added a postscript on how to deal with that. – Michael Hardy Aug 19 '17 at 23:20
  • Before I take a look at that, I think it has something to do with moment generating functions. Now I'm going to take a look. – Rocco van Vreumingen Aug 20 '17 at 08:33
  • 1
    $$ \begin{align} \text{Summary:} \qquad \e(X\mid Y=1) & = \mu \\ \e(X^2\mid Y=1) & = \mu^2 + 1^2 \\ \e(X^3\mid Y=1) & = \mu^3 + 3\mu\cdot 1^2 \\ \e(X^4\mid Y=1) & = \mu^4 + 6\mu^2 \cdot 1^2 + 3\cdot 1^4 \\ \e(X\mid Y=0) & = \nu \\ \e(X^2\mid Y=0) & = \nu^2 + \sigma^2 \\ \e(X^3\mid Y=0) & = \nu^3 + 3\nu\sigma^2 \\ \e(X^4\mid Y=0) & = \nu^4 + 6\nu^2 \sigma^2 + 3\sigma^4 \\ p\mu + (1-p) \nu & = \text{first moment} \\ p(\mu^2 + 1) + (1-p)(\nu^2 + \sigma^2) & = \text{second moment} \end{align} $$ And now I'm running out of space, but the idea is there. – Michael Hardy Aug 20 '17 at 19:12
  • It seems that a long time ago I must already have known that $\Gamma (1/2) = 1$, because the proof (found via Google) is not straightforward and this exercise uses this fact. But of course I believe it now. There is something else I would mention: the 3rd moment exists because of the 4th, but symmetry alone is not enough, is it, because of $\infty-\infty$ – Rocco van Vreumingen Aug 20 '17 at 19:35
  • 1
    No, I mean $\Gamma (1/2) = \sqrt{\pi}$; that is a result I have now seen for the first time. – Rocco van Vreumingen Aug 20 '17 at 19:50
  • 1
    @RoccovanVreumingen : Look at this question: https://math.stackexchange.com/questions/215352/why-is-gamma-left-frac12-right-sqrt-pi – Michael Hardy Aug 20 '17 at 20:16
  • $\displaystyle \int_{-\infty}^\infty \Big( \text{an odd function} \Big) \, dx = 0$ $\text{if } \displaystyle \int_{-\infty}^\infty \Big| \text{the same function} \Big| \, dx < \infty.$ So symmetry is not enough without the condition following the word "if". $\qquad$ – Michael Hardy Aug 20 '17 at 20:18
  • I'm not convinced that $\int_{-\infty}^{\infty} |\text{the same function}| \, dx < \infty$ in this case, but the 4th moment is finite, so I think symmetry is enough. And about $\Gamma (1/2)$: that post was useful. Now I have a simple proof just by substitution of $z$ into $(1/2)t^2$ (which is a slightly different substitution from the one in that post). – Rocco van Vreumingen Aug 20 '17 at 20:51
  • For a real-valued random variable $X,$ if $\e(X^4) < \infty$ then $\e(|X^3|) <\infty,$ so that plus symmetry about $0$ is enough to conclude $\e(X^3)=0.$ By symmetry about $0$ I mean that for every measurable set $A$ we have $\Pr(X\in A) = \Pr(X\in -A). \qquad$ – Michael Hardy Aug 20 '17 at 21:11
  • Yes, this is what I meant and why I was convinced about the symmetry argument. – Rocco van Vreumingen Aug 20 '17 at 21:24
  • So I was not convinced because $\int |\text{same function}| < \infty$ directly, but because of the argument you just mentioned. – Rocco van Vreumingen Aug 20 '17 at 21:30
  • So now we have a system of 4 equations. Is it doable to solve this by hand? I think it's difficult because all the variables appear in each of the equations (except once, for $\sigma$), the system is not linear, and guessing the values $\mu = \overline{X}/(2p)$ and $\nu = \overline{X}/(2(1-p))$ is also not a very good idea. – Rocco van Vreumingen Aug 23 '17 at 13:51
  • First let's look at the system of four equations in all its splendor: $$\begin{align} p\mu + (1-p) \nu & = \text{first moment} \\ p(\mu^2 + 1) + (1-p)(\nu^2 + \sigma^2) & = \text{second moment} \\ p(\mu^3+3\mu) + (1-p)(\nu^3 + 3\nu\sigma^2) & = \text{third moment} \\ p(\mu^4 + 6\mu^2 + 3) + (1-p)(\nu^4 + 6\nu^2\sigma^2 + 3\sigma^4) & = \text{fourth moment} \end{align}$$ – Michael Hardy Aug 23 '17 at 20:47
  • 1
    Yes, this is the system of equations I agree with. Then I noticed that the first moment equals $\overline{X}$, the second ... and the fourth equals $\overline{X^4}$; these averages of powers must be treated as known, while the variables $p,\mu,\nu,\sigma$ are unknown. Then I tried writing $\mu$ as a function of the other variables using the first moment, $\sigma$ as a function of the others using the second moment, and so on. I also tried rewriting the second moment using the first. But I still get ugly expressions. – Rocco van Vreumingen Aug 23 '17 at 22:07
  • I really don't know how to solve it by hand, but WolframAlpha gave a solution. All the solutions have $p=-1$, while we know $p\in [0,1]$. But I agree with this system of equations. So the conclusion must be that there is no moment estimator, or Wolfram does not give all solutions (assuming there was no typo). – Rocco van Vreumingen Aug 28 '17 at 14:47
  • You have several constraints here, one of which is $0\le p\le1,$ and another is that $\sigma^2\ge0.$ And the quantities on the right have to be a sequence of numbers that can occur as the first four sample moments. – Michael Hardy Aug 28 '17 at 18:36
  • Yes, I agree; maybe WolframAlpha will thus not help. – Rocco van Vreumingen Aug 28 '17 at 21:45
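Tying the thread together, here is a sketch (Python; the parameter values are arbitrary and the function name is mine) of the four theoretical moment equations from the comments, checked against Monte Carlo sample moments. Actually inverting the system for $(p,\mu,\nu,\sigma)$ from data is the step the thread leaves open.

```python
import random

def theoretical_moments(p, mu, nu, sigma):
    """First four moments of the mixture p*N(mu,1) + (1-p)*N(nu,sigma^2),
    using the conditional-moment summary from the comments."""
    m1 = p * mu + (1 - p) * nu
    m2 = p * (mu**2 + 1) + (1 - p) * (nu**2 + sigma**2)
    m3 = p * (mu**3 + 3 * mu) + (1 - p) * (nu**3 + 3 * nu * sigma**2)
    m4 = (p * (mu**4 + 6 * mu**2 + 3)
          + (1 - p) * (nu**4 + 6 * nu**2 * sigma**2 + 3 * sigma**4))
    return m1, m2, m3, m4

# Arbitrary illustrative parameters.
p, mu, nu, sigma = 0.4, 2.0, -1.0, 0.5

random.seed(1)
n = 500_000
xs = [random.gauss(mu, 1.0) if random.random() < p else random.gauss(nu, sigma)
      for _ in range(n)]

# Sample moments (the right-hand sides of the four equations).
sample = tuple(sum(x**k for x in xs) / n for k in (1, 2, 3, 4))

print(theoretical_moments(p, mu, nu, sigma))
print(sample)  # close to the theoretical values, per moment
```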