
I have data consisting of service times that I want to model with a gamma distribution, and I want to estimate its parameters by the method of moments.

I get the following theoretical moments: $$ \begin{split} \mathbb{E}[X] &= \frac{r}{\lambda}\\ \mathbb{E}\left[X^2\right] &= \operatorname{Var}[X] + \mathbb{E}[X]^2 = \frac{r}{\lambda^2} + \frac{r^2}{\lambda^2} = \frac{r(r+1)}{\lambda^2} \end{split} $$ Thus I get that $$ \begin{split} r &= \lambda \mathbb{E}[X] \\ \lambda^2 &= \frac{r(r+1)}{\mathbb{E}\left[X^2\right]} \end{split} $$

But when I try to compute the estimators for $\lambda$ and $r$, I get wrong results. What have I done wrong in my derivation of the formula for $\lambda$ above?

gt6989b
  • What do you mean by "wrong results"? – Thomas Mar 11 '20 at 12:36
  • @Thomas I get a lambda value that I'm sure is wrong. –  Mar 11 '20 at 12:40
  • Did you try comparing with https://math.stackexchange.com/questions/3104688/method-of-moments-with-a-gamma-distribution ? – Thomas Mar 11 '20 at 12:45
  • One suggestion: draw some synthetic data from a gamma distribution whose parameters you know a priori (perhaps close to the ones you expect), and check whether your estimator formulas recover consistent values; a sketch of this check follows below. – Thomas Mar 11 '20 at 14:15
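A minimal sketch of that suggestion in Python (NumPy assumed; the "true" values $r=2$, $\lambda=3$ are arbitrary choices for illustration). It compares the sample raw moments against their theoretical counterparts, so any candidate estimator formula can be checked against known parameters:

```python
import numpy as np

# Hypothetical "true" parameters, chosen arbitrarily for the check.
r_true, lam_true = 2.0, 3.0          # shape r, rate lambda

rng = np.random.default_rng(0)
# NumPy parameterizes the gamma by shape and *scale*, where scale = 1/rate.
x = rng.gamma(shape=r_true, scale=1.0 / lam_true, size=100_000)

# Sample raw moments vs. their theoretical values.
print(x.mean(),      r_true / lam_true)                    # E[X]   = r/lambda
print((x**2).mean(), r_true * (r_true + 1) / lam_true**2)  # E[X^2] = r(r+1)/lambda^2
```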

1 Answer


You have not actually solved the system of two equations in two unknowns. The solution for $r, \lambda$ in terms of the sample raw moments is $$\hat r_{\text{MOM}} = \frac{(\bar x)^2}{\overline{x^2} - (\bar x)^2}, \quad \hat \lambda _{\text{MOM}}= \frac{\bar x}{\overline{x^2} - (\bar x)^2},$$ where $\bar x = \frac{1}{n} \sum_{i=1}^n x_i$ is the sample mean, and $\overline{x^2}= \frac{1}{n} \sum_{i=1}^n x_i^2$ is the mean of the squares (the second sample raw moment).
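These formulas translate directly into code. A minimal sketch in Python (assuming NumPy and that the observations sit in an array; the function name `gamma_mom` is mine, not from the original post):

```python
import numpy as np

def gamma_mom(x):
    """Method-of-moments estimates (r_hat, lam_hat) for a Gamma(r, lambda) sample."""
    x = np.asarray(x, dtype=float)
    m1 = x.mean()          # sample mean, x-bar
    m2 = (x**2).mean()     # second sample raw moment, mean of squares
    denom = m2 - m1**2     # = (1/n) * sum (x_i - x-bar)^2
    r_hat = m1**2 / denom
    lam_hat = m1 / denom
    return r_hat, lam_hat
```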

A simple calculation $$\begin{align*} \sum_{i=1}^n x_i^2 &= \sum_{i=1}^n (x_i - \bar x + \bar x)^2 \\ &= \sum_{i=1}^n \left( (x_i - \bar x)^2 + 2(x_i - \bar x)\bar x + (\bar x)^2 \right) \\ &= \sum_{i=1}^n (x_i - \bar x)^2 + 0 + \sum_{i=1}^n (\bar x)^2 \\ &\ge 0 + 0 + n (\bar x)^2 \end{align*}$$ shows, after dividing by $n$, that the denominator $\overline{x^2} - (\bar x)^2 = \frac{1}{n}\sum_{i=1}^n (x_i - \bar x)^2$ is always nonnegative, and strictly positive whenever $x_i \ne x_j$ for some $i \ne j$ in the sample. Since the support of $X$ is $(0,\infty)$, we have $\bar x > 0$, so $\hat \lambda_{\text{MOM}}$ (and hence $\hat r_{\text{MOM}}$) is strictly positive.
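To close the loop numerically, a short sketch (reusing the array `x` and the function `gamma_mom` from the snippets above) verifies that this denominator is exactly the biased sample variance, and that the estimates land near the generating parameters:

```python
# The denominator m2 - m1^2 equals the biased sample variance (ddof=0).
m1, m2 = x.mean(), (x**2).mean()
print(np.isclose(m2 - m1**2, np.var(x)))  # True: np.var defaults to ddof=0

# The estimates should be close to the true (r, lambda) = (2, 3) used above.
print(gamma_mom(x))
```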

heropup