40

If $X\sim \Gamma(a_1,b)$ and $Y \sim \Gamma(a_2,b)$, I need to prove $X+Y\sim\Gamma(a_1+a_2,b)$ if $X$ and $Y$ are independent.

I am trying to apply the convolution formula for independent random variables and to multiply the gamma densities, but I am stuck.

user669083
  • 1,189
  • Hint: After multiplying $f_X(x)$ and $f_Y(z-x)$ and making sure that the limits are correct, you will get an integral for $f_{X+Y}(z)$ that can be transformed into a Beta function whose value is $B(a_1,a_2) = \frac{\Gamma(a_1)\Gamma(a_2)}{\Gamma(a_1+a_2)}$. – Dilip Sarwate Dec 03 '12 at 16:28

3 Answers

41

Now that the homework deadline is presumably long past, here is a proof for the case of $b=1$, adapted from an answer of mine on stats.SE, which fleshes out the details of what I said in a comment on the question.

If $X$ and $Y$ are independent continuous random variables, then the probability density function of $Z=X+Y$ is given by the convolution of the probability density functions $f_X(x)$ and $f_Y(y)$ of $X$ and $Y$ respectively. Thus, $$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(z-x)\,\mathrm dx. $$ But when $X$ and $Y$ are nonnegative random variables, $f_X(x) = 0$ when $x < 0$, and for a positive number $z$, $f_Y(z-x) = 0$ when $x > z$. Consequently, for $z > 0$, the above integral can be simplified to $$\begin{align} f_{X+Y}(z) &= \int_0^z f_X(x)f_Y(z-x)\,\mathrm dx\\ &=\int_0^z \frac{x^{a_1-1}e^{-x}}{\Gamma(a_1)}\frac{(z-x)^{a_2-1}e^{-(z-x)}}{\Gamma(a_2)}\,\mathrm dx\\ &= e^{-z}\int_0^z \frac{x^{a_1-1}(z-x)^{a_2-1}}{\Gamma(a_1)\Gamma(a_2)}\,\mathrm dx &\scriptstyle{\text{now substitute}}~ x = zt~ \text{and think}\\ &= e^{-z}z^{a_1+a_2-1}\int_0^1 \frac{t^{a_1-1}(1-t)^{a_2-1}}{\Gamma(a_1)\Gamma(a_2)}\,\mathrm dt & \scriptstyle{\text{of Beta}}(a_1,a_2)~\text{random variables}\\ &= \frac{e^{-z}z^{a_1+a_2-1}}{\Gamma(a_1+a_2)} \end{align}$$
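As a numerical sanity check (not part of the proof), one can evaluate the convolution integral above for specific shape parameters and compare it with the closed-form $\Gamma(a_1+a_2,1)$ density. The values $a_1=2$, $a_2=3$, $z=1.5$ below are illustrative choices, not from the answer.

```python
import math

def gamma_pdf(x, a):
    """Density of Gamma(a, 1) at x > 0."""
    return x ** (a - 1) * math.exp(-x) / math.gamma(a)

def convolution_pdf(z, a1, a2, n=100_000):
    """Numerically evaluate f_{X+Y}(z) = int_0^z f_X(x) f_Y(z-x) dx
    with a midpoint rule (midpoints avoid the endpoints, which can be
    singular when a shape parameter is below 1)."""
    h = z / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += gamma_pdf(x, a1) * gamma_pdf(z - x, a2)
    return total * h

a1, a2, z = 2.0, 3.0, 1.5
numeric = convolution_pdf(z, a1, a2)
closed_form = gamma_pdf(z, a1 + a2)  # the e^{-z} z^{a1+a2-1} / Gamma(a1+a2) result
print(numeric, closed_form)
```

The two printed values agree to many decimal places, as the derivation predicts.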

Dilip Sarwate
  • 26,411
  • 2
    Dilip, your response assumes that the parameter $B=1$ is the same for both distributions. What would the function for the sum of two independent Gamma distributions be where both sets of parameters $A$ and $B$ are different? Convolve the two functions $z^{A_i-1}\mathrm{Exp}(-B_i z)/\mathrm{Gamma}(A_i)$, where $i$ is index $1$ and $2$. – Tomas Kollen Mar 14 '14 at 21:26
  • 1
    I wonder what the down voter found so objectionable about this answer. – Dilip Sarwate Nov 07 '17 at 03:17
  • 3
    You waited till his homework deadline is presumably long past to answer?! How cruel. – A_for_ Abacus Mar 11 '18 at 20:33
  • 8
    @A_for_Abacus I gave a hint (actually a complete sketch of the answer) about the homework problem 12 minutes after the question was posted by the OP. So, there was no cruelty involved. – Dilip Sarwate Mar 12 '18 at 00:19
  • Hey guys, I'm confusing why the integral started from $0$ to $z$ not $0$to $\infty$? – ToadetteK May 11 '23 at 08:50
  • @ToadetteK Read the two sentences immediately preceding the displayed integral for the explanation of why the lower limit got changed from $-\infty$ to $0$ and the upper limit got changed from $\infty$ to $z$. Else, just break up $\int_{-\infty}^\infty$ into $\int_{-\infty}^0 + \int_0^z+\int_{z}^\infty$ and explicitly evaluate the first and third integrals in that sum, hopefully remembering what I said in the two sentences: that the integrand has value $0$ when $x < 0$ and also when $x > z$ as well as something from Calculus 101: the integral of $0$ over any interval has value $0$. – Dilip Sarwate May 11 '23 at 13:48
  • @DilipSarwate Thx a lot! I see. – ToadetteK Jun 08 '23 at 09:19
22

It is easier to use moment generating functions to prove this. For $t < \beta$, $$ M(t;\alpha,\beta ) = Ee^{tX} = \int_{0}^{+\infty} e^{tx} f(x;\alpha,\beta)\,dx = \int_{0}^{+\infty} e^{tx} \frac{\beta^\alpha}{\Gamma(\alpha)} x^{\alpha-1}e^{-\beta x}\,dx \\ = \frac{\beta^\alpha}{\Gamma(\alpha)} \int_{0}^{+\infty} x^{\alpha-1}e^{-(\beta - t) x}\,dx = \frac{\beta^\alpha}{\Gamma(\alpha)} \frac{\Gamma(\alpha)}{(\beta - t)^\alpha} = \frac{1}{(1- \frac{t}{\beta})^\alpha}. $$ Since $X$ and $Y$ are independent, $$M_{X + Y}(t) = M_{X}(t)M_{Y}(t). $$ So if $X \sim \Gamma(\alpha_1,\beta)$ and $Y \sim \Gamma(\alpha_2,\beta)$, then $$M_{X + Y}(t) = \frac{1}{(1- \frac{t}{\beta})^{\alpha_1}} \cdot \frac{1}{(1- \frac{t}{\beta})^{\alpha_2}} = \frac{1}{(1- \frac{t}{\beta})^{\alpha_1 + \alpha_2}}.$$ The MGF of the sum is again of the Gamma form, so by uniqueness of moment generating functions, $X + Y \sim \Gamma(\alpha_1 + \alpha_2, \beta)$.
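The closed form $M(t;\alpha,\beta) = (1-t/\beta)^{-\alpha}$ can be checked numerically by integrating $e^{tx}$ against the Gamma density. The parameters $\alpha=2$, $\beta=3$, $t=1$ below are arbitrary illustrative choices satisfying $t < \beta$.

```python
import math

def gamma_mgf_integral(t, alpha, beta, upper=60.0, n=200_000):
    """Approximate E[e^{tX}] = int_0^inf e^{tx} * beta^alpha/Gamma(alpha)
    * x^{alpha-1} e^{-beta x} dx by a midpoint rule on [0, upper].
    For t < beta the integrand decays like e^{-(beta - t)x}, so the
    tail beyond `upper` is negligible."""
    h = upper / n
    c = beta ** alpha / math.gamma(alpha)
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += math.exp(t * x) * c * x ** (alpha - 1) * math.exp(-beta * x)
    return total * h

alpha, beta, t = 2.0, 3.0, 1.0
closed_form = (1 - t / beta) ** (-alpha)  # = (2/3)^{-2} = 9/4 for these values
print(gamma_mgf_integral(t, alpha, beta), closed_form)
```

Both values come out to $9/4$ up to the quadrature error, matching the derivation.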

21

You may use an easier method: consider the moment generating function. Since $X$ and $Y$ are independent, $E(e^{(X+Y)t} )=E(e^{Xt}e^{Yt})=E(e^{Xt})E(e^{Yt})$, and this product is again the moment generating function of a gamma distribution. You can then read off the mean and variance from the moment generating function.
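Reading moments off the MGF means differentiating $M(t)=(1-t/\beta)^{-\alpha}$ at $t=0$, which gives mean $\alpha/\beta$ and variance $\alpha/\beta^2$. A quick finite-difference sketch, with arbitrarily chosen $\alpha=5$, $\beta=2$ (not values from the thread):

```python
def gamma_mgf(t, alpha, beta):
    """MGF of Gamma(alpha, beta) in the rate parametrization; valid for t < beta."""
    return (1 - t / beta) ** (-alpha)

alpha, beta, h = 5.0, 2.0, 1e-4
# Central differences at t = 0: M'(0) = E[X], M''(0) = E[X^2].
m1 = (gamma_mgf(h, alpha, beta) - gamma_mgf(-h, alpha, beta)) / (2 * h)
m2 = (gamma_mgf(h, alpha, beta) - 2 * gamma_mgf(0, alpha, beta)
      + gamma_mgf(-h, alpha, beta)) / h ** 2
mean, variance = m1, m2 - m1 ** 2
print(mean, variance)  # approximately alpha/beta = 2.5 and alpha/beta^2 = 1.25
```

The finite differences recover the theoretical moments to several decimal places.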

user2345215
  • 16,803
Mathematics
  • 4,621