
Suppose that

$$Z=\dfrac{X}{X+Y}$$

$$X \sim Gamma(a,\lambda)$$ $$ Y \sim Gamma(b,\lambda)$$

with $X$ and $Y$ independent. I would like to see if it might be possible to determine the distribution of $Z$ without resorting to Jacobians. Is it possible to do a proof by representation or by way of moment generating functions? For example, I tried using iterated expectations over

$$ \mathbb{E}\left(e^{tZ}\right) = \mathbb{E}\left(e^{t\cdot\frac{X}{X+Y}}\right) $$

where

\begin{align*} \mathbb{E}\left(\mathbb{E}\left(e^{t\cdot\frac{X}{X+Y}}\right)\bigl\vert X+Y=c\right) &= \mathbb{E}\left(\mathbb{E}\left(e^{t\cdot\frac{c-Y}{c}}\right)\right) \\ &=\mathbb{E}\left(\mathbb{E}\left(e^{t\cdot \left(1-\frac{Y}{c}\right)}\right)\right) \\ &= e^t \cdot \mathbb{E}\left(\mathbb{E}\left(e^{t \left(-\frac{Y}{c}\right)}\right)\right) \\ \end{align*}

which seems to say I need the MGF of $-\frac{Y}{c}$. Since $Y \sim Gamma(b, \lambda)$ with rate $\lambda$, we have that $\dfrac{Y}{c} \sim Gamma\left(b, c\lambda\right)$. With the negative sign on $\dfrac{Y}{c}$, I am unsure how this would work. Should I condition on $X$ instead of $X+Y$? Or do MGFs just not work well here?
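As a sanity check on the $\text{Beta}(a,b)$ law for $Z$ that the comments and answer below point toward, here is a minimal stdlib-only simulation sketch. The parameter values are arbitrary, and the rate parameterization is assumed (`random.gammavariate` takes shape and *scale*, so the scale is $1/\lambda$):

```python
import math
import random

random.seed(0)

a, b, lam = 2.5, 4.0, 1.3  # arbitrary test parameters; lam is the rate
n = 200_000

# random.gammavariate takes (shape, scale); scale = 1/rate
zs = []
for _ in range(n):
    x = random.gammavariate(a, 1 / lam)
    y = random.gammavariate(b, 1 / lam)
    zs.append(x / (x + y))

mean = sum(zs) / n
var = sum((z - mean) ** 2 for z in zs) / n

# Moments of Beta(a, b) for comparison
beta_mean = a / (a + b)
beta_var = a * b / ((a + b) ** 2 * (a + b + 1))

print(mean, beta_mean)
print(var, beta_var)
```

The empirical mean and variance of $Z$ should agree with the Beta moments to within Monte Carlo error; note also that the common rate $\lambda$ cancels out of $Z$ entirely.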

StubbornAtom
user321627
  • This page has useful information just above the "Approximate (Limit) Relationships" – RobertTheTutor May 13 '21 at 03:09
  • You can probably go with the MGF. However, there are easier ways to do this, even without a Jacobian. You can find a few ways here: https://math.stackexchange.com/questions/412615/x-y-are-independent-exponentially-distributed-then-what-is-the-distribution-of-x . – JRC May 13 '21 at 03:20
  • This sort of iteration of expectation operators makes sense when you're conditioning on a random variable, but not when you're conditioning on an event. – Michael Hardy May 13 '21 at 03:31
  • If you know $\psi(s)= \operatorname E(e^{sU})$ as a function of $s,$ and you know $\Pr(U\ge0)=1,$ then the moment-generating function of $-U$ is $s\mapsto \operatorname E(e^{s(-U)}) = \operatorname E(e^{(-s)U})$ and you have the same function evaluated at $-s$ that you formerly evaluated at $s.$ $\qquad$ – Michael Hardy May 13 '21 at 04:32
  • @Kolmogorov The link contains examples for the exponential distribution. Are you saying it can be extended to the Gamma case I have? – user321627 May 13 '21 at 04:32
  • 1
    $$\require{cancel} \begin{align} & \xcancel{ \operatorname E\left( \operatorname E\left( e^{t\cdot\frac X{X+Y}}\right),\Big\vert,X+Y \right)} \ {} \ & \operatorname E\left( \operatorname E\left( e^{t\cdot\frac X{X+Y}},\Big\vert,X+Y \right) \right) \end{align} $$ – Michael Hardy May 13 '21 at 04:33
  • The conditional expectation $\operatorname E\left( e^{t\cdot\frac X{X+Y}} \,\big\vert\, X+Y \right)$ is a random variable that is a function of $X+Y,$ and the expected value of that is the OUTER expectation. If instead of $\operatorname E\left( e^{t\cdot\frac X{X+Y}} \,\Big\vert\, X+Y \right)$ you have $\operatorname E\left( e^{t\cdot\frac X{X+Y}} \,\Big\vert\, X+Y=c \right)$ then you get a function of $c,$ a constant, and that is a constant rather than properly a random variable. $\qquad$ – Michael Hardy May 13 '21 at 04:34
  • If you find $\operatorname E\left( e^{t\cdot\frac X{X+Y}} \,\Big\vert\, X+Y=c \right)$ as a function of $c,$ then evaluate that same function at $X+Y$ instead of $c,$ then maybe you'll get somewhere. – Michael Hardy May 13 '21 at 04:37
  • Assuming $a$ is the shape parameter, you are better off finding the mgf of $\ln Z$ (which is just the raw moment of $Z$) and match the moments with that of a Beta distribution. This is allowed because $Z$ is bounded. On the other hand, mgf of a Beta distribution is not tractable. – StubbornAtom May 13 '21 at 06:47
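StubbornAtom's moment-matching suggestion can be checked numerically. Since $Z$ is bounded, its raw moments $\mathbb E\left(Z^t\right) = \mathbb E\left(e^{t \ln Z}\right)$ determine its distribution, and for a $\text{Beta}(a,b)$ law they equal $\frac{\Gamma(a+t)\,\Gamma(a+b)}{\Gamma(a)\,\Gamma(a+b+t)}$. A rough Monte Carlo sketch (arbitrary parameters, stdlib only):

```python
import math
import random

random.seed(1)

a, b, lam = 3.0, 2.0, 0.7  # arbitrary test parameters; lam drops out of Z
n = 200_000

zs = []
for _ in range(n):
    x = random.gammavariate(a, 1 / lam)  # gammavariate takes (shape, scale)
    y = random.gammavariate(b, 1 / lam)
    zs.append(x / (x + y))

def beta_raw_moment(a, b, t):
    # E[Z^t] for Z ~ Beta(a, b): Gamma(a+t)Gamma(a+b) / (Gamma(a)Gamma(a+b+t))
    return math.gamma(a + t) * math.gamma(a + b) / (math.gamma(a) * math.gamma(a + b + t))

# Compare simulated raw moments against the Beta formula for a few t
max_err = max(
    abs(sum(z ** t for z in zs) / n - beta_raw_moment(a, b, t))
    for t in (1.0, 2.0, 3.5)
)
print(max_err)
```

The discrepancy should be on the order of the Monte Carlo standard error, consistent with $Z \sim \text{Beta}(a,b)$.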

1 Answer


The joint density of $(X,Y)$ is $$ \text{constant} \times x^{a-1} e^{-\lambda x} \cdot y^{b-1} e^{-\lambda y}. $$

Now put $c-y$ in place of $x$: \begin{align} & \text{constant}\times (c-y)^{a-1} y^{b-1} e^{-\lambda(c-y)} e^{-\lambda y} \\[8pt] = {} & \text{constant} \times \left( 1 - \frac y c \right)^{a-1} \left( \frac y c \right)^{b-1} \\[8pt] = {} & \text{constant} \times (1-u)^{a-1} u^{b-1} \end{align} where $u = y/c.$ The factor $e^{-\lambda(c-y)} e^{-\lambda y} = e^{-\lambda c}$ and the powers of $c$ do not depend on $y,$ so they have been absorbed into the constants. (The "constants" are not all equal to each other.) Thus, conditional on $X+Y=c,$ the variable $U = Y/c$ has a $\text{Beta}(b,a)$ density that does not depend on $c,$ and so $Z = X/(X+Y) = 1-U \sim \text{Beta}(a,b)$.
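One way to see this argument numerically: draw $(X,Y)$ pairs, keep those with $X+Y$ in a narrow band around $c$, and compare $Y/(X+Y)$ (which approximates $y/c$ on the band) against the $\text{Beta}(b,a)$ moments implied by the kernel $(1-u)^{a-1}u^{b-1}$. A rough sketch with arbitrary parameters, stdlib only:

```python
import random

random.seed(2)

a, b, lam = 2.0, 3.0, 1.0  # arbitrary test parameters
c, width = 5.0, 0.05       # condition on X+Y lying within c +/- width
n = 300_000

us = []
for _ in range(n):
    x = random.gammavariate(a, 1 / lam)  # gammavariate takes (shape, scale)
    y = random.gammavariate(b, 1 / lam)
    s = x + y
    if abs(s - c) < width:
        us.append(y / s)  # approximately y/c on the narrow band

m = sum(us) / len(us)
v = sum((u - m) ** 2 for u in us) / len(us)

# Beta(b, a) moments, matching the kernel (1-u)^{a-1} u^{b-1}
beta_mean = b / (a + b)
beta_var = a * b / ((a + b) ** 2 * (a + b + 1))
print(m, beta_mean)
print(v, beta_var)
```

Changing $c$ should not change the agreement, reflecting the fact that the conditional density of $U$ does not depend on $c$.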

  • Can you elaborate on the $c-y = x$ part? Am I right to say that the first equation is the joint, $f(x,y)$, while the second fixes $x$ and $y$ as $x+y=c$? Intuitively it seems like the Jacobian form is used, but I can't quite see it. It seems like it went from $f_{X,Y}(x,y)$ to $f_U(u) \sim Beta$ where $U=\frac{X}{X+Y}$. Thank you! – user321627 May 13 '21 at 05:55
  • 1
    @user321627 : The Jacobian is constant on the line $x+y=c,$ i.e. it is the same at all points on that line. It becomes part of the normalizing constant. – Michael Hardy May 13 '21 at 23:21
  • 1
    @user321627 : I haven't fully decided what the best way is to elaborate on this point in this kind of context. But a couple of things are relevant. Remember that $$ \int_0^1 x^{\alpha-1} (1-x)^{\beta-1}, dx = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}. $$ And exercise will tell you that once you've got that you find that $$ \int_0^c x^{\alpha-1} (c-x)^{\beta-1}, dx = c^{\alpha+\beta-1} \cdot \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}. $$ And$,\ldots\qquad$ – Michael Hardy May 15 '21 at 16:26
  • 1
    $\ldots,$recall that $$ f_{Y,\mid,X=x} (y) = \text{constant} \times f_{X,Y}(x,y) $$ where "constant" means not depending on $y.$ This is similar to that, but the line is diagonal. $\qquad$ – Michael Hardy May 15 '21 at 16:28
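The scaled integral identity quoted a couple of comments above can itself be checked numerically with a simple midpoint rule; the parameter values below are arbitrary, chosen with $\alpha, \beta > 1$ so the integrand stays bounded at the endpoints:

```python
import math

alpha, beta, c = 2.5, 3.0, 2.0  # arbitrary; alpha, beta > 1 keeps endpoints finite
n = 100_000
h = c / n

# Midpoint-rule quadrature of the left-hand side: integral of x^(a-1) (c-x)^(b-1) over [0, c]
lhs = h * sum(
    ((k + 0.5) * h) ** (alpha - 1) * (c - (k + 0.5) * h) ** (beta - 1)
    for k in range(n)
)

# Closed form: c^(alpha+beta-1) * Gamma(alpha)Gamma(beta) / Gamma(alpha+beta)
rhs = c ** (alpha + beta - 1) * math.gamma(alpha) * math.gamma(beta) / math.gamma(alpha + beta)

print(lhs, rhs)
```

With a smooth integrand, the midpoint rule at this resolution agrees with the closed form to many digits.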