
Consider a probability space $(\Omega,\mathscr{F},\mathbb{P})$ which supports a Poisson process $N$. Let $T_1$ and $T_2$ be the first two arrival times of $N$, and let $\xi_2$ be the inter-arrival time between the first two arrivals, i.e. $\xi_2:=T_2-T_1$. We know that the joint density of $(T_1,T_2)$ satisfies: $$f_{T_1,T_2}(t_1,t_2)=f_{T_1,\xi_2}(t_1,t_2-t_1)$$ I am trying to prove this property formally. While this has been discussed in other posts such as here or here, I haven't seen a formal proof of this fact (the latter post states that, to prove this, it is necessary to "go through the change of variable formalism with the Jacobian"). Maybe I am missing some basic fact about continuous distributions admitting densities that makes this immediate?

My attempt is as follows. By the definition of conditional densities: $$f_{T_1,T_2}(t_1,t_2)=f_{T_1}(t_1)f_{T_2|T_1}(t_2|t_1)$$ Using the law of total probability, conditioning on $\xi_2$: $$f_{T_2|T_1}(t_2|t_1)=\int_{x\geq0} f_{T_2|T_1,\xi_2}(t_2|t_1,x)f_{\xi_2}(x)\,\text{d}x$$ Yet the law of $T_2|T_1,\xi_2$ is degenerate, with $\mathbb{P}(T_2\leq t_2|T_1=t_1,\xi_2=x)=\mathbf{1}_{[t_1+x,\infty)}(t_2)$, hence: $$f_{T_2|T_1}(t_2|t_1)=\int_{x\geq0} \delta(x-(t_2-t_1))f_{\xi_2}(x)\,\text{d}x=f_{\xi_2}(t_2-t_1)$$ where $\delta(\cdot)$ is the Dirac delta function. Thus, using the independence of $T_1$ and $\xi_2$, we conclude: $$f_{T_1,T_2}(t_1,t_2)=f_{T_1}(t_1)f_{\xi_2}(t_2-t_1)=f_{T_1,\xi_2}(t_1,t_2-t_1)$$
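As a quick numerical sanity check (not a substitute for a proof), here is a small Monte Carlo sketch; I assume a rate of $\lambda=1$, and all names below are purely illustrative:

```python
import numpy as np

# Monte Carlo sanity check (illustrative only). For a rate-lam Poisson process,
# f_{T1}(t1) = lam*exp(-lam*t1) and f_{xi2}(x) = lam*exp(-lam*x), so the claimed
# identity predicts f_{T1,T2}(t1,t2) = lam**2 * exp(-lam*t2) on {0 <= t1 <= t2}.
rng = np.random.default_rng(0)
lam, n = 1.0, 2_000_000
xi1 = rng.exponential(1 / lam, n)        # first inter-arrival time, T1 = xi1
xi2 = rng.exponential(1 / lam, n)        # inter-arrival time between arrivals 1 and 2
t1_samples, t2_samples = xi1, xi1 + xi2

# Empirical density of (T1, T2) in a small box around (t1, t2) = (0.5, 1.2).
t1, t2, h = 0.5, 1.2, 0.05
in_box = (np.abs(t1_samples - t1) < h / 2) & (np.abs(t2_samples - t2) < h / 2)
empirical = in_box.mean() / h**2
predicted = lam**2 * np.exp(-lam * t2)   # = f_{T1}(t1) * f_{xi2}(t2 - t1)
print(empirical, predicted)              # agree to roughly two decimal places
```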

Is the above proof adequate? Is there another way to reach the same conclusion?

Ultimately I am trying to generalize to the joint distribution of $T_1,\dots,T_n$; that is, I am seeking a method which lends itself nicely to the general case.

  • What are you assuming for the definition of Poisson process? Depending on the definition, it might be that the desired statement is nearly immediate, or it might be that your proof might not be enough. – Ziv Apr 29 '24 at 14:55
  • @Ziv good point, I hadn't thought about it that way. For my purposes, I take for granted that Poisson increments are independent; that for a given $t$ the value of the process follows a Poisson distribution; and that the inter-arrival times follow an exponential distribution; I am not really attempting a proof from first principles, if you understand me. I feel there is some very basic fact about independence and conditional densities that I am missing and that is preventing me from immediately grasping that $f_{T_1,T_2}(a,b)=f_{T_1,T_2-T_1}(a,b-a)$ (even though intuitively it makes sense)... – Morris Fletcher Apr 29 '24 at 17:34
  • @Ziv I refer back to the second link, where the answerer stated that one should "go through the change of variable formalism with the Jacobian", so there seems to be some formal proof; however, I am not sure what the poster meant, and they haven't checked their account in a while. – Morris Fletcher Apr 29 '24 at 17:37

2 Answers


Thanks for clarifying the question in the comments. The intuition you have would be enough if we were talking about probabilities of events. But because we're talking about probability densities, we need to be more careful. This is where the Jacobian comes in.

With the "exponential interarrival times" definition of Poisson processes, $T_2 = T_1 + \xi_2$ by definition. (In fact, this is true of any renewal process, not just Poisson processes.) This means, roughly speaking, that $$ f_{T_1, T_2}(x, y) \, \mathrm{d}x \, \mathrm{d}y = f_{T_1, \xi_2}(x', y') \, \mathrm{d}x' \, \mathrm{d}y', $$ where $x' = x$ and $y' = y - x$. So you need to relate $\mathrm{d}x \, \mathrm{d}y$ to $\mathrm{d}x' \, \mathrm{d}y'$. In particular, you should find these are equal. Making this formal involves looking at the Jacobian of the function mapping $(x, y)$ to $(x', y')$.
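For what it's worth, the Jacobian computation can be sketched symbolically; this is just an illustration with sympy, not part of the argument:

```python
import sympy as sp

# Change of variables (x, y) -> (x', y') = (x, y - x): the Jacobian determinant
# is 1, so dx dy = dx' dy' (illustrative sketch of the computation described above).
x, y = sp.symbols("x y", positive=True)
xp, yp = x, y - x                         # x' = x, y' = y - x
J = sp.Matrix([xp, yp]).jacobian([x, y])
print(J)        # Matrix([[1, 0], [-1, 1]])
print(J.det())  # 1
```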

Is your existing argument enough? Your existing argument can likely be viewed as doing the change of variables, but in a way that implicitly assumes (or perhaps implicitly shows?) the Jacobian determinant is $1$ everywhere. I believe the relevant step is when you introduce the Dirac delta function.

Ziv

Proof 1

We define the random vector $(T_1,T_2)$ as a function of the inter-arrival times $(\xi_1,\xi_2)$: $$(T_1,T_2)=(g(\xi_1,\xi_2),h(\xi_1,\xi_2))$$ where $g(x,y)=x$ and $h(x,y)=x+y$. Note that the map $(g,h)$ is bijective, with inverse $(u,v)$ given by: $$u(s,t):=s,\qquad v(s,t):=t-s$$ Hence the Jacobian $J$ of the inverse is: $$J =\begin{bmatrix} \frac{\partial u}{\partial s}& \frac{\partial u}{\partial t}\\ \frac{\partial v}{\partial s}& \frac{\partial v}{\partial t} \end{bmatrix} =\begin{bmatrix} 1 & 0 \\ -1 & 1 \end{bmatrix} $$ so $|J|=1$. The joint density of $(T_1,T_2)$ is then: $$f_{T_1,T_2}(t_1,t_2) =|J|\,f_{\xi_1,\xi_2}(u(t_1,t_2),v(t_1,t_2)) =f_{\xi_1,\xi_2}(t_1,t_2-t_1) =f_{T_1,\xi_2}(t_1,t_2-t_1)$$ where the last equality uses $T_1=\xi_1$. For more details see the Wikipedia entry on the change of variables in the density function for vector-valued random variables.
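Purely as an illustration (assuming a Poisson process of rate $\lambda$, with hypothetical variable names), the substitution in Proof 1 can be carried out symbolically for the exponential inter-arrival densities:

```python
import sympy as sp

# Carrying out Proof 1 symbolically for Exp(lam) inter-arrival times (sketch only).
t1, t2, lam = sp.symbols("t1 t2 lam", positive=True)

f_xi = lambda z: lam * sp.exp(-lam * z)   # density of an Exp(lam) inter-arrival time
u, v = t1, t2 - t1                        # inverse map: (u, v)(t1, t2) = (t1, t2 - t1)
abs_det_J = 1                             # |J| = 1, as computed above

# f_{xi1,xi2}(u, v) factorises by independence of xi1 and xi2.
f_T1_T2 = abs_det_J * f_xi(u) * f_xi(v)
print(sp.simplify(f_T1_T2))               # lam**2 * exp(-lam*t2), valid on 0 <= t1 <= t2
```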

Proof 2

I constructed a proof which uses the definition of the joint density function as the iterated partial derivative of the joint cumulative distribution function.

Let $t_1\leq t_2$; then the joint distribution of $(T_1,T_2)$ is given by: \begin{align} \mathbb{P}(T_1\leq t_1,T_2\leq t_2) &=\int_0^{t_1}\left(\int_0^{t_2} f_{T_1,T_2}(x,y)\,\text{d}y\right)\text{d}x\\ &=\int_0^{t_1}\left(\int_{x}^{t_2} f_{T_1,T_2}(x,y)\,\text{d}y\right)\text{d}x\\ &=\int_0^{t_1}f_{T_1}(x)\left(\int_{x}^{t_2} f_{T_2|T_1}(y|x)\,\text{d}y\right)\text{d}x\tag{1}\\ &=\int_0^{t_1}f_{T_1}(x)\left(\mathbb{P}(T_2\leq t_2|T_1=x)-\mathbb{P}(T_2\leq x|T_1=x)\right)\text{d}x\tag{2}\\ &=\int_0^{t_1}f_{T_1}(x)\,\mathbb{P}(\xi_2\leq t_2-x|T_1=x)\,\text{d}x\\ &=\int_0^{t_1}f_{T_1}(x)\,\mathbb{P}(\xi_2\leq t_2-x)\,\text{d}x\tag{3}\\ \end{align} where the second equality holds because $f_{T_1,T_2}(x,y)=0$ for $y<x$ (since $T_2\geq T_1$); $(1)$ uses the definition of conditional densities; $(2)$ uses the definition of conditional probability w.r.t. an event of zero probability (see the note below), together with $\mathbb{P}(T_2\leq x|T_1=x)=0$ and $\xi_2=T_2-T_1$; and $(3)$ uses the independence of $T_1$ and $\xi_2$. Now differentiating w.r.t. $t_1$ and then $t_2$: \begin{align} &\frac{\partial\mathbb{P}}{\partial t_1}(T_1\leq t_1,T_2\leq t_2) =f_{T_1}(t_1)\,\mathbb{P}(\xi_2\leq t_2-t_1) \\[3pt] &\frac{\partial^2\mathbb{P}}{\partial t_1\partial t_2}(T_1\leq t_1,T_2\leq t_2) =f_{T_1}(t_1)\,f_{\xi_2}(t_2-t_1) \end{align} Hence, by the independence of $T_1$ and $\xi_2$: $$f_{T_1,T_2}(t_1,t_2) =\frac{\partial^2\mathbb{P}}{\partial t_1\partial t_2}(T_1\leq t_1,T_2\leq t_2) =f_{T_1}(t_1)f_{\xi_2}(t_2-t_1) =f_{T_1,\xi_2}(t_1,t_2-t_1)$$
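The differentiation in Proof 2 can also be checked symbolically in the exponential case; this is only a sketch, assuming rate $\lambda$ and illustrative names:

```python
import sympy as sp

# Proof 2, run symbolically for Exp(lam) inter-arrival times (sketch only, t1 <= t2).
x, t1, t2, lam = sp.symbols("x t1 t2 lam", positive=True)

f_T1 = lam * sp.exp(-lam * x)             # density of T1 ~ Exp(lam)
cdf_xi2 = 1 - sp.exp(-lam * (t2 - x))     # P(xi2 <= t2 - x), using independence from T1
joint_cdf = sp.integrate(f_T1 * cdf_xi2, (x, 0, t1))

density = sp.diff(joint_cdf, t1, t2)      # differentiate w.r.t. t1, then t2
print(sp.simplify(density))               # lam**2 * exp(-lam*t2) = f_{T1}(t1) * f_{xi2}(t2 - t1)
```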

Note: equality $(2)$ is justified, despite the event $B:=\{\omega:T_1(\omega)=x\}$ having probability zero, because we can find a sequence of events $B_1,B_2,\dots$ with $\mathbb{P}(B_n)>0$ and $B_n\rightarrow B$, for example: $$B_n:=\left\{\omega:T_1(\omega)\in\left[x-\frac{1}{n},x\right]\right\}$$ For more details on conditioning on a single value of a continuous random variable, see this Stack Exchange question or this Cross Validated discussion, as well as the Wikipedia entries on conditioning on an event of probability zero, the limiting procedure used to define conditional probabilities, and regular conditional probabilities.