
Exponential random variables, and hence the interarrival times of a Poisson process, are memoryless: $$P(X_j>t+s\mid X_j>s)=P(X_j>t)=e^{-\lambda t}$$ where $X_j$ is the $j$-th interarrival time. In particular, the expected remaining waiting time is $E[X_j-s\mid X_j>s]=1/\lambda$.
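(A minimal simulation sketch of these two facts; the values of `lam`, `s`, `t` below are arbitrary illustrative choices, not part of the argument.)

```python
# Monte Carlo check of the memoryless survival identity
# P(X > t+s | X > s) = e^{-lambda t} and of E[X - s | X > s] = 1/lambda.
import numpy as np

rng = np.random.default_rng(0)
lam, s, t = 2.0, 0.7, 1.3
x = rng.exponential(scale=1/lam, size=1_000_000)   # exponential interarrival times

tail = x[x > s]
print(np.mean(tail > s + t), np.exp(-lam * t))     # conditional vs. unconditional survival
print(np.mean(tail - s), 1/lam)                    # expected remaining time vs. 1/lambda
```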

Now let's say an inspector arrives uniformly at random. Model this with an rv $S\sim U(0,n)$ where $n$ is the time event $j+1$ occurs. Assume they arrive between events $j$ and $j+1$, and rescale so that $s$ is the time elapsed after event $j$. Then by memorylessness, $E[X_{j+1}-s\mid X_{j+1}>s]=1/\lambda$. And since $P(S\in\mathrm ds,\ X_{j+1}>s)\propto e^{-\lambda s}\,\mathrm ds$, the expected time already elapsed since event $j$, call it $W$, is given by

$$E[W]=\int_0^\infty s\lambda e^{-\lambda s}\,ds = 1/\lambda$$

This implies that $E[X_{j+1}] = E[X_{j+1}-s\mid X_{j+1}>s] + E[W] = 2/\lambda$. Is this correct? Some of my logic seems suspect...
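As a rough sanity check, here is a minimal simulation sketch of the $2/\lambda$ figure. It is not exactly the $U(0,n)$ setup above: the inspector is dropped uniformly on a long fixed horizon, so boundary effects are negligible, and all constants are arbitrary.

```python
# Estimate the mean length of the interarrival interval that contains a
# uniformly placed inspection time, for a rate-lambda Poisson process
# observed on a long horizon.
import numpy as np

rng = np.random.default_rng(1)
lam, horizon, trials = 2.0, 1000.0, 2000
lengths = []
for _ in range(trials):
    # arrival times of a Poisson process covering [0, horizon]
    gaps = rng.exponential(scale=1/lam, size=int(3 * lam * horizon))
    arrivals = np.cumsum(gaps)
    s = rng.uniform(0, horizon)                 # inspection time
    k = np.searchsorted(arrivals, s)            # index of first arrival after s
    left = arrivals[k - 1] if k > 0 else 0.0    # previous arrival (or time 0)
    lengths.append(arrivals[k] - left)          # length of the containing interval

print(np.mean(lengths), 2/lam)                  # should be close to 2/lambda
```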

Tejas Rao
  • Unfortunately, $U(0,\infty)$ is an impossible distribution. – Snoop Dec 02 '24 at 20:49
  • You are correct of course -- fixed. – Tejas Rao Dec 02 '24 at 21:36
  • "Model this with an rv $S\sim\mathcal U(0,n)$ where $n$ is the time event $j+1$ occurs." This is a conditional distribution, and because $n=\sum_{k=1}^{j+1} X_k$ . then, $X_{j+1}$ is not exponentially distributed under this condition. You are then further conditioning on the event $S>n-X_{j+1}$ (that there are exactly $j$ arrivals until time $S$. – Graham Kemp Dec 03 '24 at 01:17
  • 1
    The inspector can arrive at any time independent of the Poisson process to get your result of $\frac2\lambda$, except that to get that result the inspector must arrive after the first event and so the arrival time is not independent of the process. You could have the inspector arriving at some arbitrary time $s \gg \frac1 \lambda$ and get an expectation of almost $\frac2\lambda$. – Henry Dec 03 '24 at 01:56

1 Answer


"Some of my logic seems suspect..."

Indeed. For example:

When you condition on knowing $n$, the time at which arrival $j+1$ occurs, then $X_{j+1}$, the interarrival time between events $j$ and $j+1$, cannot be exponentially distributed. Among other things, that condition prohibits it from exceeding $n$.

Let $T_k = \sum_{i=1}^k X_i$.

Since the sum of iid exponential random variables has an Erlang distribution, $f_{T_k}(x)=\dfrac{\lambda^k x^{k-1}\mathrm e^{-\lambda x}}{\Gamma(k)}$ for $x>0$, we evaluate:

$$\begin{align}f_{X_{j+1}\mid T_{j+1}}(t\mid n) &= \dfrac{f_{X_{j+1}}(t)\,f_{T_j}(n-t)}{f_{T_{j+1}}(n)}\\ &= \lambda\mathrm e^{-\lambda t}\cdot\dfrac{\lambda^j(n-t)^{j-1}\mathrm e^{-\lambda(n-t)}}{\Gamma(j)}\cdot\dfrac{\Gamma(j+1)}{\lambda^{j+1}n^j\mathrm e^{-\lambda n}}\,\mathbf 1_{t\in(0,n)}\\&=\dfrac{j(n-t)^{j-1}}{n^{j}}\,\mathbf 1_{t\in(0,n)}\end{align}$$
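For instance, integrating $t$ against this density gives

$$E\left[X_{j+1}\mid T_{j+1}=n\right]=\int_0^n t\,\frac{j(n-t)^{j-1}}{n^{j}}\,\mathrm dt=\frac{n}{j+1},$$

which is bounded by $n$, unlike the unconditional mean $1/\lambda$.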

Thus $X_{j+1}$, when conditioned on $T_{j+1}=n$, does not have a memoryless distribution.
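A small rejection-sampling sketch of this (the values of $\lambda$, $j$, $n$ and the tolerance `eps` below are arbitrary test choices): conditioning on $T_{j+1}$ falling within `eps` of $n$, the accepted $X_{j+1}$ values should have mean close to $n/(j+1)$ from the density above, not $1/\lambda$.

```python
# Approximate the conditional law of X_{j+1} given T_{j+1} ~= n by rejection
# sampling, and compare its mean with n/(j+1) and with the unconditional 1/lambda.
import numpy as np

rng = np.random.default_rng(2)
lam, j, n, eps = 1.5, 4, 2.0, 0.02

x = rng.exponential(scale=1/lam, size=(2_000_000, j + 1))   # rows: (X_1, ..., X_{j+1})
keep = np.abs(x.sum(axis=1) - n) < eps                      # accept if T_{j+1} is near n
samples = x[keep, -1]                                       # accepted X_{j+1} values

print(samples.mean(), n / (j + 1), 1 / lam)                 # ~0.4, 0.4, 0.667
```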

Graham Kemp