
Let $X_0=1$ and $X_n = 1 + \sum_{i=1}^n Y_i$, $\ n\geqslant 1$ where the $Y_i$ are i.i.d. with $\mathbb P(Y_1=2)=\mathbb P(Y_1=-1)=\frac12$. Define $$H_0 = \inf\{n>0: X_n = 0\}. $$ Let $\varphi(s) = \mathbb E[s^{H_0}]$ be the generating function of $H_0$. Show that $\varphi$ satisfies $$s\varphi^3 - 2\varphi + s=0.\tag1 $$

Using Mathematica I found that the only real root of $(1)$ is $$ \frac{2^{1/3} \left(\sqrt{81 s^6-96 s^3}-9 s^3\right)^{2/3}+4\cdot 3^{1/3} s}{6^{2/3} s (\sqrt{81 s^6-96 s^3}-9 s^3)^{1/3}}, $$ which clearly would be extremely tedious to compute by hand. Any suggestions for how to prove this?

Math1000
  • 38,041

1 Answer


In words: this is a biased random walk started at state $1$, and we are interested in the (defective) distribution of the waiting time until the walk hits state $0$. The crucial point: since $-1$ is the only "left step", there are no overshoot issues; the walk cannot jump over $0$.

You should be able to convince yourself that cubing the generating function gives the generating function for starting at state $1$ with an absorbing state at $-2$: descending three levels is the concatenation of three i.i.d. single-level descents, each with generating function $\varphi$. Call this $\gamma(s) = \varphi(s)^3$. Equivalently, $\gamma$ is the generating function for hitting $0$ starting from state $3$.
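This cubing step can be checked numerically (a sketch, not part of the proof; `hit_probs` is an ad-hoc helper, and exact `Fraction` arithmetic avoids rounding): compute the hitting-time distribution of $0$ from starts $1$ and $3$ by dynamic programming, and compare the series for $\gamma$ against the cube of the series for $\varphi$.

```python
# Exact check that gamma(s) = phi(s)^3, coefficient by coefficient.
from fractions import Fraction

def hit_probs(start, nmax):
    """P(H_0 = n), n = 0..nmax, for the walk with steps +2, -1 each w.p. 1/2."""
    dist = {start: Fraction(1)}            # mass on positions not yet absorbed
    probs = [Fraction(0)] * (nmax + 1)
    for n in range(1, nmax + 1):
        new = {}
        for pos, p in dist.items():
            for step in (2, -1):
                q = pos + step
                if q == 0:
                    probs[n] += p / 2      # absorbed at 0 at time n
                else:
                    new[q] = new.get(q, Fraction(0)) + p / 2
        dist = new
    return probs

N = 15
phi = hit_probs(1, N)    # coefficients of phi(s)
gamma = hit_probs(3, N)  # coefficients of gamma(s): start at 3, hit 0

# cube of phi's series, truncated at degree N
cube = [Fraction(0)] * (N + 1)
for i, a in enumerate(phi):
    for j, b in enumerate(phi):
        for k, c in enumerate(phi):
            if i + j + k <= N:
                cube[i + j + k] += a * b * c

assert cube == gamma   # gamma agrees with phi^3 through degree N
```

The first few coefficients come out as $\varphi(s) = \tfrac12 s + \tfrac1{16}s^4 + \tfrac3{128}s^7 + \cdots$, consistent with the recursion below.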

The result then comes from "first step analysis": with probability $1/2$ the first step is $-1$ and the walk ends in one step, and with probability $1/2$ the first step is $+2$, after which the walk makes a fresh start at state $3$, whose remaining time to hit $0$ necessarily has generating function $\gamma(s)$.

So, for $s\in[0,1]$,
$$\varphi(s) = \mathbb E[s^{H_0}] = \mathbb E\big[\mathbb E[s^{H_0}\mid X_1]\big] = \frac{1}{2}\cdot s\cdot 1 + \frac{1}{2}\cdot s \cdot \gamma(s) = \frac{s}{2} + \frac{s}{2}\varphi(s)^3.$$

This rearranges to
$$s\varphi(s)^3 - 2\varphi(s) + s = 0.$$
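As for picking out the right root of $(1)$ without wrestling with the closed form from the question: a numeric sketch (assuming Python; `phi_fixed_point` is a hypothetical helper, not from the source). For $s\in[0,1]$, iterating the first-step relation $\varphi \mapsto \tfrac s2(1+\varphi^3)$ from $0$ increases monotonically to the smallest nonnegative root of $(1)$, and one can argue, as for branching-process extinction probabilities, that this smallest root is $\varphi(s)$, since a generating function must take values in $[0,1]$.

```python
# Sanity check (not the derivation): the fixed-point iteration converges to a
# root of s*phi^3 - 2*phi + s = 0 lying in [0,1], and at s = 1 it recovers
# phi(1) = (sqrt(5)-1)/2 < 1, so H_0 is defective (the walk drifts to +infinity).
import math

def phi_fixed_point(s, iters=200):
    """Smallest nonnegative root of s*phi^3 - 2*phi + s = 0, by iteration."""
    phi = 0.0
    for _ in range(iters):
        phi = s / 2 + (s / 2) * phi ** 3
    return phi

for s in (0.3, 0.5, 0.7, 1.0):
    p = phi_fixed_point(s)
    assert 0.0 <= p <= 1.0
    assert abs(s * p ** 3 - 2 * p + s) < 1e-12   # satisfies (1)

assert abs(phi_fixed_point(1.0) - (math.sqrt(5) - 1) / 2) < 1e-9
```

Note that for $s\in(0,1)$ the cubic actually has other real roots outside $[0,1]$ (e.g. near $1.86$ at $s=\tfrac12$); only the smallest one is a candidate for a generating function.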

user8675309
  • 12,193