
Let $\{X_n\}_{n=1}^\infty$ be a real-valued stationary stochastic process, and let $\{W_n\}_{n=1}^\infty$ be a binary-valued stochastic process, where $W_n \in \{0, 1\}$. We call $W_n$ the event selection process, as $W_n = 1$ marks the occurrence of events.

Define $K_i$ as the $i$-th time when $W_n = 1$, i.e., $K_i$ is the time when the $i$-th key event occurs: $$ K_i = n \iff W_1+W_2+\cdots+W_n = i \ \text{ and } \ W_n=1 $$ The pair $(\{K_i\},\{X_n\})$ is then an example of a marked point process.

Let $Y_i = X_{K_i}$, i.e., $Y_i$ is the value of the process $\{X_n\}$ at the time when the $i$-th event occurs.
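
For concreteness, here is a minimal Python sketch of this construction; the AR(1) dynamics for $X_n$ and the event rule $W_n=\mathbb{I}_{X_n>0}$ are illustrative assumptions of mine, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative jointly stationary pair: X_n is a Gaussian AR(1) process
# started from its stationary distribution, and W_n = 1{X_n > 0}.
n_steps = 10_000
phi = 0.5
x = np.empty(n_steps)
x[0] = rng.normal(scale=np.sqrt(1.0 / (1.0 - phi**2)))  # stationary start
for n in range(1, n_steps):
    x[n] = phi * x[n - 1] + rng.normal()
w = (x > 0).astype(int)

# K_i = time of the i-th event (1-based), Y_i = X_{K_i}.
K = np.flatnonzero(w) + 1
Y = x[K - 1]
print(K[:5], Y[:5])
```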

Problem Statement: Show that the sequence $\{Y_i\}_{i=1}^\infty$ converges in distribution as $i \rightarrow \infty$ given that $\{X_n\}$ and $\{W_n\}$ are jointly stationary.

Eventually I'm looking for conditions (imposed on $X_n$, $W_n$ and $K_i$) that imply that the sequence $\{Y_i\}_{i=1}^\infty$ is ergodic, i.e. that $$\lim_{N\rightarrow\infty}\frac{1}{N}\sum_{i=1}^N g(Y_i) = \mathbb{E}[g(Y_\infty)]$$ where $Y_\infty$ is a random variable to which the $Y_i$ converge and $g$ is some nice enough function. But the convergence of $Y_i$ and the existence of $Y_\infty$ are needed first.

My attempts at a solution: I've shown the following less general results:

  • If $\{W_n\}$ and $\{X_n\}$ are independent and $\{X_n\}$ is stationary, then the $Y_i$ are identically distributed.
  • If the $X_n$ are i.i.d. and $W_n$ depends on them through $W_n=\mathbb{I}_{X_n\in B}$, where $\mathbb{I}$ is the indicator function and $B$ is an arbitrary measurable set, then the $Y_i$ are identically distributed.

Obviously, if the $Y_i$ are identically distributed then they converge in distribution.
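
As a quick Monte Carlo sanity check of the second bullet, here is a sketch with the illustrative choices $X_n \sim N(0,1)$ and $B=[0,\infty)$ (mine, not fixed by the problem); the empirical means of $Y_1$ and $Y_5$ should both estimate $E[X \mid X \geq 0] = \sqrt{2/\pi} \approx 0.7979$:

```python
import numpy as np

rng = np.random.default_rng(1)

# X_n i.i.d. N(0,1), B = [0, inf), W_n = 1{X_n in B}.
# Compare the empirical laws of Y_1 and Y_5 across independent runs.
n_runs, horizon, target = 20_000, 200, 5
y1, y5 = [], []
for _ in range(n_runs):
    x = rng.normal(size=horizon)
    times = np.flatnonzero(x >= 0)        # event times K_1, K_2, ...
    if len(times) >= target:              # fails with negligible probability
        y1.append(x[times[0]])
        y5.append(x[times[target - 1]])
print(np.mean(y1), np.mean(y5))           # both ≈ sqrt(2/pi) ≈ 0.7979
```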

My approach to tackling the more general case involves looking at the probability $$P(Y_i\in A) = \sum_{n=1}^\infty P(X_n\in A, K_i=n)\\ = \sum_{n=1}^\infty P(X_n\in A,\ W_1+\cdots+W_{n-1}=i-1,\ W_n=1)$$ and applying some time shift (warranted by the joint stationarity of $X_n$ and $W_n$) to form some kind of recurrence relation. With this recurrence I would then be able to show that $P(Y_i\in A)$ converges as $i\rightarrow\infty$. However, I have trouble constructing such a recurrence and find myself quite stuck.

2 Answers


Here is a counter-example showing that $Y_i$ need not converge in distribution.

Form $\{(X_i,W_i)\}_{i=1}^{\infty}$ according to a 3-state discrete-time Markov chain whose states are traversed cyclically with period 3: $$S=\{(4,0), (3,1), (2,1)\}$$ $$ (4,0)\overset{1}{\rightarrow} (3,1)\overset{1}{\rightarrow} (2,1)\overset{1}{\rightarrow} (4,0)$$ where the above transitions occur with probability 1. Draw the initial state $(X_1,W_1)$ uniformly over the 3 states of $S$. Then $\{(X_i,W_i)\}_{i=1}^{\infty}$ is a stationary process that is also 3-periodic. The $\{Y_i\}$ process alternates between the values 3 and 2. We have $$\{Y_1=3\}\iff \{(X_1,W_1)\in\{(4,0), (3,1)\}\}$$ So

  • With prob $2/3$: $\{Y_i\}_{i=1}^{\infty}=\{3, 2, 3, 2, 3, 2, 3, 2, ...\}$

  • With prob $1/3$: $\{Y_i\}_{i=1}^{\infty}=\{2, 3, 2, 3, 2, 3, 2, 3, ...\}$

Thus for all $k \in \{1, 2, 3, ...\}$ we have \begin{align} &P[Y_{2k}=2]=2/3\\ &P[Y_{2k-1}=2]=1/3 \end{align} So $Y_i$ does not converge in distribution.
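
A quick simulation of this chain (a sketch using nothing beyond the construction above) confirms the two probabilities:

```python
import numpy as np

rng = np.random.default_rng(2)

# The 3-periodic chain: states cycle (4,0) -> (3,1) -> (2,1) -> (4,0).
states = [(4, 0), (3, 1), (2, 1)]
n_runs, horizon = 30_000, 12
y2_eq_2 = 0
for _ in range(n_runs):
    start = rng.integers(3)                        # uniform initial state
    chain = [states[(start + t) % 3] for t in range(horizon)]
    y = [xv for (xv, wv) in chain if wv == 1]      # Y_1, Y_2, ... in order
    if y[1] == 2:                                  # event {Y_2 = 2}
        y2_eq_2 += 1
print(y2_eq_2 / n_runs)   # ≈ 2/3; similarly P[Y_1 = 2] ≈ 1/3
```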

Michael

Regarding time averages, suppose:

  • $\{(X_i,W_i)\}_{i=1}^{\infty}$ are identically distributed random vectors in $\mathbb{R}^2$
  • $\{W_i\}$ are binary valued
  • $E[W_1]>0$
  • $g:\mathbb{R}\rightarrow\mathbb{R}$ is a bounded and measurable function

As in the posted question, for each positive integer $k$ define $Y_k$ as the value of $X_i$ for the index $i$ that is the $k$th time that $W_i=1$ (define $Y_k=0$ if there is no $k$th time that $W_i=1$).

Claim: If there are constants $q,r$ such that with probability 1 we have \begin{align} &\lim_{n\rightarrow\infty} \frac{1}{n}\sum_{i=1}^n W_i = q\\ &\lim_{n\rightarrow\infty} \frac{1}{n}\sum_{i=1}^nW_ig(X_i)=r \end{align} then $q=E[W_1]$, $r=E[W_1g(X_1)]$, and with probability 1 $$\lim_{k\rightarrow\infty} \frac{1}{k}\sum_{i=1}^kg(Y_i)=\frac{E[W_1g(X_1)]}{E[W_1]}$$

Proof: Since $\{W_i\}$ and $\{W_ig(X_i)\}$ are bounded random processes we have from the bounded convergence theorem \begin{align} q &= E\left[\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=1}^nW_i\right]=\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=1}^nE[W_i]=E[W_1]\\ r&= E\left[\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=1}^nW_ig(X_i)\right]=\lim_{n\rightarrow\infty}\frac{1}{n}\sum_{i=1}^nE[W_ig(X_i)]=E[W_1g(X_1)] \end{align} where the above uses the fact that $\{W_i\}$ are identically distributed so $E[W_i]=E[W_1]$ for all $i$, and $\{W_ig(X_i)\}$ are identically distributed so $E[W_ig(X_i)]=E[W_1g(X_1)]$ for all $i$.

We know $q=E[W_1]>0$, so (since we have assumed the time average of $W_i$ converges to $q$) we have $W_i=1$ infinitely often with probability 1. Then with probability 1: \begin{align} \frac{r}{q} &= \lim_{n\rightarrow\infty}\frac{\frac{1}{n}\sum_{i=1}^nW_ig(X_i)}{\frac{1}{n}\sum_{i=1}^nW_i}\\ &= \lim_{n\rightarrow\infty} \frac{\sum_{i=1}^nW_ig(X_i)}{\sum_{i=1}^nW_i}\\ &=\lim_{k\rightarrow\infty}\frac{\sum_{i=1}^kg(Y_i)}{k} \end{align} where the last equality takes the limit along the subsequence $n=K_k$ of event times, for which $\sum_{i=1}^{n}W_i=k$ and $\sum_{i=1}^{n}W_ig(X_i)=\sum_{i=1}^{k}g(Y_i)$. $\Box$
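
Here is a numerical sketch of the claim under illustrative assumptions of my own choosing: i.i.d. $X_i \sim N(1,1)$, $W_i=\mathbb{I}_{X_i>0}$, and $g(x)=x$, so that $q$ and $r$ exist by the strong law of large numbers:

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# i.i.d. X_i ~ N(1,1), W_i = 1{X_i > 0}, g(x) = x.  By the strong law,
# q = P(X > 0) and r = E[X; X > 0] exist, so the claim applies.
n = 1_000_000
x = rng.normal(loc=1.0, size=n)
w = x > 0
y = x[w]                                          # Y_1, ..., Y_k in order

phi = math.exp(-0.5) / math.sqrt(2 * math.pi)     # standard normal pdf at 1
Phi = 0.5 * (1 + math.erf(1 / math.sqrt(2)))      # standard normal cdf at 1
analytic = (Phi + phi) / Phi                      # E[W g(X)] / E[W]
print(y.mean(), analytic)                         # both ≈ 1.2876
```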

A counter-example where the conclusion fails even though all bullet assumptions hold (because the limiting constant $r$ does not exist) is when $X_1 \sim N(0,1)$ and $\{W_i\}$ are i.i.d. $\mathrm{Bern}(1/2)$ and independent of $X_1$. Define $X_i=X_1$ for all $i$. Then $\{(X_i,W_i)\}_{i=1}^{\infty}$ is stationary (hence identically distributed) but (using $g(x)=x$ for all $x\in\mathbb{R}$) we have $Y_i=X_1$ for all $i$ and so $\frac{1}{k}\sum_{i=1}^kY_i\rightarrow X_1$ with probability 1 (and $X_1$ is not a constant).
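
A minimal sketch of this degenerate example, showing that the time average is a different (random) limit on each run:

```python
import numpy as np

rng = np.random.default_rng(4)

# X_i = X_1 for all i, W_i i.i.d. Bern(1/2) independent of X_1.
# Each run's time average (1/k) sum Y_i converges to that run's own X_1.
for run in range(3):
    x1 = rng.normal()
    w = rng.integers(2, size=10_000)      # i.i.d. Bern(1/2)
    y = np.full(int(w.sum()), x1)         # Y_i = X_1 at every event time
    print(run, x1, y.mean())              # time average equals X_1
```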

Michael