Problem: Show that the set of stationary measures for a transition matrix forms a polyhedron with one vertex for each essential communicating class. (Source: [1])
My trouble in one sentence: I couldn't find the vertices of the polyhedron.
Glossary:
stationary measure: A stationary measure of a transition matrix $P$ is any measure $\mu$ such that $\mu P = \mu$.
polyhedron: A polyhedron $S$ in $\mathbb{R}^n$ is defined as the solution set of a finite number of linear equalities and inequalities, and thus the intersection of a finite number of halfspaces and hyperplanes, or compactly $S=\{x : Ax \preceq b ,\; Cx = d\}$ with $A \in M_{m\times n}(\mathbb{R})$, $C \in M_{p\times n}(\mathbb{R})$, $x \in \mathbb{R}^n$, $b \in \mathbb{R}^m$, $d \in \mathbb{R}^p$. A point $v \in S$ is a vertex of the polyhedron if there exists a supporting hyperplane of $S$ that intersects $S$ only at $v$.
essential communicating class: For $x, y$ in the state space, we write $x \to y$ iff $(\exists r > 0)(P^r(x,y) > 0)$, where $P^r(x,y)$ is the $r$-step transition probability from state $x$ to $y$. We write $x \leftrightarrow y$ iff $x \to y$ and $y \to x$. The equivalence classes under $\leftrightarrow$ are called communicating classes. A state $x$ is essential iff for all $y$ such that $x \to y$ we also have $y \to x$. A communicating class is essential iff it contains at least one essential state (in which case all of its states are essential).
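The reachability definitions above can be sketched directly in code. The following is a minimal illustration (the 4-state matrix and the helper names are my own, not taken from any reference): it classifies states of a finite chain using the relation $x \to y$, allowing $r = 0$ so that $\leftrightarrow$ is reflexive.

```python
import numpy as np

def reachability(P):
    # R[x, y] is True iff x -> y, allowing r = 0 so that <-> is reflexive.
    n = len(P)
    R = np.eye(n, dtype=bool)
    Pr = np.eye(n)
    for _ in range(n):          # r = 1, ..., n suffices on n states
        Pr = Pr @ P
        R |= Pr > 0
    return R

def essential_classes(P):
    n = len(P)
    R = reachability(P)
    # x is essential iff every y reachable from x reaches x back.
    essential = [all(R[y, x] for y in range(n) if R[x, y]) for x in range(n)]
    # The communicating class of x is the set of states that both reach
    # and are reached by x; collect the classes of essential states.
    classes = {tuple(R[x] & R[:, x]) for x in range(n) if essential[x]}
    return [set(map(int, np.flatnonzero(c))) for c in classes]

# Hypothetical example: {0} is inessential, {1} and {2, 3} are essential.
P = np.array([
    [0.2, 0.3, 0.25, 0.25],
    [0.0, 1.0, 0.0,  0.0 ],
    [0.0, 0.0, 0.4,  0.6 ],
    [0.0, 0.0, 0.7,  0.3 ],
])
print(sorted(essential_classes(P), key=min))   # [{1}, {2, 3}]
```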
Attempt:
Suppose the state space is $\mathcal{X}$ with $|\mathcal{X}| = n < \infty$, and suppose there are $m$ essential communicating classes $C_i$, $i = 1, \dots, m$. Let $A := \mathcal{X} \setminus \bigcup_{i=1}^m C_i$. After relabeling the states if necessary (a simultaneous permutation of rows and columns), $P$ takes the form $ \begin{bmatrix} P_{AA} & P_{AC_1} & \cdots & P_{AC_m} \\ \mathbf{0} & P_{\vert C_1} & \cdots & \mathbf{0} \\ \vdots & \vdots & \ddots & \vdots \\ \mathbf{0} & \mathbf{0} & \cdots & P_{\vert C_m} \end{bmatrix} $, where $P_{AC_i}$ describes the transitions from states in $A$ to states in $C_i$, and each restricted matrix $P_{\vert C_i}$ is stochastic, irreducible, and, by [5], recurrent.
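As a concrete sanity check of this block form, here is a hypothetical 4-state example (my own construction, not from the references) with one inessential state and two essential classes:

```python
import numpy as np

# A = {0} inessential; essential classes C_1 = {1} and C_2 = {2, 3}.
P = np.array([
    [0.2, 0.3, 0.25, 0.25],  # row of P_{AA}, P_{AC_1}, P_{AC_2}
    [0.0, 1.0, 0.0,  0.0 ],  # P|_{C_1} = [1] (absorbing class)
    [0.0, 0.0, 0.4,  0.6 ],  # P|_{C_2} is stochastic and irreducible
    [0.0, 0.0, 0.7,  0.3 ],
])

assert np.allclose(P.sum(axis=1), 1.0)  # P is stochastic
assert np.allclose(P[1:, 0], 0.0)       # essential classes never re-enter A
assert np.allclose(P[2:, 1], 0.0)       # ... nor reach another essential class
```

Each diagonal block's rows sum to $1$ on their own, so the $P_{\vert C_i}$ are stochastic matrices in their own right.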
The set of stationary measures of $P$ is $S = \{s^T : s \in S'\}$, where $S' = \{ s : s \succeq \mathbf{0} ,\; (P^T-I)s = \mathbf{0} \}$ is a polyhedron in $\mathbb{R}^n$.
Then any stationary measure $s^T$ of $P$ has the form $s^T = ( \mathbf{0} , o_1\gamma^{k\in C_1} , \dots , o_m\gamma^{k\in C_m} )$, where the $o_i \in [0,\infty)$ are constants and $\gamma^k$ is as defined in [2].
To see this, let $s^T$ be any stationary measure of $P$. By [3], $s^T$ vanishes on $A$. By definition, for $x \in C_i$, $$ s^T(x) = \sum_{y \in \mathcal{X}} s^T(y) P(y,x) = \sum_{y \in C_i } s^T(y) P(y,x) = \sum_{y \in C_i } s^T(y) P_{|C_i}(y,x) , $$ so the row vector $s^T(C_i) = (s^T(x) : x\in C_i)$ is a stationary measure for $P_{|C_i}$. If $s^T(C_i) = \mathbf{0}$, take $o_i = 0$. Otherwise $s^T(k) > 0$ (a nonzero invariant measure of an irreducible chain is strictly positive), so normalizing by $s^T(k)$ and applying [4] gives $s^T(C_i) = o_i \gamma^{k \in C_i}$ with $o_i = s^T(k) \in (0,\infty)$.
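One way to check this decomposition numerically: the solutions of $s^T P = s^T$ are exactly the left eigenvectors of $P$ for eigenvalue $1$. A sketch on a hypothetical 4-state chain (my own example) with $A = \{0\}$, $C_1 = \{1\}$, $C_2 = \{2,3\}$:

```python
import numpy as np

P = np.array([
    [0.2, 0.3, 0.25, 0.25],
    [0.0, 1.0, 0.0,  0.0 ],
    [0.0, 0.0, 0.4,  0.6 ],
    [0.0, 0.0, 0.7,  0.3 ],
])

# Left eigenvectors of P for eigenvalue 1 = eigenvectors of P^T.
w, V = np.linalg.eig(P.T)
basis = V[:, np.isclose(w, 1.0)].real

# One independent solution per essential class:
assert basis.shape[1] == 2

for v in basis.T:
    # Every solution vanishes on the inessential state 0.
    assert np.isclose(v[0], 0.0)
    # Its restriction to C_2 is a multiple of the stationary measure of
    # P|_{C_2}, which is proportional to (7, 6).
    assert np.isclose(6 * v[2] - 7 * v[3], 0.0)
```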
But I couldn't specify, for each $C_i$, a hyperplane and an element of $S'$ satisfying the vertex definition above.
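In case it helps to experiment: assuming the exercise intends the normalized set (stationary distributions, i.e. adding $\sum_x s(x) = 1$; without this constraint $S'$ is a cone), a natural candidate vertex for each $C_i$ is the stationary distribution supported on $C_i$ alone, with candidate supporting hyperplane $\{x : \sum_{k \in C_i} x_k = 1\}$. A numerical sketch on a hypothetical 4-state example (my own, with $A=\{0\}$, $C_1=\{1\}$, $C_2=\{2,3\}$):

```python
import numpy as np

pi_1 = np.array([0.0, 1.0, 0.0, 0.0])        # supported on C_1 = {1}
pi_2 = np.array([0.0, 0.0, 7 / 13, 6 / 13])  # supported on C_2 = {2, 3}

# Candidate supporting hyperplane at pi_1: {x : f(x) = 1} with
# f(x) = sum of x over C_1.
f = lambda x: x[1]

# Every stationary distribution of this chain is a convex combination
# s = o * pi_1 + (1 - o) * pi_2 with o in [0, 1]; f(s) = o, so the
# hyperplane meets the set only at o = 1, i.e. only at pi_1 itself.
rng = np.random.default_rng(0)
for o in rng.uniform(0.0, 1.0, size=100):    # o < 1 strictly
    s = o * pi_1 + (1 - o) * pi_2
    assert f(s) < 1.0
assert f(pi_1) == 1.0
```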
References:
[1] D. A. Levin and Y. Peres. Markov Chains and Mixing Times, p. 18, exercise 1.14.
[2] J. R. Norris. Markov Chains, theorem 1.7.5.
For a Markov chain $(X_n)_{n\ge 0}$ with state space $I$ and a fixed state $k$, consider for each $i$ the expected time spent in $i$ between visits to $k$: $$ \gamma_i^k = \mathbb{E}_k \sum_{n=0}^{T_k-1} 1_{\{X_n = i\}} $$ Here the sum of indicator functions counts the number of times $n$ at which $X_n = i$ before the first passage time $T_k := \inf\{ n\ge 1 : X_n = k\}$.
If, in addition, $P$ is irreducible and recurrent, then
(i) $\gamma_{k}^k = 1$;
(ii) $\gamma^k = (\gamma_i^k : i \in I)$ satisfies $\gamma^k P = \gamma^k$;
(iii) $0 < \gamma_i^k < \infty$ for all $i \in I$.
Remark: in Markov Chains and Mixing Times by Levin and Peres, p. 11, $\gamma^k_i = \tilde{\pi}(i)$ with $z = k$.
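A quick Monte Carlo sketch of this definition of $\gamma^k$ (the two-state matrix and the estimator are my own illustration, not from Norris), using a hypothetical irreducible block in the role of some $P_{\vert C_i}$:

```python
import numpy as np

# Hypothetical irreducible two-state chain; reference state k = 0.
Q = np.array([[0.4, 0.6],
              [0.7, 0.3]])

rng = np.random.default_rng(1)

def gamma_estimate(Q, k, trials):
    # Estimate gamma^k: expected number of visits to each state during
    # one excursion from k, i.e. before T_k = inf{n >= 1 : X_n = k}.
    counts = np.zeros(len(Q))
    for _ in range(trials):
        x = k
        while True:
            counts[x] += 1
            x = rng.choice(len(Q), p=Q[x])
            if x == k:
                break
    return counts / trials

gamma = gamma_estimate(Q, k=0, trials=20000)
# By (i), gamma_0^0 = 1 exactly; here gamma_1^0 = 0.6 / 0.7 = 6/7, and
# gamma = (1, 6/7) is proportional to the stationary measure (7, 6) of Q.
```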
[3] D. A. Levin and Y. Peres. Markov Chains and Mixing Times, p. 16, proposition 1.28.
If $\pi$ is stationary for the finite transition matrix $P$, then $\pi(y_0) = 0$ for all inessential states $y_0$.
[4] J. R. Norris. Markov Chains, p. 36, theorem 1.7.6.
Let $P$ be irreducible and let $\lambda$ be an invariant measure for $P$ with $\lambda_k = 1$. Then $\lambda \ge \gamma^k$. If, in addition, $P$ is recurrent, then $\lambda = \gamma^k$.
[5] D. A. Levin and Y. Peres. Markov Chains and Mixing Times, p. 15, remark 1.24.
For finite chains, a state $x$ is essential iff it is recurrent.