
Suppose we have a finite-state, discrete-time, homogeneous Markov chain. If the chain is irreducible and aperiodic, then it is ergodic. Furthermore, we know that a regular Markov chain is always ergodic, but the converse is not necessarily true. Does anyone have an example of this, i.e. a Markov matrix that is irreducible and aperiodic but not regular?
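
For concreteness, by "regular" I mean that some power of the transition matrix has every entry strictly positive. A quick numerical check along these lines (a rough sketch of my own; the example matrices and the power cutoff are arbitrary choices, not taken from any source) is:

```python
# Rough sketch (illustrative only): a stochastic matrix is regular iff
# some power of it has every entry strictly positive.
import numpy as np

def first_positive_power(P, max_power=50):
    """Smallest n <= max_power with all entries of P^n positive, else None."""
    Q = np.eye(P.shape[0])
    for n in range(1, max_power + 1):
        Q = Q @ P
        if np.all(Q > 0):
            return n
    return None

# Irreducible but periodic (period 2): no power is strictly positive, so not regular.
cycle = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

# Irreducible and aperiodic (the self-loop at state 0 breaks the period): regular.
lazy = np.array([[0.5, 0.5],
                 [1.0, 0.0]])

print(first_positive_power(cycle))  # None
print(first_positive_power(lazy))   # 2
```

Every irreducible, aperiodic example I try this on eventually has a strictly positive power, hence the question.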

snoopy
  • But an ergodic Markov matrix is not necessarily regular, so the iff in the above post doesn't follow. – snoopy Nov 20 '24 at 19:04
  • You should define these terms and give sources for your statements. If an $n\times n$ stochastic matrix $P$ is irreducible and aperiodic, then $P^k$ tends to a limit which is a positive matrix, hence $P^k \gt \mathbf 0$ for all $k$ large enough, by the definition of a limit. I am guessing that meets your definition of regular. – user8675309 Nov 20 '24 at 20:29
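
A quick numerical illustration of the limit claim in the last comment (my own sketch; the $3$-state chain below is an arbitrary irreducible, aperiodic example, not one mentioned in the thread):

```python
# P^k converges to a matrix whose identical rows are the stationary distribution,
# which is strictly positive, so P^k is entrywise positive for all large k.
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5]])   # irreducible, aperiodic (self-loops), doubly stochastic

for k in (1, 2, 5, 20, 100):
    print(k, np.linalg.matrix_power(P, k).round(4), sep="\n")
# The powers approach the constant matrix with every entry 1/3.
```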

1 Answer


If the Markov chain is irreducible, it is possible (with nonzero probability) to get from any state $i$ to any other state $j$. If it is aperiodic, there is some state $k$ such that the GCD of the set of positive integers $n$ for which it is possible to get from $k$ to $k$ in $n$ steps is $1$. Since that set is closed under addition and has GCD $1$, it contains all sufficiently large integers: there is some $N$ such that it is possible to get from $k$ to $k$ in $n$ steps for every $n \ge N$. Now if $M$ is the number of states, for any $i$ and $j$ it is possible to get from $i$ to $k$ in some number $a \le M$ of steps and from $k$ to $j$ in some number $b \le M$ of steps, so it is possible to get from $i$ to $j$ in $a + n + b$ steps for every $n \ge N$; in particular, in any number of steps $\ge 2M+N$. That is, if $T$ is the transition matrix, then $T^n$ has all entries positive whenever $n \ge 2M+N$, so the Markov chain is regular.
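
A small numerical check of this argument (a sketch of my own, not part of the answer): the $4$-state chain below is irreducible, has no self-loops, and is aperiodic because its two cycles through state $0$ have coprime lengths $3$ and $4$. Here $M = 4$, and one can take $N = 6$ since a return to state $0$ is possible at every length $\ge 6$, so the argument guarantees $T^n \gt \mathbf 0$ entrywise for every $n \ge 2M+N = 14$.

```python
# Verify the bound on one concrete chain (an arbitrary example of my own).
import numpy as np

T = np.array([[0.0, 0.5, 0.5, 0.0],   # 0 -> 1 or 0 -> 2
              [0.0, 0.0, 1.0, 0.0],   # 1 -> 2
              [0.0, 0.0, 0.0, 1.0],   # 2 -> 3
              [1.0, 0.0, 0.0, 0.0]])  # 3 -> 0

M = T.shape[0]        # number of states
N = 6                 # returns 0 -> 0 exist at every length >= 6 (gcd(3, 4) = 1)
bound = 2 * M + N     # the answer guarantees T^n > 0 entrywise for all n >= 2M + N

first = next(n for n in range(1, bound + 1)
             if np.all(np.linalg.matrix_power(T, n) > 0))
print(f"first strictly positive power: {first}; guaranteed bound 2M+N = {bound}")
```

The printed value is the first strictly positive power of $T$, which by the argument above is at most the crude bound $2M+N$.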

Robert Israel