I'm currently trying to understand, intuitively, what a stationary distribution of a Markov Chain is. In our lecture notes, we're given the following definition:
This was of little benefit to my understanding, so I've tried searching online for a more useful explanation. I then found the following video, which improved my understanding somewhat: I now see that stationary distributions have to do with what happens to the probability of being in each state of a Markov Chain as time becomes infinitely large. This is still not a sufficient understanding of the concept, though.
For example, I've been asked to show that $$ \pi_{a} = \left( \frac{2}{5}, \frac{3}{5}, 0, 0, 0 \right) \\ \pi_{b} = \left( 0, 0, 1, 0, 0 \right) \\ \pi_{c} = \left( 0, 0, 0, \frac{3}{5}, \frac{2}{5} \right) $$ are stationary distributions with respect to the Markov Chain with one-step transition matrix $$ \mathbf{P} = \left( \begin{array}{ccccc} \frac{1}{2} & \frac{1}{2} & 0 & 0 & 0 \\ \frac{1}{3} & \frac{2}{3} & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & \frac{2}{3} & \frac{1}{3} \\ 0 & 0 & 0 & \frac{1}{2} & \frac{1}{2} \end{array} \right) $$
How would you do this? What is a stationary distribution, with respect to this example?
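In case it's useful context: my current (possibly incomplete) reading of the definition is that a row vector $\pi$ is stationary if and only if $\pi \mathbf{P} = \pi$ and its entries sum to $1$. Here is a quick numerical check of that condition for the three vectors above, using NumPy purely for convenience:

```python
import numpy as np

# One-step transition matrix P from the question
P = np.array([
    [1/2, 1/2, 0,   0,   0],
    [1/3, 2/3, 0,   0,   0],
    [0,   0,   1,   0,   0],
    [0,   0,   0,   2/3, 1/3],
    [0,   0,   0,   1/2, 1/2],
])

# The three candidate stationary distributions
pi_a = np.array([2/5, 3/5, 0, 0, 0])
pi_b = np.array([0,   0,   1, 0, 0])
pi_c = np.array([0,   0,   0, 3/5, 2/5])

# Check pi @ P == pi (left multiplication by the row vector)
# and that the entries sum to 1.
for name, pi in [("pi_a", pi_a), ("pi_b", pi_b), ("pi_c", pi_c)]:
    is_fixed = np.allclose(pi @ P, pi)
    sums_to_one = np.isclose(pi.sum(), 1.0)
    print(name, "stationary:", is_fixed and sums_to_one)
```

All three print `True`, which matches what I'm being asked to show; I'd still like to understand *why* this fixed-point condition captures "stationarity", though.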
Also, could someone please confirm that I'm correct in thinking that the notation $p_{ij}$ denotes the probability of the process moving from the state $i$ to the state $j$?
