
Here's a proof that I found which looks pretty simple, but I can't understand the last step. (A Markov matrix is a square matrix whose columns each sum to one; $I$ is the identity matrix; $M^T$ and $I^T$ denote the transposes of $M$ and $I$.)

[image of the proof]

Vanessa

2 Answers


The determinants of a square matrix and its transpose are identical. Since $\det(M^T-\lambda I)=\det((M-\lambda I)^T)=\det(M-\lambda I)$, a matrix and its transpose have the same characteristic polynomial, and hence the same eigenvalues. When you left-multiply a matrix by a row vector, the result is a linear combination of the matrix's rows; in particular, left-multiplying by the all-$1$s row vector sums the rows. Each column of a Markov matrix sums to $1$, therefore $\mathbf 1^TM = \mathbf 1^T$. Transposing gives $M^T\mathbf 1 = \mathbf 1$, so $\mathbf 1$ is an eigenvector of $M^T$ with eigenvalue $1$. Therefore $1$ is also an eigenvalue of $M$, and by definition there must exist some non-zero vector $\mathbf x$ such that $M\mathbf x=\mathbf x$.
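As a quick numerical sanity check of the argument above (the $3\times 3$ Markov matrix here is made up for illustration):

```python
import numpy as np

# A hypothetical 3x3 Markov matrix: each column sums to 1.
M = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

ones = np.ones(3)

# Left-multiplying by the all-ones row vector sums the rows,
# i.e. computes each column sum: 1^T M = 1^T.
print(np.allclose(ones @ M, ones))            # True

# Hence 1 is an eigenvalue of M^T, and therefore of M as well.
print(np.isclose(np.linalg.eigvals(M), 1).any())  # True
```

Any matrix with non-negative entries and unit column sums would behave the same way.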

amd

I am not sure about the notation, as you only made part of the proof available, and the notation is not defined there. In fact, I probably use the reverse convention. So, long story short, this might not be a good answer, and there is no way to tell until you make the problem clear.

But if by Markov matrix you mean a non-negative square matrix whose columns (?) add up to $1$, then the reason $M^T-I$ has a non-trivial kernel (equivalently, $1$ is an eigenvalue of $M^T$) is that the all-$1$s vector is trivially an example of a vector in that kernel.
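Concretely, one can verify that $(M^T-I)\mathbf 1 = \mathbf 0$ for any such matrix (the $2\times 2$ example below is made up):

```python
import numpy as np

# A hypothetical 2x2 Markov matrix (columns sum to 1).
M = np.array([[0.9, 0.4],
              [0.1, 0.6]])

ones = np.ones(2)

# (M^T - I) sends the all-ones vector to zero, so M^T - I is
# singular and 1 is an eigenvalue of M^T (hence of M).
print(np.allclose((M.T - np.eye(2)) @ ones, 0))  # True
```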

A. Pongrácz