
I need to prove that $\sum\limits_{k=0}^{\infty} x^{k} = \frac{1}{1-x}$ also holds for matrices, i.e. that the formula remains true when $x$ is a matrix. I honestly have no idea where to start, so any suggestions are welcome.

  • It is not true in general, just like the formula above is not true in general. The usual proof for matrices is not much different from the scalar proof. – copper.hat Jun 11 '19 at 19:46
  • So what you want to show is that the inverse of $I-A$ is $I+A+A^2+\cdots$. The usual way to show that something is the inverse of something else is to multiply them together and show you get the identity. – Paul Jun 11 '19 at 19:49
  • One usually doesn't write $\frac{1}{1 - A}$ for a matrix $A$; instead I'd write the r.h.s. as $(I - A)^{-1}$, where $I$ is the identity matrix. And of course $A^k$ is only defined for square $A$. – Travis Willse Jun 11 '19 at 19:50
  • $(I+A+A^2+\cdots+A^N)(I-A) = I-A^{N+1}.$ You can come up with conditions for $A^n \rightarrow 0$ or conditions for $(I-A)$ to have an inverse (which has been discussed already in the comments). – mjw Jun 11 '19 at 19:53
  • For the reference, you may look up the keyword Neumann series. – Sangchul Lee Jun 11 '19 at 20:56
  • Thanks for all these comments, you are definitely right. I should have done a better job in the question itself. – Mathbeginner Jun 12 '19 at 15:21

2 Answers


I'll give hints, which are similar to what the comments suggest.

To start, we need to suppose that $x$ is square and that $\|x\|<1.$ Set $S_n=I+x+x^2+\cdots+x^n.$ First, we claim that $S_n$ converges as $n\rightarrow\infty$; indeed, the series converges absolutely in norm (check why this is true). Call the limit $S$. We can see that $I-x$ and $S_n$ commute. What does their product give us, and what happens in the limit?
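If it helps to see the hint in action, here is a minimal NumPy sketch; the matrix $x$ below is an arbitrary example with $\|x\|<1$, not one from the question. The partial sums $S_n$ stabilize, and $(I-x)S_n$ approaches the identity.

```python
import numpy as np

# Arbitrary example matrix with ||x|| < 1 (spectral norm); not from the question.
x = np.array([[0.2, 0.1],
              [0.3, 0.4]])
print(np.linalg.norm(x, 2))        # about 0.54, so the hypothesis ||x|| < 1 holds

I = np.eye(2)
S_n, term = np.zeros_like(x), I.copy()
for n in range(60):                # after step n, S_n = I + x + x^2 + ... + x^n
    S_n += term
    term = term @ x

print(np.round((I - x) @ S_n, 8))              # essentially the identity matrix
print(np.allclose(S_n, np.linalg.inv(I - x)))  # True: the limit is (I - x)^{-1}
```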

– cmk

The statement you have written isn't necessarily true. In fact, you need the additional condition $\rho(A) < 1$ (where $\rho(A)$ is the spectral radius of $A$) for the series to converge. Here's the idea:

Let

$$S_k = \sum_{n=0}^k A^n$$

Then

$$S_k(I-A) = (I + A + \dots + A^k)(I-A) = I + A + \dots + A^k - (A + A^2 + \dots + A^{k+1}) = I - A^{k+1}$$

Similarly, one can see that $(I-A)S_k = I-A^{k+1}$. Now, since $\rho(A) < 1$, the matrix $I-A$ is invertible ($1$ is not an eigenvalue of $A$) and, moreover, $\lim_{k \to \infty} A^k= 0$. Taking the limit on both sides and defining $S := \sum_{n=0}^{\infty} A^n = \lim_{k \to \infty} S_k$, we have

$$S(I-A) = I$$

and also

$$(I-A)S = I$$

and therefore $S = (I-A)^{-1}$, which is what we wanted to show. I believe the converse also holds, i.e.,

$$\rho(A) < 1 \leftrightarrow \sum_{n=0}^{\infty} A^n \text{ converges and } \sum_{n=0}^{\infty} A^n = (I-A)^{-1}$$
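For a quick numerical sanity check of the argument above, here is a minimal NumPy sketch; the matrix $A$ below is an arbitrary example with $\rho(A) < 1$, not one taken from the question.

```python
import numpy as np

# Arbitrary example with spectral radius < 1.
A = np.array([[0.5, 0.2],
              [0.1, 0.3]])
rho = np.abs(np.linalg.eigvals(A)).max()
print(rho)                        # about 0.57, so rho(A) < 1

I = np.eye(2)
S, term = np.zeros_like(A), I.copy()
for k in range(200):              # S = S_k = sum_{n=0}^{k} A^n
    S += term
    term = term @ A

print(np.allclose(S @ (I - A), I))           # True
print(np.allclose((I - A) @ S, I))           # True
print(np.allclose(S, np.linalg.inv(I - A)))  # True: S is the inverse of I - A
```

Since $\rho(A)<1$, the powers $A^k$ decay geometrically, so truncating the sum at a couple hundred terms already agrees with $(I-A)^{-1}$ to machine precision.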

  • Is it true that $\rho(A)<1$ for stochastic matrices, so that this would hold for them? – Mathbeginner Jun 12 '19 at 16:36
  • @Mathbeginner I honestly don't know anything about stochastic matrices at the moment so I am unable to answer your question. Sorry. – carsandpulsars Jun 12 '19 at 18:01
  • @carsandpulsars No worries, I appreciate the effort. A stochastic matrix is simply a matrix whose rows (or columns) add up to unity, i.e., the entries in each row sum to 1. – Mathbeginner Jun 13 '19 at 13:34
  • @Mathbeginner From a bit of googling, it seems that the eigenvalue $1$ is always attained and that this is the maximum eigenvalue:

    https://math.stackexchange.com/questions/40320/proof-that-the-largest-eigenvalue-of-a-stochastic-matrix-is-1

    So the $\rho(A) < 1$ condition does not hold. As a very degenerate case, consider the $1\times 1$ stochastic matrix $A = [1]$; then the theorem clearly doesn't hold. In fact, the identity matrix of any dimension is a stochastic matrix, but the theorem still doesn't hold. A quick numerical check is sketched below the comments.

    – carsandpulsars Jun 17 '19 at 16:11
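Here is a minimal NumPy sketch of this last point; the stochastic matrix $P$ below is an arbitrary example, not one from the discussion. Its spectral radius is exactly $1$, $I-P$ is singular, and the partial sums of the Neumann series grow without bound.

```python
import numpy as np

# A (row-)stochastic matrix: nonnegative entries, each row summing to 1.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

print(np.abs(np.linalg.eigvals(P)).max())  # 1.0: the eigenvalue 1 is attained
print(np.linalg.det(np.eye(2) - P))        # ~0: I - P is singular, so it has no inverse

# The partial sums of the Neumann series do not settle down:
S, term = np.zeros_like(P), np.eye(2)
for k in range(1000):
    S += term
    term = term @ P
print(S)   # entries keep growing with the number of terms
```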