6

I have a certain matrix $$M:=\begin{bmatrix} 0&-2&0&0&0\\ 1&0&-4&0&0\\ 0&1&0&-6&0\\ 0&0&1&0&-8\\ 0&0&0&1&0\\ &&&&&\ddots\end{bmatrix}$$ for which I would like to find a closed form for $M^n$. (The reason is a bit convoluted, but I'm mostly concerned with the first column, and possibly other columns, of $M^n$: it encodes a certain sequence I'm studying.)

I realise that an 'infinite-dimensional matrix' doesn't make much sense on its own, though I note that the first column of $M^k$ is the same as the first column of $M_{k+1}^k$ (where the subscript in $M_k$ denotes truncation to a $k \times k$ matrix). Hence I'm essentially looking at the limit of $M_{n+1}^n$, if that makes sense. I haven't studied any functional analysis either; only linear algebra.

I haven't made much progress so far. I believe diagonalisation is off the table:

  • the eigenvalues diverge, judging from each truncation. All of them are purely imaginary, except for one real eigenvalue ($0$) in odd truncations;
  • the characteristic polynomials converge almost nowhere (the coefficients grow roughly like $(2x)!/x!$);
  • I haven't noticed any pattern among the eigenvectors from computing them. There may be one visible by inspection, but I haven't found it.

I suspect the divergence has to do with the entries of the matrix themselves being unbounded; I've seen other questions with similar 'infinite matrices' that can be diagonalised or have a convergent characteristic polynomial, and in those, the entries were bounded.

If $M$ itself isn't diagonalisable, there are some other infinite matrices I would like to treat similarly. If there is anything I should study, any other way I could encode these sequences, or anything else, do comment. Thanks in advance!

Habeeb M

1 Answer

9

I'm assuming that $1$ at the end is misplaced by one place to the right.

Infinite-dimensional matrices make perfect sense; they are (with some caveats) matrix representations of linear operators on infinite-dimensional vector spaces. In this particular case $M$ is the matrix of the differential operator $x - 2 \partial$ acting on the vector space of polynomials $K[x]$ ($K$ can be any field of characteristic $0$ here, say $K = \mathbb{Q}$), expressed with respect to the basis $\{ 1, x, x^2, \dots \}$. $x - 2 \partial$ is an element of the Weyl algebra $K[x, \partial]$.
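To make the identification concrete, here is a short dependency-free Python sketch (the truncation size `N` and the test polynomial are arbitrary choices) checking that the truncated matrix acts on coefficient vectors exactly as $x - 2\partial$ acts on polynomials:

```python
N = 8  # truncation size (basis {1, x, ..., x^(N-1)}); an arbitrary choice

# Truncated matrix of the operator x - 2*d/dx in the monomial basis:
# column j is the coefficient vector of (x - 2 d/dx) x^j = x^(j+1) - 2j x^(j-1).
M = [[0] * N for _ in range(N)]
for j in range(N):
    if j + 1 < N:
        M[j + 1][j] = 1          # from x * x^j = x^(j+1)
    if j >= 1:
        M[j - 1][j] = -2 * j     # from -2 * (x^j)' = -2j x^(j-1)

def apply_operator(c):
    """Apply x - 2*d/dx to a coefficient list c (c[i] = coeff of x^i)."""
    out = [0] * (len(c) + 1)
    for i, a in enumerate(c):
        out[i + 1] += a              # multiplication by x
        if i >= 1:
            out[i - 1] -= 2 * i * a  # -2 times the derivative
    return out

def matvec(A, v):
    return [sum(row[j] * v[j] for j in range(len(v))) for row in A]

p = [3, 5, 7]                        # 3 + 5x + 7x^2, an arbitrary test polynomial
direct = apply_operator(p)           # apply the operator symbolically
via_matrix = matvec(M, p + [0] * (N - len(p)))
assert direct + [0] * (N - len(direct)) == via_matrix
```

Column $j$ of the matrix is just the coefficient vector of $(x - 2 \partial) x^j = x^{j+1} - 2j \, x^{j-1}$, which is where the subdiagonal of $1$s and the superdiagonal $-2, -4, -6, \dots$ come from.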

We can compute the powers of $x - 2 \partial$ by organizing them into a generating function

$$F(t) = e^{(x - 2 \partial) t} = \sum_{n \ge 0} (x - 2 \partial)^n \frac{t^n}{n!}.$$

This is a slightly unusual generating function; it has coefficients living in the Weyl algebra, and $t$ commutes with everything. Using the commutation relation $[\partial, x] = \partial x - x \partial = 1$ and the Zassenhaus formula gives

$$F(t) = e^{(x - 2 \partial) t} = e^{xt} e^{-2 \partial t} e^{-t^2}$$

(there are three negative signs determining the third factor; we have $[xt, -2 \partial t] = - 2t^2 [x, \partial] = 2t^2$, but then we need to take the negative of this in the Zassenhaus formula), which gives, by comparing coefficients of $t^n$,

$$\boxed{ (x - 2 \partial)^n = \sum_{i+j+2k=n} \frac{n!}{i! j! k!} x^i (-2 \partial)^j (-1)^k }.$$

As a sanity check, for $n = 2$ we have

$$\begin{align*} (x - 2 \partial)^2 &= x^2 - 2 x \partial - 2 \partial x + 4 \partial^2 \\ &= x^2 - 2x \partial - 2 (x \partial + 1) + 4 \partial^2 \\ &= x^2 - 4 x \partial + 4 \partial^2 - 2 \end{align*}$$

and, comparing to the above expansion, the first three terms correspond to $k = 0$ (what the answer would be if $x$ and $\partial$ commuted) while the final term corresponds to $k = 1$ and corrects for the noncommutativity.
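If one wants to check the boxed formula beyond $n = 2$, it can be verified mechanically; the sketch below (pure Python on coefficient lists, with an arbitrary test polynomial) compares repeated application of $x - 2\partial$ against the closed-form sum:

```python
from math import factorial

def apply_op(c):
    """Apply x - 2*d/dx to a coefficient list c (c[i] = coeff of x^i)."""
    out = [0] * (len(c) + 1)
    for i, a in enumerate(c):
        out[i + 1] += a              # multiplication by x
        if i >= 1:
            out[i - 1] -= 2 * i * a  # -2 times the derivative
    return out

def op_power_direct(c, n):
    """(x - 2*d/dx)^n applied to c by repeated application."""
    for _ in range(n):
        c = apply_op(c)
    return c

def deriv(c, j):
    """j-th derivative of the coefficient list c."""
    for _ in range(j):
        c = [i * a for i, a in enumerate(c)][1:] or [0]
    return c

def op_power_formula(c, n):
    """The boxed closed form: sum over i+j+2k = n of
    n!/(i! j! k!) * (-1)^k * x^i * (-2 d/dx)^j, applied to c."""
    out = [0] * (len(c) + n)
    for k in range(n // 2 + 1):
        for j in range(n - 2 * k + 1):
            i = n - j - 2 * k
            coeff = factorial(n) // (factorial(i) * factorial(j) * factorial(k))
            for m, a in enumerate(deriv(c, j)):
                out[m + i] += (-1) ** k * coeff * (-2) ** j * a
    return out

p = [1, 1, 0, 1]   # 1 + x + x^3, an arbitrary test polynomial
for n in range(6):
    d1, d2 = op_power_direct(p, n), op_power_formula(p, n)
    L = max(len(d1), len(d2))  # pad to a common length before comparing
    assert d1 + [0] * (L - len(d1)) == d2 + [0] * (L - len(d2))
```

For example, both routes send $x^2$ to $x^4 - 10x^2 + 8$, matching the $n = 2$ computation above.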

Qiaochu Yuan
  • Thank you! This approach looks very promising; though I’m not very familiar with most of these topics. Are there any books or articles you’d recommend for me to read, to learn this? – Habeeb M Sep 29 '24 at 13:26
  • @Habeeb: unfortunately I'm not aware of any relevant references. Feel free to ask follow-up questions in a new question. I will say if you only need the first column then you are applying $(x - 2 \partial)^n$ to $1 \in K[x]$, which lets you ignore all the $j \neq 0$ terms in the above sum. – Qiaochu Yuan Sep 29 '24 at 23:48
  • I took some time to digest this, and it seems to be making sense! – Habeeb M Sep 30 '24 at 11:51
  • I wonder if there’s a similar algebra with a sort of integration operator as well? I guess it couldn’t be too hard to define one.. – Habeeb M Sep 30 '24 at 12:01
  • @Habeeb: you could define one but I'm not aware of this being used for anything. One comment is you have to pick somewhere to start the integration from, so unlike differentiation it isn't equivariant with respect to translation. – Qiaochu Yuan Oct 01 '24 at 05:42
  • Actually, I guess a more general question; does there exist any notion of inverting these matrices? My first thought is no; I'd assume the inverse operators we'd have to define would be $1 / x$ and $\int_{0 \text{ or } 1}^x$, but those don't play very nicely in our basis (we could take $x \mapsto 1 / x \mapsto \log$) and from there it would likely get a lot messier. – Habeeb M Oct 01 '24 at 21:43
  • @Habeeb: the non-constant elements of the Weyl algebra are not invertible inside the Weyl algebra. $x - 2 \partial$ is not invertible inside the algebra of linear operators on $K[x]$ because it raises degree so is not surjective. It is also not invertible when acting on, say, the space of smooth functions $\mathbb{R} \to \mathbb{R}$ because it annihilates $e^{\frac{x^2}{4}}$ so is not injective. But, for example, $1 + \partial$ is invertible inside linear operators on $K[x]$, with inverse $1 - \partial + \partial^2 \mp \dots $ (which is well-defined because $\partial$ is locally nilpotent). – Qiaochu Yuan Oct 02 '24 at 05:52