
I have an $n \times n$ integer Hankel matrix $A$ that is defined via

$$A_{i,j} = \left(i+j\right)!$$

where $0!=1$ as is standard and both indices $i$ and $j$ start at $0$. Is there an analytic way to determine the eigenvectors for this Hankel matrix?
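For concreteness, here is a small numerical sketch (my own exploratory code, not the analytic treatment I'm asking for) that builds $A$ for a small $n$ and looks at its spectrum with np.linalg.eigh, which applies since $A$ is symmetric:

import numpy as np
from math import factorial

n = 6
A = np.array([[factorial(i + j) for j in range(n)] for i in range(n)], dtype=float)

# A is symmetric, so eigh gives real eigenvalues in ascending order
eigvals, eigvecs = np.linalg.eigh(A)
print('eigenvalues:', eigvals)
print('dominant eigenvector:', eigvecs[:, -1])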

The context is a function approximation problem I'm working on. I want to use the analytic result to choose appropriate values for some exponentials in the next step, to counterbalance the very large values that will appear and so make the approach usable.


Edit: The underlying problem was solved, so this specific question no longer needs an answer. Here's the code, updated with the polynomials Jair provided:

import numpy as np

def basis_generator(t, order):
    """
    Generate a basis valid over [0, inf), evaluated at points t.

    Parameters
    ----------
    t : array_like
        Set of positive real points the basis will be generated at.
    order : int
        The maximal order of polynomial to be used in the basis.

    Returns
    -------
    y : array_like
        Orthonormal basis over the points t.
    """
    y = np.zeros(shape=(len(t), order + 1))
    y[:, 0] = np.exp(-t / 2)            # L_0(t) * exp(-t/2)
    y[:, 1] = (1 - t) * np.exp(-t / 2)  # L_1(t) * exp(-t/2)
    dy = -t * np.exp(-t / 2)            # difference y[:, 1] - y[:, 0]
    for k in range(1, order):
        # update the difference using the Laguerre three-term recurrence
        dy -= dy / (k + 1)
        dy -= t * y[:, k] / (k + 1)
        y[:, k + 1] = y[:, k] + dy
    return y

# Example 1
dt = 0.00001
t = np.arange(dt, 50, dt)
y = basis_generator(t, 8)
print('Condition number: ', np.linalg.cond(y))
print('Inner products: ', dt * (y.T @ y))

# Example 2
dt = 0.0001
t = np.arange(dt, 50, dt)
y = basis_generator(t, 100)
f = np.sin(t / 5)
b = y.T @ f * dt
print('Relative squared error: ', (np.linalg.norm(f - (y @ b))**2) / (np.linalg.norm(f)**2))

  • Besides, as you are mentioning eigenvectors only, does it mean you already know the eigenvalues? – Jean Marie Jan 31 '25 at 17:52
  • I'd say a good place to start is to take small values of $n$ and consider the eigenvector associated with the largest positive eigenvalue (the dominant eigenvalue), normalized so that it has positive entries that add up to $1$, and see if the distributions you get follow any pattern. – Ben Grossmann Jan 31 '25 at 18:00
  • If by "exponentials" you mean $A^k$ (for integers $k$), then this at least gives you the asymptotic behavior of $A^k$ as $k \to \infty$. – Ben Grossmann Jan 31 '25 at 18:03
  • I can give it a go. If you'd like, I can also pop some code in here for what I'm actually using this for. I'm creating an orthonormal basis over $\left[0,\infty \right)$ from combinations of the set of functions $x^k e^{-x}$, so this part is just making sure that the inner products of all the functions are $0$. – Christopher D'Arcy Jan 31 '25 at 18:17
  • @ChristopherD'Arcy I can't guarantee I'll have time to look into it but I do recommend you give that a try. And sure, I'm sure the code could be helpful to me and others. – Ben Grossmann Jan 31 '25 at 18:21
  • @ChristopherD'Arcy You mean orthonormal relative to the inner product $$ \langle f, g \rangle = \int_0^\infty f(x)g(x)\,dx? $$ – Ben Grossmann Jan 31 '25 at 18:22
  • Here's the code on my personal blog; ignore the prose, I use it more as a diary than anything, but the code is pretty straightforward. And yup! Relative to that inner product. I just want to create a basis that has bounds on all the derivatives of all orders. – Christopher D'Arcy Jan 31 '25 at 18:29
  • The Laguerre polynomials $L_n(x)$ are orthogonal with respect to $\langle p, q \rangle = \int_0^\infty p(x) q(x) e^{-x} \, dx$. So, the polynomials $p_n(x) = L_n(2x)$ satisfy $\int_0^\infty p_i(x) e^{-x} p_j(x) e^{-x} \, dx = \int_0^\infty L_i(2x)L_j(2x) e^{-2x} \, dx = 0$ for $i \neq j$. – Jair Taylor Jan 31 '25 at 18:57
  • Perfect, exactly what I'm looking for. Thanks @JairTaylor, these work great for what I need. This doesn't answer the original question but it resolves my need. Should I close the question out? – Christopher D'Arcy Jan 31 '25 at 19:08
  • You don't need to close it, I'll try to add an answer later when I have time. – Jair Taylor Jan 31 '25 at 19:14
  • @JeanMarie He said "I do not have the eigenvalues, but they are also not important to me." – Ben Grossmann Feb 01 '25 at 15:54
  • @Ben Grossmann You are perfectly right. How is it possible that I hadn't seen it ... – Jean Marie Feb 01 '25 at 17:04
  • @JeanMarie I edited the question to include that statement after you asked, because it was a good question. It didn't say that before you asked. – Christopher D'Arcy Feb 01 '25 at 17:50

1 Answer


This is not a complete answer, since it does not compute the eigenvectors of the matrix $A$, but it answers a question from the comments.

The Laguerre polynomials $L_n$ defined by

$$L_n(x) = \sum_{k=0}^n \frac{(-1)^k}{k!} {n \choose k} x^k$$

are orthonormal with respect to the measure $e^{-x} \, dx$ on $[0, \infty)$, meaning that $$\int_0^\infty e^{-x} L_i(x) L_j(x) \, dx = \delta_{ij}$$ where $\delta_{ij} = 0$ if $i \neq j$ and $1$ if $i = j$.
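As a quick numerical sanity check (a sketch of mine, not needed for the argument below): Gauss-Laguerre quadrature integrates polynomials exactly against the weight $e^{-x}$ on $[0,\infty)$, so the Gram matrix of the first few $L_n$ computed this way should come out as (approximately) the identity:

import numpy as np
from numpy.polynomial.laguerre import laggauss, Laguerre

x, w = laggauss(50)          # 50-point Gauss-Laguerre rule, weight exp(-x) on [0, inf)

def lag(n, t):
    # evaluate the standard Laguerre polynomial L_n at the points t
    return Laguerre.basis(n)(t)

G = np.array([[np.sum(w * lag(i, x) * lag(j, x)) for j in range(6)]
              for i in range(6)])
print(np.round(G, 10))       # approximately the 6 x 6 identity matrix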

Since $\int_0^\infty x^{i+j} e^{-x} dx = (i+j)!$, it follows that

$$LAL^T = I$$ where $L$ is the $(n+1) \times (n+1)$ "Laguerre matrix" of order $n$ with $L_{m,k} = \frac{(-1)^k}{k!} {m \choose k}$ for $0 \leq k \leq m \leq n$ and $A_{ij} = (i+j)!$ for $0 \leq i,j \leq n$: indeed, $(LAL^T)_{ij} = \sum_{k,l} L_{i,k} \, (k+l)! \, L_{j,l} = \int_0^\infty L_i(x) L_j(x) e^{-x} \, dx = \delta_{ij}$. This gives an explicit Cholesky decomposition

$$A = L^{-1}L^{-T}.$$
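For small $n$ this factorization can be checked in exact rational arithmetic; here is a minimal sketch (standard library only, variable names of my own choosing):

from fractions import Fraction
from math import comb, factorial

n = 5  # the matrices are (n+1) x (n+1)

# Hankel matrix A_{i,j} = (i+j)! and Laguerre coefficient matrix L_{m,k}
A = [[Fraction(factorial(i + j)) for j in range(n + 1)] for i in range(n + 1)]
L = [[Fraction((-1) ** k * comb(m, k), factorial(k)) for k in range(n + 1)]
     for m in range(n + 1)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

LT = [list(row) for row in zip(*L)]  # transpose of L
P = matmul(matmul(L, A), LT)
assert all(P[i][j] == (1 if i == j else 0)
           for i in range(n + 1) for j in range(n + 1))
print("L A L^T = I verified for n =", n)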

However, it's not clear to me if it's possible to get an explicit form for the eigenvalues or eigenvectors of $A$.

As an aside, I would note that it would not be hard to find the formula for the orthonormal polynomials with a bit of sleuthing. If you perform Gram-Schmidt on the sequence $1, x, x^2, \ldots$ with respect to the inner product $\langle p,q \rangle = \int_0^\infty p(x) q(x) e^{-x} \, dx$ then you get a solution, and by looking up the coefficients in OEIS you would learn about the Laguerre polynomials.
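For instance, the following rough sketch (exact arithmetic via fractions, my own variable names) runs Gram-Schmidt on the monomials using only the moments $\int_0^\infty x^{i+j} e^{-x}\,dx = (i+j)!$; normalizing each output polynomial and fixing signs recovers the Laguerre coefficients $\frac{(-1)^k}{k!}{m \choose k}$:

from fractions import Fraction
from math import factorial

N = 4  # highest degree

def inner(p, q):
    # p, q are coefficient lists in the monomial basis; <x^i, x^j> = (i+j)!
    return sum(a * b * factorial(i + j)
               for i, a in enumerate(p) for j, b in enumerate(q))

basis = []
for m in range(N + 1):
    v = [Fraction(0)] * (N + 1)
    v[m] = Fraction(1)              # start from the monomial x^m
    for u in basis:                 # subtract projections onto earlier polynomials
        c = inner(v, u) / inner(u, u)
        v = [vi - c * ui for vi, ui in zip(v, u)]
    basis.append(v)

for m, v in enumerate(basis):
    print(m, v[:m + 1])             # monic polynomial, equal to (-1)^m m! L_m(x)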

Jair Taylor
  • I don't think the OP actually wanted or needed the eigenvectors of the Hankel matrix in the first place, and misunderstood the relationship between these eigenvectors and the actual objects they were searching for (namely these orthogonal polynomials). – Qiaochu Yuan Feb 02 '25 at 01:58
  • Ah I see, the solution $X$ is found via $X = L^{-1}$ from the Cholesky decomposition, I love it. Appreciate it and will remember the trick. – Christopher D'Arcy Feb 02 '25 at 02:06
  • @QiaochuYuan I know, but nonetheless, the question as posed still stands and it's interesting enough in its own right. – Jair Taylor Feb 02 '25 at 17:47