
I am interested in solving the following:

Let $\lambda_1,\dots,\lambda_r$ be distinct, nonzero complex numbers. Prove that the matrix

$$L := \begin{bmatrix} \lambda_1 & \lambda_2 & \dots & \lambda_r \\ \lambda_1^2 & \lambda_2^2 & \dots & \lambda_r^2\\ \vdots & \vdots & \ddots & \vdots\\ \lambda_1^r & \lambda_2^r & \dots & \lambda_r^r\\ \end{bmatrix}$$

has a trivial null space. Since $L$ is square, it suffices to establish any one of the equivalent conditions of the invertible matrix theorem, but I'm not sure which condition is the most natural here. On the other hand, $L$ appears very similar to the so-called Vandermonde matrices described on Wikipedia here https://en.wikipedia.org/wiki/Vandermonde_matrix . However, the examples on that page all have a row (or column) of 1's. I was wondering whether there is a clean way to adapt this problem to the better-known Vandermonde matrix problem. Of course, if I am thinking about this the wrong way, I would be happy to be corrected.

  • I am aware, as well, of this question on this website https://math.stackexchange.com/questions/4236864/matrix-of-powers-of-eigenvalues but the author does not provide any insight into the problem. – Important_man74 Dec 15 '23 at 01:37
  • hint: write $L$ as a product of a Vandermonde matrix and a diagonal matrix – user8675309 Dec 15 '23 at 02:13
  • Oh interesting, I didn't think about that. Alternatively, if you just consider the determinant, you can divide each column by its "representative" $\lambda$; then the determinant becomes the product of the $\lambda$'s times the determinant of a Vandermonde matrix, which is nonzero in this case (see the sketch after these comments). Thanks for your perspective too. I will think about that. – Important_man74 Dec 15 '23 at 02:34
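
For concreteness, here is a minimal sketch of the factorization suggested in the comments (the labels $V$ and $D$ are added here), assuming the Vandermonde determinant formula from the linked Wikipedia page:

$$L = \underbrace{\begin{bmatrix} 1 & 1 & \dots & 1 \\ \lambda_1 & \lambda_2 & \dots & \lambda_r \\ \vdots & \vdots & \ddots & \vdots \\ \lambda_1^{r-1} & \lambda_2^{r-1} & \dots & \lambda_r^{r-1} \end{bmatrix}}_{V} \underbrace{\begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_r \end{bmatrix}}_{D},$$

since multiplying the $j$-th column of $V$ by $\lambda_j$ turns the entries $1, \lambda_j, \dots, \lambda_j^{r-1}$ into $\lambda_j, \lambda_j^2, \dots, \lambda_j^r$, which is exactly the $j$-th column of $L$. Here $V$ is (the transpose of) a Vandermonde matrix, so

$$\det L = \det V \cdot \det D = \prod_{1 \le i < j \le r} (\lambda_j - \lambda_i) \cdot \prod_{k=1}^{r} \lambda_k \ne 0,$$

because the $\lambda_k$ are distinct (the Vandermonde factor is nonzero) and nonzero (the diagonal factor is nonzero). An invertible matrix has trivial null space, which is what was asked.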
