10

I would like to find the roots of a polynomial using its companion matrix.

The polynomial is ${p(x) = x^4-10x^2+9}$

The companion matrix $M$ is

$M={\left[ \begin{array}{cccc} 0 & 0 & 0 & -9 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 10 \\ 0 & 0 & 1 & 0 \end{array} \right]}$

A theorem says that the eigenvalues of $M$ are the roots of $p(x)$. I tried to find the characteristic polynomial of $M$ but it turned out to be $p(x)$. What should I do to obtain the eigenvalues of $M$?

  • 1
    Note $p(x) = x^4-10x^2+9 = x^4-10x^2+25 - 16 = (x^2 -5)^2 - 4^2 = ((x^2 - 5)-4)((x^2-5)+4) = (x^2 -9)(x^2-1) = (x-3)(x+3)(x-1)(x+1)$, so the zeros of $p$ are $\pm1, \pm 3$. Maybe you already knew that and you just want to figure out the companion matrix machinery, but they are the zeros for anyone that wants to check their answer. – Michael Albanese Feb 03 '13 at 13:48
  • 2
    I don't know where it came from, but I'd say that this is a pretty goofy problem. The point, I suppose, is that computing eigenvalues is a good way to find roots of polynomials (because software for finding eigenvalues is widely available and very well developed -- see packages like Linpack, etc.) That's all very valid. But, then, in this case, finding the eigenvalues is more difficult than finding the roots. It would have been much better to choose a polynomial whose roots are not so obvious. – bubba Feb 04 '13 at 06:04

3 Answers

6

Since $p(x)$ is biquadratic, if $\alpha$ is a root, then $-\alpha$ is also a root.

Looking at $M$, you can notice that the entries of each column sum to $1$. This implies that $1$ is an eigenvalue. (Do you know why?)

You now have two roots: $1$ and, by the first sentence above, $-1$.

Continue with long division (dividing $p(x)$ by $(x-1)(x+1)$) to find the remaining roots.

If you want to use the matrix to find all eigenvalues, recall that $\det (M)$ is the product of all eigenvalues. You can easily compute $\det (M)$ through expansion along the fourth column to find $\det (M)=9$.

Use the first sentence in my answer again to find the other eigenvalues.
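
Spelling out that last step: since $\det(M)=9$ and two of the eigenvalues are $1$ and $-1$, the remaining pair $\pm\alpha$ satisfies $1\cdot(-1)\cdot\alpha\cdot(-\alpha)=\alpha^{2}=9$, so $\alpha=\pm 3$ and the roots of $p$ are $\pm 1,\pm 3$, in agreement with the factorization in the comments.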

Git Gud
  • 31,706
2

There exist numerical methods for finding eigenvalues of matrices that use only matrix-vector multiplication. Matrix-vector multiplication is very fast when the matrix is sparse, as it is in this case.

One popular method is the power method. Roughly speaking, you take a random vector $\bf r$ and calculate ${\bf M}^k{\bf r}$ and ${\bf M}^{k+1}{\bf r}$ for a large $k$. The "part" of $\bf r$ along the eigenvector of the largest-magnitude eigenvalue grows fastest (exponentially faster in $k$), so elementwise division of those two vectors gives the eigenvalue corresponding to that eigenvector. There are then ways to sequentially "remove", or factor out, the part of the eigenspace already found.

So a method doing this could look something like

  1. Generate $\bf r$ from a random distribution.
  2. Calculate ${\bf M}^k{\bf r}$ and ${\bf M}^{k+1}{\bf r}$ and divide them elementwise to get the largest eigenvalue.
  3. Factor out the eigenvalue-eigenvector pair by some method.
  4. Loop from step 1 with a new random vector to get the second-largest eigenvalue, and so on until you have all you want.

One way to do the factoring-out (deflation) step is described in "Matrix Analysis" by Horn and Johnson. I don't have the book by me at the moment, but I can add a page number later if you are interested.
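
If it helps, here is a minimal sketch of these steps in Python/NumPy. It is only an illustration of the idea, not the construction from Horn and Johnson: it uses a Rayleigh quotient instead of the elementwise division, a shift $\sigma$ (needed here because the eigenvalues $3$ and $-3$ have the same magnitude, so plain power iteration would not settle on one of them), and Wielandt deflation for the factoring-out step.

    import numpy as np

    # Companion matrix from the question; p(x) = x^4 - 10x^2 + 9
    M = np.array([[0., 0., 0., -9.],
                  [1., 0., 0.,  0.],
                  [0., 1., 0., 10.],
                  [0., 0., 1.,  0.]])

    def power_iteration(A, iters=2000, seed=0):
        """Dominant eigenvalue/eigenvector of A via repeated matrix-vector products."""
        rng = np.random.default_rng(seed)
        v = rng.standard_normal(A.shape[0])          # step 1: random start vector
        for _ in range(iters):
            v = A @ v                                # step 2: apply A repeatedly
            v /= np.linalg.norm(v)                   # renormalise to avoid overflow
        lam = v @ (A @ v) / (v @ v)                  # Rayleigh-quotient eigenvalue estimate
        return lam, v

    # Shift so that no two eigenvalues of M + sigma*I share the same magnitude
    # (3 and -3 would otherwise tie); sigma = 1.5 is an arbitrary choice.
    sigma = 1.5
    B = M + sigma * np.eye(4)

    roots = []
    for _ in range(4):
        mu, v = power_iteration(B)
        roots.append(mu - sigma)                     # undo the shift to recover a root of p
        # Step 3: Wielandt deflation, B - mu*v*x^T with x.v = 1,
        # which replaces mu by 0 and keeps the other eigenvalues.
        j = np.argmax(np.abs(v))
        x = B[j, :] / (mu * v[j])
        B = B - mu * np.outer(v, x)

    print(sorted(round(float(r), 6) for r in roots))  # approximately [-3, -1, 1, 3]

Any shift that makes the shifted eigenvalues distinct in magnitude would do; the shift is subtracted off at the end so the reported values are roots of $p$ itself.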

mathreadler
  • 26,534
2

It's no surprise that the characteristic polynomial of the companion matrix turned out to be $p(x)$: the matrix is constructed from that precise polynomial so that this is the case.

If you want to find the roots of a polynomial using a companion matrix, you would use one of the specialized numerical methods for computing the eigenvalues of a matrix. Such methods do not use the characteristic polynomial and are tailored to the symmetry of the matrix. See, for example, Numerical Recipes (the link goes to the Householder method, which is not applicable to your matrix, but it gives you a general idea of what's involved).

Typically, you would use a package to find the eigenvalues numerically, because in general such routines are very complex and not worth trying to figure out on your own. LINPACK, Matlab, Mathematica, etc., all have such routines for general matrices.
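
As an illustration of what such a call looks like, here is the whole job done with NumPy (whose eigenvalue routines are backed by LAPACK):

    import numpy as np

    # Companion matrix of p(x) = x^4 - 10x^2 + 9, as in the question
    M = np.array([[0., 0., 0., -9.],
                  [1., 0., 0.,  0.],
                  [0., 1., 0., 10.],
                  [0., 0., 1.,  0.]])

    # General (non-symmetric) eigenvalue routine; its output is the set of roots of p
    print(np.sort(np.linalg.eigvals(M)))   # approximately [-3. -1.  1.  3.]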

Ron Gordon
  • 141,538