
How would you decide whether the system $Ax = b$, where $A$ is a tridiagonal matrix with all ones on its three diagonals and the right-hand side $b$ is the zero vector, has only the trivial solution?

Edit: So, I'm looking for a general answer for an $n \times n$ matrix of the following form:

$\begin{bmatrix} 1& 1& 0& 0\\ 1& 1& 1& 0\\ 0& 1& 1& 1\\ 0& 0& 1& 1\\ \end{bmatrix}$

2 Answers


The eigendecomposition of a tridiagonal Toeplitz matrix is well known. Let $n$ denote the size of the matrix $A$. The equation $Ax = 0$ has a non-trivial solution if and only if $A$ has $0$ as one of its eigenvalues, which in our case means that there is an integer $1 \leq k \leq n$ for which $$ 1 + 2 \cos\left(\frac{\pi k}{n+1}\right) = 0 \implies \cos\left(\frac{\pi k}{n+1}\right) = -\frac12. $$ Because the argument $\frac{\pi k}{n+1}$ lies in $(0, \pi)$ and $\cos^{-1}(-1/2) = 2 \pi /3$, this forces $\frac{\pi k}{n+1} = \frac{2\pi}{3}$, i.e. $k = \frac{2(n+1)}{3}$. This is an integer if and only if $n+1$ is divisible by $3$, so $Ax = 0$ has a non-trivial solution if and only if $n \equiv 2 \pmod 3$.
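As a quick sanity check (a sketch of my own, not part of the answer), one can build the all-ones tridiagonal matrix with NumPy and confirm numerically that it is singular exactly when $n \equiv 2 \pmod 3$:

```python
import numpy as np

def tridiag_ones(n):
    """n x n matrix with ones on the main, sub- and superdiagonal."""
    return np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)

for n in range(1, 13):
    eigvals = np.linalg.eigvalsh(tridiag_ones(n))   # symmetric, so real eigenvalues
    singular = bool(np.any(np.isclose(eigvals, 0.0)))
    print(n, singular, n % 3 == 2)                  # last two columns always agree
```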

We also know that in these cases, the solution space will be one-dimensional and spanned by the vector $$ v = \pmatrix{ \sin\left(\frac{2 \pi}{3} \right) & \sin\left(\frac{4 \pi}{3} \right) & \cdots & \sin\left(\frac{2n \pi}{3} \right)}^\top = \\ \frac {\sqrt{3}}{2}\pmatrix{1 & -1 & 0 & 1&-1&0&\cdots & 1 & -1}. $$
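To illustrate this concretely, here is a small sketch (again my own addition) checking that the repeating pattern $1, -1, 0, \dots$ indeed lies in the kernel whenever $n \equiv 2 \pmod 3$; the overall factor $\sqrt 3/2$ is dropped, since it only rescales the vector:

```python
import numpy as np

def kernel_vector(n):
    """Repeating pattern 1, -1, 0, ... truncated to length n (for n % 3 == 2)."""
    return np.tile([1.0, -1.0, 0.0], n // 3 + 1)[:n]

for n in (2, 5, 8, 11):
    A = np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    v = kernel_vector(n)
    print(n, np.allclose(A @ v, 0))   # True: v spans the one-dimensional null space
```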

Ben Grossmann

Let's compute some determinants recursively. Surely there is some pattern in here that we can exploit.

$1\times 1$ \begin{equation} |[1]| = 1 \end{equation}

$2\times 2$ \begin{align*} \left| \begin{bmatrix} 1 & 1\\ 1 & 1 \end{bmatrix} \right| =0 \end{align*}

$3\times 3$ - expanding on the last row, then on the last column \begin{align*} \left| \begin{bmatrix} 1 & 1 & 0\\ 1 & 1 & 1\\ 0 & 1 & 1 \end{bmatrix} \right| &= (-1) \left| \begin{bmatrix} 1 & 0\\ 1 & 1\\ \end{bmatrix} \right| + (1) \left| \begin{bmatrix} 1 & 1\\ 1 & 1\\ \end{bmatrix} \right| \\ &= (-1)(1) \left| \begin{bmatrix} 1 \end{bmatrix} \right| + (1) \left| \begin{bmatrix} 1 & 1\\ 1 & 1\\ \end{bmatrix} \right| \\ &= -1 + 0 = -1 \end{align*}

$4\times 4$ - expanding on the last row, then on the last column \begin{align*} \left| \begin{bmatrix} 1 & 1 & 0 & 0\\ 1 & 1 & 1 & 0\\ 0 & 1 & 1 & 1\\ 0 & 0 & 1 & 1 \end{bmatrix} \right| &= (-1) \left| \begin{bmatrix} 1 & 1 & 0\\ 1 & 1 & 0\\ 0 & 1 & 1 \end{bmatrix} \right| + (1) \left| \begin{bmatrix} 1 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix} \right|\\ &= (-1)(1) \left| \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} \right| + (1) \left| \begin{bmatrix} 1 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{bmatrix} \right|\\ &= 0+(-1) = -1 \end{align*}

$5\times 5$ - expanding on the last row, then on the last column \begin{align*} \left| \begin{bmatrix} 1 & 1 & 0 & 0 & 0\\ 1 & 1 & 1 & 0 & 0\\ 0 & 1 & 1 & 1 & 0\\ 0 & 0 & 1 & 1 & 1\\ 0 & 0 & 0 & 1 & 1\\ \end{bmatrix} \right| &= (-1) \left| \begin{bmatrix} 1 & 1 & 0 & 0\\ 1 & 1 & 1 & 0\\ 0 & 1 & 1 & 0\\ 0 & 0 & 1 & 1\\ \end{bmatrix} \right| + (1) \left| \begin{bmatrix} 1 & 1 & 0 & 0 \\ 1 & 1 & 1 & 0 \\ 0 & 1 & 1 & 1 \\ 0 & 0 & 1 & 1 \\ \end{bmatrix} \right|\\ &= (-1)(1) \left| \begin{bmatrix} 1 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \\ \end{bmatrix} \right| + (1) \left| \begin{bmatrix} 1 & 1 & 0 & 0 \\ 1 & 1 & 1 & 0 \\ 0 & 1 & 1 & 1 \\ 0 & 0 & 1 & 1 \\ \end{bmatrix} \right|\\ &= 1 + (-1) = 0 \end{align*}
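The expansions above all follow the same two-term pattern, so (as a sketch of my own, not part of the answer) one can encode them as the standard tridiagonal recurrence $D_n = D_{n-1} - D_{n-2}$ and read off the cycle of determinant values:

```python
def det_sequence(n_max):
    """Determinants D_1, ..., D_{n_max} via D_n = D_{n-1} - D_{n-2}."""
    dets = [1, 0]                       # D_1 = 1 and D_2 = 0, computed above
    while len(dets) < n_max:
        dets.append(dets[-1] - dets[-2])
    return dets[:n_max]

print(det_sequence(12))
# [1, 0, -1, -1, 0, 1, 1, 0, -1, -1, 0, 1]: zero exactly when n % 3 == 2
```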

Dunham