
I can only find specific examples online, and I would like to know in general:

Given an $n \times n$ matrix \begin{align} A = \begin{pmatrix} 0 & b & 0 &\cdots & 0 \\ 0 & 0 & b &\cdots & 0\\ \vdots & \vdots & \vdots & \ddots &\vdots\\ 0 & 0 & 0 &\cdots & b \\ 0 & 0 & 0 &\cdots & 0 \\ \end{pmatrix} \end{align} for some $b$, what does $A^k$ look like when $k < n$?

I would like to know the general way it works.

3 Answers


The only non-zero entries of $A^2$ lie on the second super-diagonal (the entries in row $i$, column $i+2$), and they are all $b^2$. Likewise, $A^3$ has $b^3$ along the third super-diagonal. In general, the only non-zero entries of $A^k$ are the $b^k$ along the $k^\mathrm{th}$ super-diagonal.

For example, $A^2$ looks like

$$ \left( \begin{array}{cccccc} 0 & 0 & b^2 & 0 & \cdots & 0 \\ 0 & 0 & 0 & b^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \ddots & \ddots & \vdots \\ 0 & 0 & 0 & 0 & 0 & b^2 \\ 0 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 0 \end{array} \right) $$
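As a quick numerical sanity check (a minimal sketch using NumPy; the size $n = 6$ and value $b = 2$ are arbitrary test choices):

```python
import numpy as np

n, b = 6, 2.0

# A has b on the first super-diagonal and zeros everywhere else.
A = b * np.eye(n, k=1)

# A^2 should have b^2 on the second super-diagonal.
A2 = np.linalg.matrix_power(A, 2)
assert np.array_equal(A2, b**2 * np.eye(n, k=2))
print(A2)
```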

Nick

Here's a formal (inductive) proof of the formula for $A^k$: we wish to show that $$ [A^k]_{i,j} = \begin{cases} b^k & j-i = k\\ 0 & \text{otherwise} \end{cases} $$ where $[A]_{i,j}$ denotes the $i,j$ entry of $A$. The base case (either $k=0$ or $k=1$) holds trivially. For the inductive step, we note that for $i,j$ between $1$ and $n$, $$ [A^{k+1}]_{i,j} = [A A^{k}]_{i,j} = \sum_{p=1}^{n} A_{ip}[A^k]_{pj}. $$ The term $A_{ip}[A^k]_{pj}$ is non-zero only if both $A_{ip} \neq 0$ and $[A^k]_{pj} \neq 0$. By the definition of $A$, $A_{ip}$ is non-zero only if $p = i+1$; on the other hand, by the inductive hypothesis, $[A^k]_{pj}$ is non-zero only if $p = j-k$. These can hold simultaneously only if $i+1 = j-k$, which is to say that $j-i = k+1$. Thus, we conclude that $[A^{k+1}]_{i,j} = 0$ whenever $j-i \neq k+1$.

Whenever $j - i = k+1$, we compute $$ [A^{k+1}]_{i,j} = \sum_{p=1}^{n} A_{ip}[A^k]_{pj} = A_{i,(i+1)}[A^k]_{(j-k),j} = b \cdot b^k = b^{k+1} $$ The conclusion follows.
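To complement the proof, here is a short numerical check of the formula (a sketch; $n = 7$, $b = 3$, and the range of $k$ are arbitrary test values):

```python
import numpy as np

n, b = 7, 3.0
A = b * np.eye(n, k=1)  # b on the first super-diagonal

# Check [A^k]_{i,j} = b^k when j - i = k (and 0 otherwise) for all k < n.
for k in range(n):
    Ak = np.linalg.matrix_power(A, k)
    assert np.array_equal(Ak, b**k * np.eye(n, k=k))
print("formula verified for k = 0, ...,", n - 1)
```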

Ben Grossmann

Think about what happens to a vector when you multiply it by this matrix. Recalling that the columns of a matrix are the images of the basis vectors, it’s not too hard to work out that multiplying by $A$ shifts the elements of the vector “up” one place, brings a zero into the last element, and multiplies everything by $b$. So, applying this matrix twice shifts by two places and multiplies by $b^2$, three times shifts by three places and multiplies by $b^3$, and so on. To find the columns of the matrix corresponding to any power of $A$, apply that operation to each of the standard basis vectors.
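To illustrate the shifting behavior (a minimal sketch; the vector and the values $n = 5$, $b = 2$ are arbitrary):

```python
import numpy as np

n, b = 5, 2.0
A = b * np.eye(n, k=1)  # b on the first super-diagonal

v = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# One application: shift entries up one place, zero in the last slot, scale by b.
print(A @ v)                             # [ 4.  6.  8. 10.  0.]
# Two applications: shift up two places and scale by b^2.
print(np.linalg.matrix_power(A, 2) @ v)  # [12. 16. 20.  0.  0.]
```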

amd