1

If $n \ge 2$ and $A$ is an $n \times n$ matrix with $\operatorname{rank}(A) = 1$, then the minimal polynomial of $A$ has degree $2$.

So we see that $\operatorname{rank}(A) = 1$, and hence the nullity of $A$ is $n - 1 \ge 1$. Now we try to find the range space of $A$: if $y \in R(A)$, then $Ax = y$ for some $x$.

We see that the null space of $A$ is spanned by $n-1$ linearly independent vectors, which we can extend to a basis of $\mathbb{R}^n$. Let $x_1, \dots, x_n$ be this basis, where $x_1, \dots, x_{n-1} \in \ker(A)$. Note that $x_n$ must not be in $\ker(A)$, since otherwise $A$ would be the zero matrix.

Let $x_n$ be the vector which spans the range space of $A$, so that $A\left(\sum_{i=1}^{n} c_i x_i\right) = x_n$. From here we can conclude that $0$ and $(c_n)^{-1}$ are the only eigenvalues. Hence the minimal polynomial becomes $x\left(x-(c_n)^{-1}\right)$.
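As a quick numerical sanity check of the claimed conclusion (my addition, not part of the question), here is a sketch using NumPy with a made-up rank-1 matrix of non-zero trace; for $A = uv^T$ the only possibly non-zero eigenvalue is $c = v^T u = \operatorname{tr}(A)$:

```python
import numpy as np

# A rank-1 matrix can always be written as an outer product u v^T.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
A = np.outer(u, v)          # rank(A) = 1
assert np.linalg.matrix_rank(A) == 1

c = v @ u                   # trace(A) = v^T u, the only possibly non-zero eigenvalue

# x(x - c) annihilates A: A(A - cI) = A^2 - cA = (v^T u)A - cA = 0.
assert np.allclose(A @ (A - c * np.eye(3)), 0)

# Neither A nor A - cI is zero, so no degree-1 polynomial annihilates A.
assert not np.allclose(A, 0)
assert not np.allclose(A - c * np.eye(3), 0)
```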

Guria Sona

5 Answers

3

I think this can be shown easily (without assuming the concept of diagonalizability) by using the result that the algebraic multiplicity of an eigenvalue is greater than or equal to its geometric multiplicity, and by determining the characteristic equation and minimal polynomial case by case.
Proof: Suppose $T : V \to V$ is the linear map corresponding to the $n \times n$ matrix $A$, where $V$ is an $n$-dimensional vector space, and let $\beta$ be the standard basis, i.e. $[T]_{\beta} = A$. By the rank–nullity theorem,
$\dim(\ker T) = n - 1$ [since $\dim(V) = n$ and $\dim(\operatorname{Im} T) = \operatorname{rank}(A) = 1$ by hypothesis]; extend a basis of $\ker T$ to a basis of $V$.
Now $A$ is an $n \times n$ matrix of rank $1$ with $n \geq 2$, so $\det(A) = 0$. Since $A$ has order $n$, the characteristic equation $\det(xI - A) = 0$ has degree $n$. Because $\det(A) = 0$, $x = 0$ satisfies the characteristic equation, so $0$ is an eigenvalue. Its geometric multiplicity is the dimension of the solution space of $(A - 0 \cdot I)x = 0$, i.e. of $Ax = 0$, which equals $\dim(\ker T) = n - 1$. So the eigenvalue $0$ has geometric multiplicity $n - 1$, and therefore algebraic multiplicity $\geq n - 1$. The characteristic equation of $A$, being of degree $n$, must then be either
i) $x^n = 0$, or
ii) $x^{n-1}(x - c) = 0$, where $c \neq 0$ is the other eigenvalue.
Now $x^n = 0$ cannot be the characteristic equation for every such matrix $A$; here is a counterexample:
take $A = \begin{bmatrix} 1 & 2\\ 2 & 4 \end{bmatrix}$, a $2 \times 2$ matrix of rank $1$. Its characteristic equation is $x(x-5) = 0$, which is of the second type, not $x^2 = 0$. Since these are the only two possibilities for the characteristic equation, we conclude that the characteristic equation of such a matrix $A$ (as given) is $x^{n-1}(x - c) = 0$ with $c \neq 0$.

Now $(x - c) = 0$ is not the minimal polynomial, because for such $A$ we have $(A - cI) \neq 0$; the same example gives a contradiction: for $A = \begin{bmatrix} 1 & 2\\ 2 & 4 \end{bmatrix}$ (rank $1$, order $2$), $(A - 5I) \neq 0$. Also, since $\operatorname{rank}(A) = 1$ we have $A \neq 0$, so $x = 0$ is not the minimal polynomial either.

Now we check whether $A$ can satisfy $x(x - c) = 0$. Here $c$ is a non-zero eigenvalue with algebraic and geometric multiplicity $1$, so the solution space of $(A - cI)x = 0$, $x \in V$, has dimension $1$. Thus $\dim(\ker(A - cI)) = 1$, so $\dim(\operatorname{Im}(A - cI)) = n - 1$
[since $(A - cI)$ is an $n \times n$ matrix corresponding to a map $T' : V \to V$]. So $\operatorname{rank}(A - cI) = n - 1$ and $\operatorname{rank}(A) = 1$, hence $\operatorname{rank}(A(A - cI)) = 1$ or $0$ [as $r(AB) \leq \min(r(A), r(B))$].
If $\operatorname{rank}(A(A - cI)) = 0$, then $A(A - cI) = 0$.
If $\operatorname{rank}(A(A - cI)) = 1$, then $A(A - cI)$ is an $n \times n$ matrix corresponding to a map $T'' : V \to V$ with $\dim(\ker(T'')) = n - 1$. One easily checks that $x \in \ker(T)$ implies $x \in \ker(T'')$, and since the dimensions agree, $\ker(T) = \ker(T'')$. Now there exists a non-zero $x \in \ker(T')$ [as $\dim(\ker(T')) = \dim(\ker(A - cI)) = 1$]; then $(A - cI)x = 0$, so $Ax = cx \neq 0$ and $x \notin \ker(T)$, but $A(A - cI)x = A^2x - cAx = A(cx) - c^2x = c^2x - c^2x = 0$, so $x \in \ker(T'')$. Hence $\ker(T) \neq \ker(T'')$, a contradiction. So $A(A - cI) = 0$, and $x(x - c) = 0$ is the minimal polynomial, of degree $2$, for every such $A$ (as given). [Hence proved]
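As a sanity check (my addition, not part of the original answer), the conclusion can be verified numerically on the example matrix used above, assuming NumPy is available:

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0]])   # rank 1, n = 2
assert np.linalg.matrix_rank(A) == 1

c = np.trace(A)                          # c = 5, the non-zero eigenvalue
# Neither x nor (x - c) alone annihilates A ...
assert not np.allclose(A, 0)
assert not np.allclose(A - c * np.eye(2), 0)
# ... but x(x - c) does, so it is the minimal polynomial, of degree 2.
assert np.allclose(A @ (A - c * np.eye(2)), 0)
```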

I have tried to do this in a new way, using only basic concepts.

1

Let $x_1, \dots, x_n$ be a basis such that $(x_1, \dots, x_{n-1})$ is a basis of $\ker A$ and $x_n$ is a basis of $\operatorname{Im} A$. We have $A(x_n) = cx_n$, with $c \neq 0$ since $A \neq 0$. Write $P = X(X-c)$; then $P(A)(x_i) = (A - cI)(A(x_i)) = 0$ for $i < n$, and $P(A)(x_n) = A(A - cI)(x_n) = 0$. This implies $P(A) = 0$.
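A small numerical check of this computation (my addition, not the answerer's), for a made-up rank-1 matrix with non-zero trace:

```python
import numpy as np

# Hypothetical rank-1 example: A = u v^T with trace(A) = v.u non-zero.
A = np.outer([1.0, 0.0, 2.0], [3.0, 1.0, 1.0])
assert np.linalg.matrix_rank(A) == 1

x_n = A[:, 0]                    # any non-zero column of A spans Im A
c = np.trace(A)                  # for A = u v^T, A x = (v.x) u, so A x_n = c x_n

assert np.allclose(A @ x_n, c * x_n)             # x_n is an eigenvector for c
assert np.allclose(A @ (A - c * np.eye(3)), 0)   # P(A) = 0 for P = X(X - c)
```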

    @Guria Sona: Note that this argument shows that the minimal polynomial must divide $X(X-c)$. However, it is easy to argue that it cannot be of degree one since $n \geq 2$ and $\operatorname{rank}(A) = 1$ so $A$ is not a multiple of the identity matrix (since a multiple of the identity matrix has either rank $0$ or rank $n$). – levap Feb 14 '21 at 14:25
  • $[0, 1: 0, 0]$ is rank $1$ matrix without non-zero eigenvalue. What about such case? – ogirkar Mar 04 '23 at 09:54
1

There are many problems with the way your solution is written. You start with $n - 1$ linearly independent vectors $x_1, \dots, x_{n-1}$ which span the null space of $A$ and say that you extend them to a basis $x_1, \dots, x_{n}$ of $\mathbb{F}^n$. A priori, if this extension is performed arbitrarily, there is no reason that $x_n$ will span the range of $A$. You need to argue that if you pick a non-zero element $x_n$ of the range space of $A$, then $x_1, \dots, x_n$ will be linearly independent and so will form a basis of $\mathbb{F}^n$. Then $Ax_1 = \dots = Ax_{n-1} = 0$ while $Ax_n = cx_n$ for some $c \in \mathbb{F}$ (because the range space is one-dimensional). This shows that $0$ and $c$ are the only eigenvalues of $A$, and since $0$ has an $(n-1)$-dimensional eigenspace and $c$ has a one-dimensional eigenspace, the characteristic polynomial of $A$ is $x^{n-1}(x - c)$.

Finally, you somehow "deduce" from the fact that there are only two eigenvalues that the minimal polynomial is $x(x-c)$ but you need to justify this. The minimal polynomial divides the characteristic polynomial but a priori it can be $x^2(x-c)$ or $x - c$ and so on. Why is it precisely $x(x-c)$? One way is to say that $A$ is diagonalizable and so the minimal polynomial is the product of the linear factors which are associated to the distinct eigenvalues.
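The diagonalizable case described here can be illustrated numerically; the following sketch (my addition, using a made-up rank-1 matrix with non-zero trace) checks that the characteristic polynomial is $x^{n-1}(x-c)$ and that $x(x-c)$ annihilates $A$:

```python
import numpy as np

# Made-up rank-1 example with non-zero trace, n = 4.
A = np.outer([1.0, 1.0, 2.0, 0.0], [2.0, 0.0, 1.0, 3.0])
c = np.trace(A)                      # the unique non-zero eigenvalue
assert np.linalg.matrix_rank(A) == 1 and not np.isclose(c, 0)

# Characteristic polynomial is x^{n-1}(x - c): compare coefficient vectors.
assert np.allclose(np.poly(A), np.poly([c, 0.0, 0.0, 0.0]))

# The eigenspace of 0 is the (n-1)-dimensional null space, so A is
# diagonalizable and x(x - c) annihilates A.
assert np.allclose(A @ (A - c * np.eye(4)), 0)
```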

levap
  • Yes, I agree. My intention was to show that there are $n$ linearly independent eigenvectors; from that it can be said that $A$ is diagonalizable, so the minimal polynomial is of degree $2$, as in the way I have proved. Yes, I should have written it down carefully. – Guria Sona Feb 14 '21 at 14:02
0

Suppose $A$ is an $n\times n$ matrix $(n\geq 2)$.

Since rank is equal to $1$, nullity is $n-1$.

That is, the geometric multiplicity (GM) of the eigenvalue $0$ is $n-1$.

Since for any eigenvalue the algebraic multiplicity (AM) is greater than or equal to the geometric multiplicity, AM$(0)\geq n-1$.

Case $1$: When AM$(0)=n-1$:

So there is one non-zero eigenvalue, say $c$, with AM equal to $1$, and thus GM also equal to $1$.

Therefore, for all the eigenvalues of $A$ we have AM $=$ GM, so $A$ is diagonalizable. That means $m_A(x)$ is a product of distinct linear factors, which is nothing but $x(x-c)$.

Case $2$: When AM$(0)=n$

Since GM$(0)=n-1$, the Jordan canonical form of $A$ has $n-1$ Jordan blocks for the eigenvalue $0$, of which exactly one block has size $2$ and the others have size $1$; this immediately gives $m_A(x)=x^2$.
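Case $2$ can be checked directly on the standard trace-zero example (a sketch I am adding, assuming NumPy):

```python
import numpy as np

# Trace-zero rank-1 case: N has 0 as its only eigenvalue.
N = np.array([[0.0, 1.0], [0.0, 0.0]])
assert np.linalg.matrix_rank(N) == 1
assert np.isclose(np.trace(N), 0)

assert not np.allclose(N, 0)   # x alone does not annihilate N ...
assert np.allclose(N @ N, 0)   # ... but x^2 does, so m_N(x) = x^2
```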

ogirkar
0

The case $A \in Mat_{1 \times 1}(\mathbb{k})$ is trivial: $\chi_A(t) = t-\alpha = \mu_A(t)$ and $\deg(\mu_A(t))=1$, where $\chi_A(t)$ is the characteristic polynomial of $A$ and $\mu_A(t)$ is its minimal polynomial. If $A \in Mat_{2 \times 2}(\mathbb{k})$, then $\mu_A(t) = t(t-c)$, because we can "kill" the second row of a corresponding matrix with the first one.
Now consider $A \in Mat_{n \times n}(\mathbb{k})$, $n>2$. Look at the Frobenius normal form of $A$ (for an introduction, see https://sites.math.washington.edu//~julia/teaching/505_Winter2010/WK1.pdf): there is only one non-zero block, since otherwise there would be at least two linearly independent columns. The size of that block is more than $1$ because $n>1$, and the characteristic polynomial is $t^k \cdot p(t)$, where $p(t)$ is the minimal polynomial of $A$. If the size were more than $2$, we would have the minor \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} of rank $2$. So only one case is left, with a polynomial of $\deg = 2$: the block \begin{pmatrix} - \alpha_1 & 1 \\ - \alpha_2 & 0 \end{pmatrix} with $\alpha_2 = 0$. So $\mu_A(t) = t(t+\alpha_1)$ and $\deg(\mu_A(t)) = 2$.
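A quick numerical illustration (my addition) of the surviving $2 \times 2$ companion block, with $\alpha_1$ chosen arbitrarily:

```python
import numpy as np

# The companion block from the argument above, with alpha_2 = 0.
a1 = 3.0
B = np.array([[-a1, 1.0], [0.0, 0.0]])
assert np.linalg.matrix_rank(B) == 1

# mu_B(t) = t(t + a1): B(B + a1 I) = 0, while neither factor alone vanishes.
assert np.allclose(B @ (B + a1 * np.eye(2)), 0)
assert not np.allclose(B, 0)
assert not np.allclose(B + a1 * np.eye(2), 0)
```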