So firstly, these kinds of properties were not directly addressed in the linear algebra course I have taken, so there might be some "language barrier" holding me back. In any case, I presume that a $k \times k$ minor refers to the determinant of a $k \times k$ submatrix. Given the knowledge that all $k \times k$ minors of a matrix $A$ are zero, what else do we know about the matrix? I honestly do not know: besides performing manual calculations, the only theory and practice behind determinants that I have encountered is the recursive definition via expansion into subdeterminants.
-
It tells you that the rank of $A$ is $<k$. – GreginGre Sep 10 '20 at 15:10
-
@GreginGre Where does this come from? Specifically, why is the rank less than $k$? I do not know if it is related, but it just occurred to me that since the $2 \times 2$ determinant is the base case of any determinant calculation, it follows that $\det(A) = 0$. – Epsilon Away Sep 10 '20 at 15:15
-
This is a standard fact in linear algebra that you can find in any linear algebra textbook. – GreginGre Sep 10 '20 at 16:55
-
@Greg: really? I've worked with several linear algebra textbooks with students and I don't think this fact is in any of them. – Qiaochu Yuan Sep 10 '20 at 18:52
-
another approach: do ex 4.Misc.10 in Artin 1st edition as a lemma: $\text{rank}(A)=r$ iff $A$ has some $r \times r$ minor that is non-zero, and all $j \times j$ minors are zero for $j\gt r$... Now for this problem, since all $k \times k$ minors are zero, this implies all $(k+1) \times (k+1)$ minors are zero (pick an arbitrary one and do expansion by minors to see a sum with $k+1$ terms, each consisting of a scalar times zero), and by induction all $j \times j$ minors are zero for $j \geq k$; by the lemma the rank of the matrix is $\lt k$. – user8675309 Sep 10 '20 at 23:23
1 Answer
The $k \times k$ minors vanish if and only if the rank of $A$ is less than $k$, as Greg says in the comments. Here's a relatively low-tech way to see this.
A matrix $A$ has rank less than $k$ iff the dimension of the column space $\text{col}(A)$ is less than $k$. The dimension of the column space is the maximum size of a linearly independent set of columns, so to check this condition it suffices to check whether every $k$-tuple of columns is linearly dependent. In other words, to check that a matrix $A$ has rank less than $k$ it suffices to check whether all the matrices obtained by selecting $k$ columns of $A$ (and ignoring the rest) have rank less than $k$.
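To make that concrete, here is a small numerical sketch (my illustration, not part of the argument; `numpy`'s `matrix_rank` is just a convenient black box for testing linear dependence):

```python
# Illustrative sketch (mine, not part of the answer): test "rank(A) < k"
# by checking that every choice of k columns is linearly dependent.
from itertools import combinations

import numpy as np

def rank_lt_k_via_columns(A, k):
    """True iff every k-tuple of columns of A is linearly dependent."""
    n_cols = A.shape[1]
    # If n_cols < k there are no k-tuples and all() is vacuously True,
    # consistent with rank(A) <= n_cols < k.
    return all(np.linalg.matrix_rank(A[:, list(cols)]) < k
               for cols in combinations(range(n_cols), k))

A = np.array([[1., 2., 3.],
              [2., 4., 6.],    # twice the first row, so rank(A) = 2
              [1., 1., 2.]])
print(rank_lt_k_via_columns(A, 3))    # True
print(np.linalg.matrix_rank(A) < 3)   # True, the two tests agree
```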
But the dimension of the column space is the dimension of the row space, so exactly the same argument applies to the rows of $A$. If we apply the argument to the columns first, producing a bunch of matrices with $k$ columns, and then apply the argument to the rows of these matrices, we conclude that a matrix $A$ has rank less than $k$ iff all the $k \times k$ submatrices obtained by selecting $k$ columns and $k$ rows of $A$ (and ignoring the rest) have rank less than $k$. But since these are $k \times k$ matrices they have rank less than $k$ iff their determinant vanishes.
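Here is the full criterion checked numerically on a rank-$2$ example (again just an illustrative sketch, not part of the proof):

```python
# Illustrative sketch (mine): compare "all k x k minors vanish"
# against "rank(A) < k" for each k.
from itertools import combinations

import numpy as np

def all_k_minors_zero(A, k, tol=1e-10):
    """True iff every k x k minor of A is (numerically) zero."""
    m, n = A.shape
    return all(abs(np.linalg.det(A[np.ix_(rows, cols)])) < tol
               for rows in combinations(range(m), k)
               for cols in combinations(range(n), k))

A = np.array([[1., 2., 3.],
              [2., 4., 6.],    # rank(A) = 2
              [1., 1., 2.]])
for k in (1, 2, 3):
    print(k, all_k_minors_zero(A, k), np.linalg.matrix_rank(A) < k)
# prints: 1 False False / 2 False False / 3 True True
```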
A higher-tech way to see this result is to show that the $k \times k$ minors appear as the matrix entries of the $k^{th}$ exterior power of $A$, and to show that $A$ has rank less than $k$ iff its $k^{th}$ exterior power vanishes. Developing the theory of exterior powers tells you much more than this about the minors; for example, it tells you that they transform in a particular way under change of coordinates, and that there is a formula for the minors of $AB$ in terms of the minors of $A$ and $B$ (the Cauchy–Binet formula), neither of which is otherwise clear.
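For instance, Cauchy–Binet says that the matrix of $k \times k$ minors (the $k^{th}$ compound matrix, i.e. the matrix of the $k^{th}$ exterior power in the standard basis) is multiplicative. A quick numerical sanity check, purely illustrative:

```python
# Illustrative sketch (mine): the k-th compound matrix of A collects all
# k x k minors, and it is multiplicative:
# compound(A @ B, k) == compound(A, k) @ compound(B, k)  (Cauchy-Binet).
from itertools import combinations

import numpy as np

def compound(A, k):
    """Matrix whose (I, J) entry is det(A[I, J]) over k-subsets I, J."""
    m, n = A.shape
    row_sets = list(combinations(range(m), k))
    col_sets = list(combinations(range(n), k))
    return np.array([[np.linalg.det(A[np.ix_(I, J)]) for J in col_sets]
                     for I in row_sets])

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))
print(np.allclose(compound(A @ B, 2), compound(A, 2) @ compound(B, 2)))  # True
```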