
In matrix multiplication, how does the number of linearly independent columns in the first matrix affect the number of linearly independent columns in the resultant matrix?

Setup:

Structure of AB

We have:

$$A = \begin{bmatrix} a & b \\ c & d \\ e & f \end{bmatrix}$$

$$B = \begin{bmatrix} g & h & i \\ j & k & l \end{bmatrix}$$

Multiplication gives:

$$AB = \begin{bmatrix} g\begin{bmatrix} a \\ c \\ e \end{bmatrix} + j\begin{bmatrix} b \\ d \\ f \end{bmatrix} & \; h\begin{bmatrix} a \\ c \\ e \end{bmatrix} + k\begin{bmatrix} b \\ d \\ f \end{bmatrix} & \; i\begin{bmatrix} a \\ c \\ e \end{bmatrix} + l\begin{bmatrix} b \\ d \\ f \end{bmatrix} \end{bmatrix}$$

Each column of AB is:

$$AB_1 = g\begin{bmatrix} a \\ c \\ e \end{bmatrix} + j\begin{bmatrix} b \\ d \\ f \end{bmatrix}$$

$$AB_2 = h\begin{bmatrix} a \\ c \\ e \end{bmatrix} + k\begin{bmatrix} b \\ d \\ f \end{bmatrix}$$

$$AB_3 = i\begin{bmatrix} a \\ c \\ e \end{bmatrix} + l\begin{bmatrix} b \\ d \\ f \end{bmatrix}$$

Each column is a linear combination of just two base vectors:

$$\begin{bmatrix} a \\ c \\ e \end{bmatrix}, \quad \begin{bmatrix} b \\ d \\ f \end{bmatrix}$$
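As a quick numerical check (a sketch using NumPy, with the symbolic entries replaced by arbitrary concrete numbers), every column of AB can be rebuilt as a combination of A's two columns:

```python
import numpy as np

# Concrete stand-ins for the symbolic entries above
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])          # columns: (a, c, e) and (b, d, f)
B = np.array([[7.0, 8.0, 9.0],
              [10.0, 11.0, 12.0]])  # rows: (g, h, i) and (j, k, l)

AB = A @ B

# Column 1 of AB is g*(a,c,e) + j*(b,d,f), and likewise for columns 2 and 3
for col in range(3):
    combo = B[0, col] * A[:, 0] + B[1, col] * A[:, 1]
    assert np.allclose(AB[:, col], combo)
```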


What I know so far is:

We have to say that AT MOST 2 linearly independent columns (vectors) will be there in AB.

In the case that B is all zeros (the zero matrix), AB would be all zeros, so only ONE linearly independent column in AB. (This, I understand.)

In the case that A has only ONE linearly independent column (and B is a nonzero matrix), AB will have at most ONE linearly independent column.

In the case that A has TWO linearly independent columns (and B is a nonzero matrix), AB can have up to TWO linearly independent columns.
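These cases can be checked numerically; in particular, the two-column case depends on B as well, since $\text{rank}(AB) \le \min(\text{rank}(A), \text{rank}(B))$. A sketch using NumPy (the specific matrices are illustrative choices):

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])            # two linearly independent columns
B_full = np.array([[1., 0., 1.],
                   [0., 1., 1.]])   # rank 2
B_thin = np.array([[1., 1., 1.],
                   [1., 1., 1.]])   # rank 1, but nonzero

print(np.linalg.matrix_rank(A @ B_full))  # 2: both independent columns survive
print(np.linalg.matrix_rank(A @ B_thin))  # 1: a nonzero B can still lower the rank
```

So "A has two independent columns and B is nonzero" is not enough to guarantee two independent columns in AB; B must itself have rank 2.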


My question is:

How can we predict that the columns of AB will be linearly independent of each other, given that each column of AB is a linear combination of A's column 1 and A's column 2? It is not a scalar multiplication, so we are not just taking multiples of one column; we are summing two different columns altogether. A sum of two vectors can't simply inherit the properties of the original two addends, right? That doesn't make any sense to me. These aren't multiples. So how are we arriving at the above statements at all? Please let me know if any other elaboration is needed for understanding my question.

  • See https://math.stackexchange.com/questions/978/how-to-prove-and-interpret-operatornamerankab-leq-operatornamemin-ope – Widawensen Apr 06 '25 at 15:52

1 Answer


Your question is unclear. I assume you are asking "what is the relationship between the number of linearly independent columns in two matrices $A$ and $B$, and their product $AB$?" which is answered here.

An informal justification for this fact is that multiplying $A$ on the right by $B$ only takes linear combinations of the columns of $A$, which can't make dependent columns independent, so $\text{rank}(AB) \le \text{rank}(A)$. At the same time, the transformation $B$ could map onto a "smaller" space and "create dependence", so we also have $\text{rank}(AB) \le \text{rank}(B)$.

Also, it is not true that if $B$ is the zero matrix then $AB$ (also a zero matrix) has one linearly independent column. Any set of vectors containing $\vec{0}$ is dependent, including $\{\vec{0}\}$ alone, so a zero matrix has no linearly independent columns.
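This matches the convention used by numerical rank computations; for instance, in NumPy (an illustrative check):

```python
import numpy as np

Z = np.zeros((2, 3))          # B = 0
A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])

print(np.linalg.matrix_rank(Z))      # 0: the zero matrix has no independent columns
print(np.linalg.matrix_rank(A @ Z))  # 0: so AB = 0 has none either
```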

Rob S.