In matrix multiplication, how does the number of linearly independent columns in the 1st matrix affect the number of linearly independent columns in the resultant matrix?
Setup:
Structure of AB
We have:
A = \begin{bmatrix} a & b \\ c & d \\ e & f \end{bmatrix}
B = \begin{bmatrix} g & h & i \\ j & k & l \end{bmatrix}
Multiplication gives:
AB = \begin{bmatrix} ga + jb & ha + kb & ia + lb \\ gc + jd & hc + kd & ic + ld \\ ge + jf & he + kf & ie + lf \end{bmatrix}
Each column of AB is:
(AB)_1 = g\begin{bmatrix} a \\ c \\ e \end{bmatrix} + j\begin{bmatrix} b \\ d \\ f \end{bmatrix}
(AB)_2 = h\begin{bmatrix} a \\ c \\ e \end{bmatrix} + k\begin{bmatrix} b \\ d \\ f \end{bmatrix}
(AB)_3 = i\begin{bmatrix} a \\ c \\ e \end{bmatrix} + l\begin{bmatrix} b \\ d \\ f \end{bmatrix}
Each column of AB is therefore a linear combination of the same two vectors, the columns of A:
\begin{bmatrix} a \\ c \\ e \end{bmatrix},
\begin{bmatrix} b \\ d \\ f \end{bmatrix}
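As a quick numerical sanity check (with made-up entries standing in for a through l), we can verify that every column of AB really is such a combination of A's two columns:

```python
# Hypothetical numbers for A (3x2) and B (2x3); any values would do.
A = [[1, 2],
     [3, 4],
     [5, 6]]          # columns of A: [1,3,5] and [2,4,6]
B = [[7, 8, 9],
     [10, 11, 12]]

def matmul(X, Y):
    """Plain textbook matrix product of X (m x n) and Y (n x p)."""
    return [[sum(X[r][k] * Y[k][c] for k in range(len(Y)))
             for c in range(len(Y[0]))] for r in range(len(X))]

AB = matmul(A, B)

# Column j of AB = B[0][j] * (1st column of A) + B[1][j] * (2nd column of A)
for j in range(3):
    combo = [B[0][j] * A[r][0] + B[1][j] * A[r][1] for r in range(3)]
    col_j = [AB[r][j] for r in range(3)]
    assert combo == col_j

print(AB)  # → [[27, 30, 33], [61, 68, 75], [95, 106, 117]]
```

The assertion passing for every column is exactly the structural fact used below: no column of AB can escape the span of A's two columns.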
What I know so far is:
AB can have AT MOST 2 linearly independent columns, since every column of AB lies in the span of A's two columns.
In the case where B is all zeros (the zero matrix), AB is also all zeros, so AB has ZERO linearly independent columns (the zero vector can never belong to a linearly independent set; this part I understand).
In the case where A has only ONE linearly independent column (and B is a nonzero matrix), AB has AT MOST ONE linearly independent column. (It can even have zero: the combinations can cancel and give the zero matrix.)
In the case where A has TWO linearly independent columns (and B is a nonzero matrix), AB has AT MOST TWO linearly independent columns; in general the rank of AB can never exceed the rank of A or the rank of B.
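These "at most" claims can be checked numerically. The sketch below is pure Python with a small Gaussian-elimination rank helper written just for this illustration (not from any particular library). It builds an A with two linearly independent columns and a nonzero B of rank 1, and shows that AB can still end up with only one linearly independent column:

```python
from fractions import Fraction

def rank(M):
    """Rank of M via Gaussian elimination over the rationals (exact arithmetic)."""
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        # Find a pivot at or below row r in column c.
        pivot = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        # Eliminate column c from every other row.
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def matmul(X, Y):
    """Plain textbook matrix product."""
    return [[sum(X[r][k] * Y[k][c] for k in range(len(Y)))
             for c in range(len(Y[0]))] for r in range(len(X))]

A = [[1, 0],
     [0, 1],
     [1, 1]]                 # two linearly independent columns: rank(A) = 2
B = [[1, 2, 3],
     [2, 4, 6]]              # nonzero, but second row = 2 * first row: rank(B) = 1

AB = matmul(A, B)
print(rank(A), rank(B), rank(AB))  # → 2 1 1
```

Even though A contributes two independent directions, every column of this AB is a multiple of one vector, so rank(AB) = 1; the number of independent columns of AB is bounded by (not equal to) the number of independent columns of A.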
My question is:
How can we predict anything about the linear independence of the columns of AB, given that each column of AB is a linear combination of A's column 1 and A's column 2? This is not a scalar multiplication operation, where a column could only produce multiples of itself; we are summing two different columns altogether. Why should the sum of two vectors inherit the independence properties of the original two addends? That is what doesn't make sense to me: these are not multiples, so how do we arrive at the statements above at all? Please let me know if any other elaboration is needed to understand my question.