
This is my first post here, so I hope I'm doing everything right. I watched the Essence of Linear Algebra series by 3Blue1Brown and thought a lot about the idea of seeing a vector as a record of how many times you scale the corresponding basis vector in each dimension.

For example, in $2$D space there are $\hat\imath$ and $\hat\jmath$, which have the clear representations $\smash{\left[\begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right]}$ and $\smash{\left[\begin{smallmatrix} 0 \\ 1 \end{smallmatrix}\right]}$. A matrix then basically records the new locations where $\hat\imath$ and $\hat\jmath$ landed, and we can derive any other vector from those new locations. So multiplying a $2$D vector by a $2\times2$ matrix makes total sense to me (geometrically).
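
To make that concrete with a small worked example (the numbers here are my own, just for illustration): the columns of the matrix are the landing spots of $\hat\imath$ and $\hat\jmath$, and the input vector just says how much of each column to take,
$$ \begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = x \begin{bmatrix} a \\ c \end{bmatrix} + y \begin{bmatrix} b \\ d \end{bmatrix}, \qquad \text{e.g.} \quad \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = 1\begin{bmatrix} 2 \\ 0 \end{bmatrix} + 1\begin{bmatrix} 0 \\ 3 \end{bmatrix} = \begin{bmatrix} 2 \\ 3 \end{bmatrix}. $$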

Now consider this example: a $1\times3$ matrix $$ A = \begin{bmatrix} 3 & 1 & 4 \end{bmatrix} $$ which would basically represent $3$ basis vectors $\begin{bmatrix} \hat\imath & \hat\jmath & \hat k \end{bmatrix}$ with only $1$ dimension each, multiplied by the matrix $$ B = \begin{bmatrix} 4 & 3 \\ 2 & 5 \\ 6 & 8 \end{bmatrix} $$

which represents a transformation of two basis vectors $\begin{bmatrix} \hat\imath & \hat\jmath \end{bmatrix}$ with $3$ dimensions each.

This concept no longer makes geometrical sense to me, or am I mistaken here?

I don't even know if I'm supposed to be able to, but I can't wrap my head around what it would mean to multiply $$ AB = \begin{bmatrix} 3 & 1 & 4 \end{bmatrix} \begin{bmatrix} 4 & 3 \\ 2 & 5 \\ 6 & 8 \end{bmatrix} $$
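
Writing the computation out, each column of $B$ (a landed basis vector, expressed in $3$D) gets sent by $A$ to a single number:
$$ AB = \begin{bmatrix} 3\cdot4 + 1\cdot2 + 4\cdot6 & 3\cdot3 + 1\cdot5 + 4\cdot8 \end{bmatrix} = \begin{bmatrix} 38 & 46 \end{bmatrix}. $$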

So are we simply defining the rule of multiplying rows by columns here, with no further meaning you can picture in your head?

Also, I'm mostly learning this to understand matrix multiplication in machine learning and to get more comfortable thinking about tensor shapes.
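
As a shape sanity check, here is a minimal NumPy sketch (the arrays are just my $A$ and $B$ from above): the inner dimensions must match, and the product has shape $(1\times3)(3\times2) \to (1\times2)$.

```python
import numpy as np

# A maps R^3 -> R^1: each of the three basis vectors lands on a single number.
A = np.array([[3, 1, 4]])          # shape (1, 3)

# B maps R^2 -> R^3: each of the two basis vectors lands on a 3D column.
B = np.array([[4, 3],
              [2, 5],
              [6, 8]])             # shape (3, 2)

AB = A @ B                         # inner dims 3 and 3 match -> result has shape (1, 2)
print(AB)                          # [[38 46]]

# B @ A would try (3, 2)(1, 3): inner dims 2 and 1 clash, so NumPy raises an error.
```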

I'd be glad for any thoughts on this question. I think it should be clear from these simple examples.

Cheers!

  • Do you know what a linear map is? A matrix is a linear map with respect to a basis. Matrix multiplication is composition of the respective linear maps. – Malady Jan 30 '25 at 01:24
  • This question is similar to: Usual Matrix Multiplication. If you believe it’s different, please [edit] the question, make it clear how it’s different and/or how the answers on that question are not helpful for your problem. – Malady Jan 30 '25 at 01:26
  • Here's one viewpoint: Suppose we have a vector $x \in \mathbb R^n$ and we first multiply it by an $m \times n$ matrix $A$, then multiply the result by a $k \times m$ matrix $B$. It turns out that this is equivalent to multiplying $x$ by a certain matrix $C$. This matrix $C$ is called the product of $B$ and $A$. In other words, $C = BA$ is defined so that $B(Ax) = Cx$ for all $x \in \mathbb R^n$. – littleO Jan 30 '25 at 01:34
  • @Malady first of all, thank you for the quick reply! To answer your question about how my problem is different from the one you linked: I completely understand why we calculate rows by columns, but I wonder if there is any visual representation I can think of when multiplying matrices with unequal dimensions. Also, I don't know what a linear map is; I'll have a look at it tomorrow! – MielkeDaniel Jan 30 '25 at 01:38
  • The answer to that question describes matrix multiplication as composition of functions. That is the geometric answer. – Malady Jan 30 '25 at 01:42
  • Note that in your example, the multiplication $AB$ is possible and will be a $1\times 2$ matrix, but $BA$ is not possible. You could see $B$ as two 3D vectors and $A$ as a transformation $\mathbb R^3 \to \mathbb R^1$ with $AB$ being two 1D vectors. Or looking the other way you could see $A$ as one 3D vector and $B$ as a transformation $\mathbb R^3 \to \mathbb R^2$ with $AB$ being one 2D vector. Or you could see $A$ and $B$ as being two transformations and $AB$ their combined transformation. – Henry Jan 30 '25 at 01:45
  • @Henry that makes sense! So I can think of matrices as linear functions rather than as some function that takes in a vector in space and transforms it along some dimensions (even though that would kind of still be the case)? A quick numerical check of this view is sketched below. – MielkeDaniel Jan 30 '25 at 01:49
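
To make the composition picture from the comments concrete, here is a minimal NumPy sketch (the test vector $x$ is an arbitrary choice): applying $B$ and then $A$ to a vector gives the same result as applying the single product matrix $AB$.

```python
import numpy as np

A = np.array([[3, 1, 4]])                  # transformation R^3 -> R^1
B = np.array([[4, 3],
              [2, 5],
              [6, 8]])                     # transformation R^2 -> R^3

x = np.array([1.0, -2.0])                  # an arbitrary vector in R^2

step_by_step = A @ (B @ x)                 # first B: R^2 -> R^3, then A: R^3 -> R^1
one_shot = (A @ B) @ x                     # the combined transformation AB: R^2 -> R^1

assert np.allclose(step_by_step, one_shot)
print(step_by_step, one_shot)              # both print [-54.]
```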

0 Answers