
I've been trying to understand generalizations of matrix multiplication, in particular for "cube" matrices ($n \times m \times p$). I found the following answer,

Is there a 3-dimensional "matrix" by "matrix" product?

and read the Wikipedia page mentioned there, but my experience with tensors is somewhat limited (I've seen their definition in module theory and done some proofs about them, but I've done very few actual computations with them), so I'm really struggling to figure out how to apply the idea of a contraction.

If $A$ is $n \times m \times p$ and $B$ is $q \times r \times s$, what conditions on those dimensions are necessary for a "multiplication" $AB$ to make sense? How does it relate to a contraction? What is the resulting object? I can't figure out what the vector spaces $V$ and $V^*$ need to be to relate this situation to a contraction.

The answer on the linked page above gives a rank-4 tensor example, but I can't seem to make it work in the rank-3 case. What am I missing?

Tyler

1 Answer


You can form the outer product of any two tensors. It will have as many indices as the two factors have between them. If you start with vectors $\vec u$ and $\vec v$ with coordinates $u_i$ and $v_j$, the outer product is $\mathbf w = \vec u \otimes \vec v$ with $w_{ij} = u_i v_j$. Similarly, you can form the outer product of your $A$ and $B$, getting a tensor with six indices. If any two of those six dimensions are equal, you can contract the tensor over that pair of indices. That is what we do when we form the dot product of two vectors: take the outer product $\mathbf w$ and sum over the pair of indices, giving $\vec u \cdot \vec v = \sum_i w_{ii}$. Similarly, you can view ordinary two-index matrix multiplication as forming the outer product and then contracting one pair of indices: $(AB)_{ik} = \sum_j A_{ij} B_{jk}$. The usual rule for matrix multiplication insists that the inner dimensions agree precisely so that this contraction is allowed.
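To make the contraction concrete, here is a minimal numerical sketch (NumPy and its `einsum`/`tensordot` functions are my own choice of illustration, not something in the answer). It checks that the outer product followed by a contraction reproduces the dot product and ordinary matrix multiplication, and that contracting the last index of an $n \times m \times p$ tensor against the first index of a $q \times r \times s$ tensor requires $p = q$ and leaves a four-index $n \times m \times r \times s$ result; contracting other index pairs works the same way.

```python
import numpy as np

# Vectors: outer product followed by a contraction reproduces the dot product.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
w = np.einsum('i,j->ij', u, v)          # outer product, w[i, j] = u[i] * v[j]
dot = np.einsum('ii->', w)              # contract the index pair: sum_i w[i, i]
assert np.isclose(dot, u @ v)

# Matrices: ordinary matrix multiplication is an outer product plus one
# contraction, which is why the inner dimensions (m here) must agree.
A2 = np.random.rand(4, 5)               # n x m
B2 = np.random.rand(5, 6)               # m x r
prod = np.einsum('ij,jk->ik', A2, B2)   # contract over the shared index j
assert np.allclose(prod, A2 @ B2)

# Three-index tensors: A is n x m x p and B is q x r x s.  Contracting the
# last index of A against the first index of B requires p == q and leaves a
# four-index (n x m x r x s) result.
n, m, p = 2, 3, 4
q, r, s = 4, 5, 6                       # p == q, so the contraction is allowed
A = np.random.rand(n, m, p)
B = np.random.rand(q, r, s)
C = np.einsum('abk,kcd->abcd', A, B)    # sum over the shared index k
assert C.shape == (n, m, r, s)
assert np.allclose(C, np.tensordot(A, B, axes=([2], [0])))
```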

Ross Millikan