11

During my linear algebra course, the instructor stated that $A \times B$ is the same as the "skew-symmetric matrix" of $A$ times $B$. So, first of all, can someone clarify this or point me to sources about skew-symmetric matrices? Secondly, I can't really see how the cross product of one column vector with another could be represented by a matrix.

Anyhow, thanks in advance!

Abu Bakr
  • 365
  • See https://en.wikipedia.org/wiki/Cross_product#Conversion_to_matrix_multiplication –  Apr 23 '17 at 18:11

5 Answers

14

Imagine a column vector ${\bf A} = (A_1, A_2, A_3)$ and define the matrix

$$ A_\times = \left(\begin{array}{ccc} 0 & -A_3 & A_2 \\ A_3 & 0 & -A_1 \\ -A_2 & A_1 & 0 \end{array}\right) $$

Note that if ${\bf B}$ is another column vector, then

$$ A_\times {\bf B} = {\bf A}\times {\bf B} $$

Moreover,

$$ A_\times^T = -A_\times $$

which is precisely the defining property of a skew-symmetric matrix.
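
For a quick numerical check, here is a minimal numpy sketch (the helper name `skew` is just illustrative) that builds $A_\times$ and verifies both identities:

```python
import numpy as np

def skew(a):
    """Return the matrix A_x with A_x @ b == np.cross(a, b)."""
    return np.array([[0.0,  -a[2],  a[1]],
                     [a[2],  0.0,  -a[0]],
                     [-a[1], a[0],  0.0]])

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
Ax = skew(A)

print(np.allclose(Ax @ B, np.cross(A, B)))  # True: Ax B equals A x B
print(np.allclose(Ax.T, -Ax))               # True: Ax is skew-symmetric
```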

caverac
  • 19,783
  • 3
    I don't really get this part: A×B=A×B , can you clarify? – Abu Bakr Apr 23 '17 at 18:24
  • 1
    I think that the "St Andrew's cross" should be written as an index (see the way I write it in https://math.stackexchange.com/q/2239153) – Jean Marie Apr 23 '17 at 19:24
  • @JeanMarie Thanks for the suggestion – caverac Apr 23 '17 at 22:00
  • @AbuBakr Please see JeanMarie's comment – caverac Apr 23 '17 at 22:01
  • 2
    $$\begin{pmatrix} x \\ y \\ z \end{pmatrix} \times \begin{pmatrix} a \\ b \\ c \end{pmatrix} = \begin{bmatrix} 0 & -z & y \\ z & 0 & -x \\ -y & x & 0 \end{bmatrix} \begin{pmatrix} a \\ b \\ c \end{pmatrix}$$ The 3×3 skew-symmetric matrix above is the linear algebra representation of the cross product operator. – John Alexiou Apr 23 '17 at 23:15
  • Hi @JohnAlexiou using this approach could you please help with https://math.stackexchange.com/q/4396103/585488 – linker Mar 06 '22 at 18:10
6

The skew-symmetric tensor product of two vectors with components $A_i$ and $B_i$ is the tensor represented by the matrix with components $S_{ij}=A_iB_j - A_jB_i$. It is skew-symmetric (antisymmetric) because $S_{ij}=-S_{ji}$.

The advantage of this representation is that unlike the vector cross product, which is specific to three dimensions, the skew-symmetric product generalizes the concept to arbitrary dimensions.

Explicitly (in three dimensions),

$$S = (S_{ij}) = \begin{pmatrix}0&A_1B_2-A_2B_1&A_1B_3-A_3B_1\\A_2B_1-A_1B_2&0&A_2B_3-A_3B_2\\A_3B_1-A_1B_3&A_3B_2-A_2B_3&0\end{pmatrix}.$$
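
To see this concretely, a small numpy sketch (illustrative only) that builds $S$ from outer products and compares its independent entries with the components of the cross product:

```python
import numpy as np

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])

# S[i, j] = A[i]*B[j] - A[j]*B[i], valid in any dimension
S = np.outer(A, B) - np.outer(B, A)
print(np.allclose(S, -S.T))  # True: S is antisymmetric

# In three dimensions the entries above the diagonal are, up to sign,
# the components of the cross product: (A x B) = (S_23, S_31, S_12)
c = np.cross(A, B)
print(np.allclose(c, [S[1, 2], S[2, 0], S[0, 1]]))  # True
```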

  • This is what I don't get: $S_{ij}=A_iB_j-A_jB_i$. How can the vectors $A$ and $B$ have two indices, $i$ and $j$? – Abu Bakr Apr 23 '17 at 18:10
  • 1
    Let me give an example. Say, $A=[1,2,3]$, $B=[4,5,6]$. So, $A_1=1$, $A_2=2$, $A_3=3$, $B_1=4$, $B_2=5$, $B_3=6$. Then $S_{12}=A_1B_2-A_2B_1 = 5-8=-3$. $S_{23}=A_2B_3-A_3B_2=12-15=-3$. And so on. So the $i$ and $j$ indices just cycle through all possible values between 1 and $D$ (for $D$ dimensions) individually, and independently of each other. – Viktor Toth Apr 23 '17 at 18:16
  • Hi @ViktorToth could you please help out with https://math.stackexchange.com/q/4396103/585488 – linker Mar 06 '22 at 18:08
3

We can replace the vector product $ \mathbf{a} \times \mathbf{b}$ with a matrix-vector multiplication, because the skew-symmetric matrix corresponding to the first vector $\mathbf{a}$ is defined as

$S(\mathbf{a})=[\mathbf{a} \times \mathbf{i} \ \ \mathbf{a} \times \mathbf{j} \ \ \mathbf{a} \times \mathbf{k} ]$,
where $\mathbf{i},\mathbf{j},\mathbf{k}$ are the standard basis vectors which, taken as columns, form the identity matrix $ {I} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 &0 \\ 0 & 0 &1 \end{bmatrix}$.

This gives the formula presented above by caverac. (You can notice, for example, that each column is, as is easy to check, orthogonal both to $\mathbf{a}$ and to the corresponding standard basis vector $\mathbf{i},\mathbf{j},\mathbf{k}$; the lengths of the columns of $S(\mathbf{a})$ are also consistent with the properties of the cross product in this case.)

Interpreting the multiplication of a vector by a matrix as the sum of the matrix's column vectors multiplied by the components of the vector (scalars), we then have the formula

$S(\mathbf{a})\mathbf{b}= (\mathbf{a} \times \mathbf{i})b_x + (\mathbf{a} \times \mathbf{j})b_y + (\mathbf{a} \times \mathbf{k}) b_z =\mathbf{a} \times (b_x\mathbf{i} + b_y\mathbf{j} + b_z\mathbf{k} )=\mathbf{a} \times \mathbf{b} $,

where $b_x , b_y , b_z$ are the coordinates of the vector $\mathbf{b}$.
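
A short numpy sketch (illustrative only) of this columnwise construction and of the expansion above:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Columns of S(a) are a x i, a x j, a x k, with i, j, k the columns of I
I = np.eye(3)
S = np.column_stack([np.cross(a, I[:, k]) for k in range(3)])

print(S)
print(np.allclose(S @ b, np.cross(a, b)))  # True: S(a) b equals a x b
```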

Widawensen
  • 8,517
3

$\mathbf{a} \times \mathbf{b} = (a_{x}\mathbf{i} + a_{y}\mathbf{j} + a_{z}\mathbf{k}) \times \mathbf{b} $

Cross products are closely related to rotations: $\mathbf{a} \times \mathbf{b}$ is the instantaneous velocity of $\mathbf{b}$ as it rotates about the axis $\mathbf{a}$ (with angular speed $|\mathbf{a}|$).

But infinitesimal rotations can also be described by matrices. For example, an infinitesimal rotation about the $x$ axis is generated by the $\Omega_{x}$ matrix below. So now we just need to find the matrix $M_{a}$ corresponding to $\mathbf{a}$:

$\mathbf{a} \times \mathbf{b} = (a_{x}\mathbf{i} + a_{y}\mathbf{j} + a_{z}\mathbf{k}) \times \mathbf{b} = M_{a}\ \mathbf{b} $

I am not very clear here. But hopefully, someone can elaborate better.

For infinitesimal rotations about the coordinate axes we have

  • $\Omega_{x}$ matrix for rotation about the $x$ axis: $\begin{bmatrix}0&0&0\\ 0&0&-1\\ 0&1&0\end{bmatrix}$

  • $\Omega_{y}$ matrix for rotation about the $y$ axis: $\begin{bmatrix}0&0&1\\ 0&0&0\\ -1&0&0 \end{bmatrix}$

  • $\Omega_{z}$ matrix for rotation about the $z$ axis: $\begin{bmatrix}0&-1&0\\ 1&0&0\\ 0&0&0 \end{bmatrix}$

All these matrices are skew-symmetric, as one can see by inspection. One can combine them to form the matrix $M_{a}$. As it turns out, the generator of rotations about the vector $\mathbf{a}$ is given by the matrix $M_{a} = a_{x}\Omega_{x} + a_{y}\Omega_{y} + a_{z}\Omega_{z} =$ $\begin{bmatrix}0&-a_{z}&a_{y}\\ a_{z}&0&-a_{x}\\ -a_{y}&a_{x}&0 \end{bmatrix} $, which is also skew-symmetric by construction.

Thus, $\mathbf{a} \times \mathbf{b} = (a_{x}\mathbf{i} + a_{y}\mathbf{j} + a_{z}\mathbf{k}) \times \mathbf{b} = \begin{bmatrix}0&-a_{z}&a_{y}\\ a_{z}&0&-a_{x}\\ -a_{y}&a_{x}&0 \end{bmatrix} \begin{bmatrix}b_{x}\\b_{y}\\b_{z}\end{bmatrix}$

Probably this can also be done in 2D with the Pauli matrices.
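
To make the rotation connection concrete: for a unit axis $\hat a$, exponentiating $\theta M_{\hat a}$ gives the finite rotation by angle $\theta$ about $\hat a$; this is Rodrigues' formula, $e^{\theta M_{\hat a}} = I + \sin\theta\, M_{\hat a} + (1-\cos\theta)\, M_{\hat a}^2$. A minimal numpy sketch (illustrative only):

```python
import numpy as np

def skew(a):
    """Skew-symmetric matrix M_a with M_a @ b == np.cross(a, b)."""
    return np.array([[0.0,  -a[2],  a[1]],
                     [a[2],  0.0,  -a[0]],
                     [-a[1], a[0],  0.0]])

axis = np.array([0.0, 0.0, 1.0])  # unit vector along z
theta = np.pi / 2
K = skew(axis)

# Rodrigues' formula: exp(theta K) = I + sin(theta) K + (1 - cos(theta)) K^2
R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

print(np.round(R @ np.array([1.0, 0.0, 0.0]), 6))  # [0. 1. 0.]: x axis -> y axis
print(np.allclose(R.T @ R, np.eye(3)))             # True: R is orthogonal
```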

1

The crucial insight here is that the map $w \mapsto v \times w$ that represents the (left) cross product of a vector $w$ by a vector $v$ is linear, and since every linear map on a finite-dimensional vector space has a matrix representation (see here for a proof of this), the cross-product operator has a matrix representation. Additionally, because the cross-product operation is anti-commutative, it follows that the matrix representation of the cross product is skew-symmetric.

More specifically, consider the finite-dimensional vector space $(\mathbb R^3, \mathbb R, +, \cdot)$. Given the coordinate vector $v \in \mathbb R^3$, define the operator $T_v : \mathbb R^3 \to \mathbb R^3$ as $T_v(w) = v \times w$. That is, $T_v$ is the (left) cross-product operation with $v$.

We will now show that $T_v$ is a linear operator. First, fix any $\lambda \in \mathbb R$ and $w \in \mathbb R^3$. Then, $$ \begin{align} T_v(\lambda \cdot w) &= v \times (\lambda \cdot w) \\ &= \lambda \cdot (v \times w) \\ &= \lambda \cdot T_v(w) \end{align} $$ which follows from the fact that the cross product is compatible with scalar multiplication. Next, fix any $w \in \mathbb R^3$ and $y \in \mathbb R^3$. Then, $$ \begin{align} T_v(w + y) &= v \times (w + y) \\ &= (v \times w) + (v \times y) \\ &= T_v(w) + T_v(y) \end{align} $$ which follows from the fact that the cross product is distributive over vector addition. Hence, the operator $T_v$ is linear.

Because every linear map between finite-dimensional vector spaces has a matrix representation, we would now like to determine the matrix representation of $T_v$.

Let $(e_x, e_y, e_z)$ be the standard basis for $\mathbb R^3$. Recall that, for any vector $w \in \mathbb R^3$, $$ \begin{align} T_v(w) &= T_v(w_xe_x + w_ye_y + w_ze_z) \\ &= w_xT_v(e_x) + w_yT_v(e_y) + w_zT_v(e_z) \\ &= \begin{bmatrix}T_v(e_x) & T_v(e_y) & T_v(e_z)\end{bmatrix}\begin{bmatrix}w_x \\ w_y \\ w_z\end{bmatrix} \end{align} $$ Hence, the matrix representation of $T_v$ is $\begin{bmatrix}T_v(e_x) & T_v(e_y) & T_v(e_z)\end{bmatrix}$, where the first column of the matrix corresponds to the vector $T_v(e_x)$, the second column corresponds to $T_v(e_y)$, and the third column corresponds to $T_v(e_z)$.

We then compute each of these columns as follows: $$ \begin{align} T_v(e_x) &= v \times e_x \\ &= -(e_x \times v) \\ &= -(e_x \times [v_xe_x + v_ye_y + v_ze_z]) \\ &= 0e_x + v_ze_y - v_ye_z \\ &= \begin{bmatrix} 0 \\ v_z \\ -v_y\end{bmatrix} \\ T_v(e_y) &= v \times e_y \\ &= -v_ze_x + 0e_y + v_xe_z \\ &= \begin{bmatrix} -v_z \\ 0 \\ v_x\end{bmatrix} \\ T_v(e_z) &= v \times e_z \\ &= v_ye_x - v_xe_y + 0e_z \\ &= \begin{bmatrix} v_y \\ -v_x \\ 0\end{bmatrix} \end{align} $$ Hence, the matrix representation of $T_v$ is $$ \begin{bmatrix}T_v(e_x) & T_v(e_y) & T_v(e_z)\end{bmatrix} = \begin{bmatrix}0 & -v_z & v_y \\ v_z & 0 & -v_x \\ -v_y & v_x & 0\end{bmatrix} $$ Note that this matrix is skew-symmetric. This is because the cross-product operation is anti-commutative, such that $v \times w = -(w \times v)$ which implies that $(v \times w) + (w \times v) = 0$. This is similar to the definition of a skew-symmetric matrix $A$, such that $A + A^T = 0$.

To summarize, for every $w \in \mathbb R^3$, $$ \begin{align} T_v(w) &= v \times w \\ &= \begin{bmatrix}0 & -v_z & v_y \\ v_z & 0 & -v_x \\ -v_y & v_x & 0\end{bmatrix} \begin{bmatrix}w_x \\ w_y \\ w_z\end{bmatrix} \end{align} $$
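
A symbolic sanity check of this construction (illustrative only, using sympy; the symbol names are arbitrary):

```python
import sympy as sp

v_x, v_y, v_z = sp.symbols('v_x v_y v_z')
v = sp.Matrix([v_x, v_y, v_z])
e = sp.eye(3)

# Matrix of T_v: its columns are v x e_x, v x e_y, v x e_z
M = sp.Matrix.hstack(*[v.cross(e[:, i]) for i in range(3)])

sp.pprint(M)      # reproduces the skew-symmetric matrix derived above
print(M.T == -M)  # True: skew-symmetric
```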

Mahmoud
  • 1,528