The crucial insight here is that the map $w \mapsto v \times w$, which takes the (left) cross product of a fixed vector $v$ with a vector $w$, is linear. Since every linear map on a finite-dimensional vector space has a matrix representation (see here for a proof), the cross-product operator has a matrix representation. Additionally, because the cross-product operation is anti-commutative, its matrix representation is skew-symmetric.
More specifically, consider the finite-dimensional vector space $(\mathbb R^3, \mathbb R, +, \cdot)$. Given a vector $v \in \mathbb R^3$, define the operator $T_v : \mathbb R^3 \to \mathbb R^3$ by $T_v(w) = v \times w$. That is, $T_v$ is the (left) cross product with $v$.
We will now show that $T_v$ is a linear operator. First, fix any $\lambda \in \mathbb R$ and $w \in \mathbb R^3$. Then,
$$
\begin{align}
T_v(\lambda \cdot w) &= v \times (\lambda \cdot w) \\
&= \lambda \cdot (v \times w) \\
&= \lambda \cdot T_v(w)
\end{align}
$$
which follows from the compatibility of the cross product with scalar multiplication. Next, fix any $w \in \mathbb R^3$ and $y \in \mathbb R^3$. Then,
$$
\begin{align}
T_v(w + y) &= v \times (w + y) \\
&= (v \times w) + (v \times y) \\
&= T_v(w) + T_v(y)
\end{align}
$$
which follows from the fact that the cross product distributes over vector addition. Hence, the operator $T_v$ is linear.
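As a quick sanity check, the following minimal numpy sketch verifies both properties numerically; the specific values of `v`, `w`, `y`, and `lam` are assumptions chosen only for illustration:

```python
import numpy as np

# Arbitrary sample inputs, chosen only for illustration.
v = np.array([1.0, -2.0, 3.0])
w = np.array([0.5, 4.0, -1.0])
y = np.array([2.0, 0.0, 1.5])
lam = -2.5

def T_v(u):
    """The (left) cross-product operator T_v(u) = v x u."""
    return np.cross(v, u)

# Homogeneity: T_v(lam * w) == lam * T_v(w)
assert np.allclose(T_v(lam * w), lam * T_v(w))
# Additivity: T_v(w + y) == T_v(w) + T_v(y)
assert np.allclose(T_v(w + y), T_v(w) + T_v(y))
```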
Since every linear map between finite-dimensional vector spaces has a matrix representation, we now determine the matrix representation of $T_v$.
Let $(e_x, e_y, e_z)$ be the standard basis for $\mathbb R^3$. Recall that, for any vector $w \in \mathbb R^3$,
$$
\begin{align}
T_v(w) &= T_v(w_xe_x + w_ye_y + w_ze_z) \\
&= w_xT_v(e_x) + w_yT_v(e_y) + w_zT_v(e_z) \\
&= \begin{bmatrix}T_v(e_x) & T_v(e_y) & T_v(e_z)\end{bmatrix}\begin{bmatrix}w_x \\ w_y \\ w_z\end{bmatrix}
\end{align}
$$
Hence, the matrix representation of $T_v$ is $\begin{bmatrix}T_v(e_x) & T_v(e_y) & T_v(e_z)\end{bmatrix}$, whose first, second, and third columns are the vectors $T_v(e_x)$, $T_v(e_y)$, and $T_v(e_z)$, respectively.
Recalling the identities $e_x \times e_y = e_z$, $e_y \times e_z = e_x$, and $e_z \times e_x = e_y$, together with the fact that the cross product of any vector with itself is zero, we compute each of these columns as follows:
$$
\begin{align}
T_v(e_x) &= v \times e_x \\
&= -(e_x \times v) \\
&= -(e_x \times [v_xe_x + v_ye_y + v_ze_z]) \\
&= -(v_ye_z - v_ze_y) \\
&= 0e_x + v_ze_y - v_ye_z \\
&= \begin{bmatrix} 0 \\ v_z \\ -v_y\end{bmatrix} \\
T_v(e_y) &= v \times e_y \\
&= -v_ze_x + 0e_y + v_xe_z \\
&= \begin{bmatrix} -v_z \\ 0 \\ v_x\end{bmatrix} \\
T_v(e_z) &= v \times e_z \\
&= v_ye_x - v_xe_y + 0e_z \\
&= \begin{bmatrix} v_y \\ -v_x \\ 0\end{bmatrix}
\end{align}
$$
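Equivalently, these columns can be read off programmatically by applying the operator to each standard basis vector; here is a minimal numpy sketch, with an arbitrary sample $v$ assumed only for illustration:

```python
import numpy as np

v = np.array([1.0, -2.0, 3.0])  # arbitrary sample vector

# The columns of the matrix are v x e_x, v x e_y, v x e_z.
e_x, e_y, e_z = np.eye(3)
M = np.column_stack([np.cross(v, e_x), np.cross(v, e_y), np.cross(v, e_z)])
print(M)
# [[ 0. -3. -2.]
#  [ 3.  0. -1.]
#  [ 2.  1.  0.]]
```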
Hence, the matrix representation of $T_v$ is
$$
\begin{bmatrix}T_v(e_x) & T_v(e_y) & T_v(e_z)\end{bmatrix} = \begin{bmatrix}0 & -v_z & v_y \\ v_z & 0 & -v_x \\ -v_y & v_x & 0\end{bmatrix}
$$
Note that this matrix is skew-symmetric: calling it $A$, we have $A + A^T = 0$. This reflects the anti-commutativity of the cross product: $v \times w = -(w \times v)$, or equivalently $(v \times w) + (w \times v) = 0$, which is the vector analogue of the defining identity $A + A^T = 0$ of a skew-symmetric matrix.
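Continuing the numerical sketch, the skew-symmetry can be confirmed directly (again with an arbitrary sample $v$):

```python
import numpy as np

v = np.array([1.0, -2.0, 3.0])  # arbitrary sample vector

# The matrix representation of T_v derived above.
M = np.array([[0.0,  -v[2],  v[1]],
              [v[2],   0.0, -v[0]],
              [-v[1],  v[0],  0.0]])

# Skew-symmetry: M + M^T is the zero matrix.
assert np.allclose(M + M.T, np.zeros((3, 3)))
```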
To summarize, for every $w \in \mathbb R^3$,
$$
\begin{align}
T_v(w) &= v \times w \\
&= \begin{bmatrix}0 & -v_z & v_y \\ v_z & 0 & -v_x \\ -v_y & v_x & 0\end{bmatrix} \begin{bmatrix}w_x \\ w_y \\ w_z\end{bmatrix}
\end{align}
$$
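Putting it all together, the sketch below wraps the matrix in a small helper (`cross_matrix` is a hypothetical name introduced here, not a numpy function) and checks that multiplying by it reproduces `np.cross`:

```python
import numpy as np

def cross_matrix(v):
    """Skew-symmetric matrix of v, so that cross_matrix(v) @ w == np.cross(v, w).

    Note: cross_matrix is a hypothetical helper name, not a numpy function.
    """
    return np.array([[0.0,  -v[2],  v[1]],
                     [v[2],   0.0, -v[0]],
                     [-v[1],  v[0],  0.0]])

# Arbitrary sample vectors, chosen only for illustration.
v = np.array([1.0, -2.0, 3.0])
w = np.array([0.5, 4.0, -1.0])

assert np.allclose(cross_matrix(v) @ w, np.cross(v, w))
```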