Here is another possible answer to this older post. Suppose $(W,\langle\cdot,\cdot\rangle)$ and $(V,(\cdot,\cdot))$ are inner product spaces over the field $\mathbb{F}$, where $\mathbb{F}$ is either $\mathbb{C}$ or $\mathbb{R}$ (for example, real Euclidean spaces with the usual dot product).
For $w\in W$ and $v\in V$ define the Kronecker product
$w\square v$ as the element of the linear space $L(V,W)$ defined by
$$(w\square v)(u):=(u,v)w$$
The operator $w\square v$ has rank $1$ if $w\neq0$ and $v\neq0$, and rank $0$ otherwise.
Remark: In many textbooks $w\square v$ is denoted as $v\otimes w$ (notice the reverse order of appearance).
It is easy to check that for any $\alpha\in \mathbb{F}$
- $(\alpha w)\square v=\alpha(w\square v)$
- $w\square (\alpha v)=\overline{\alpha}(w\square v)$
- $(w_1+w_2)\square v=(w_1\square v) +(w_2\square v)$
- $w\square(v_1+v_2)=(w\square v_1)+(w\square v_2)$
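These identities are also easy to verify numerically. Here is a minimal NumPy sketch (an illustration only, assuming $V=\mathbb{C}^n$ and $W=\mathbb{C}^m$ with their standard inner products), where $w\square v$ is realized as the outer product $w\,\overline{v}^{\,\intercal}$ and the helper name `box` is purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 4                                   # dim W = m, dim V = n
w = rng.standard_normal(m) + 1j * rng.standard_normal(m)
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
u = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# matrix of w □ v: the outer product w * conj(v)^T, so that
# (w □ v)(u) = (u, v) w   with   (u, v) = sum_j u_j * conj(v_j)
box = np.outer(w, v.conj())

assert np.allclose(box @ u, np.vdot(v, u) * w)   # defining identity
assert np.linalg.matrix_rank(box) == 1           # rank one (w, v both nonzero)

# conjugate-linearity in the second slot: w □ (a v) = conj(a) (w □ v)
a = 2.0 - 1.5j
assert np.allclose(np.outer(w, (a * v).conj()), np.conj(a) * box)
```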
For finite dimensional inner product spaces, the Kronecker product allows us to define an inner product structure on the space of linear operators $L(V,W)$. Suppose $\mathcal{V}=(v_1,\ldots, v_n)$ and $\mathcal{W}=(w_1,\ldots,w_m)$ are orthonormal bases for $V$ and $W$ respectively. Any $A\in L(V,W)$ admits a unique matrix representation (with respect to the bases $\mathcal{V}$ and $\mathcal{W}$) given by
$$a_{ij}=\langle Av_j, w_i\rangle,\qquad 1\leq i\leq m,\, 1\leq j\leq n$$
It is easy to check that
$$A=\sum_{i,j} a_{ij} (w_i\square v_j)$$
Indeed, if $\tilde{A}:=\sum_{i,j} a_{ij} (w_i\square v_j)$, then
$$\tilde{A}v_k=\sum_{i,j} a_{ij}(w_i\square v_j)v_k=\sum_{i,j} a_{ij}\delta_{jk}w_i=\sum_i a_{ik}w_i=Av_k$$
therefore $\tilde{A}=A$.
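As a quick numerical check of this expansion (a sketch only, with randomly generated orthonormal bases of $\mathbb{C}^n$ and $\mathbb{C}^m$):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 4
# columns of QV, QW form orthonormal bases of C^n and C^m
QV, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
QW, _ = np.linalg.qr(rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m)))
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))   # operator V -> W

# a_ij = <A v_j, w_i>  (standard inner product <x, y> = conj(y)^T x)
a = np.array([[QW[:, i].conj() @ A @ QV[:, j] for j in range(n)] for i in range(m)])

# A_tilde = sum_ij a_ij (w_i □ v_j), with w_i □ v_j = w_i conj(v_j)^T
A_tilde = sum(a[i, j] * np.outer(QW[:, i], QV[:, j].conj())
              for i in range(m) for j in range(n))
assert np.allclose(A_tilde, A)
```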
Define on $L(V,W)$ the following inner product
$$\langle A| B\rangle=\sum_{i,j} a_{ij}\overline{b_{ij}}$$
If $M_A$ and $M_B$ denote the matrices of $A$ and $B$ in the bases $\mathcal{V}/\mathcal{W}$, it is easy to check that
$$\langle A| B\rangle=\operatorname{trace}(M_A M^*_B)$$
where $M^*_B$ is the transpose complex conjugate matrix of $M_B$, i.e., if $M_B=(b_{ij})$, then $M^*_B=(\overline{b_{ji}})$.
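A one-line numerical confirmation of the trace formula (a sketch, for random complex matrices):

```python
import numpy as np

rng = np.random.default_rng(2)
MA = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))
MB = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))

# <A|B> = sum_ij a_ij * conj(b_ij)  equals  trace(M_A M_B^*)
assert np.allclose(np.sum(MA * MB.conj()), np.trace(MA @ MB.conj().T))
```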
This is indeed an inner product. Furthermore, one can check that $(w_i\square v_j:1\leq i\leq m,\ 1\leq j\leq n)$ is an orthonormal basis for $\big(L(V,W),\langle\cdot|\cdot\rangle\big)$ and that
$$a_{ij}=\langle Av_j,w_i\rangle=\langle A|w_i\square v_j\rangle$$
Indeed, from
$$\langle(w_i\square v_j)(v_k),w_\ell\rangle=\delta_{kj}\delta_{i\ell},$$
it follows that
$$\langle w_i\square v_j| w_\ell\square v_k\rangle =\delta_{(i,j),(\ell,k)}$$
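Numerically, the orthonormality of the family $(w_i\square v_j)$ can be checked via the trace formula (a sketch with random real orthonormal bases; the helpers `box` and `frob` are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 3, 4
W, _ = np.linalg.qr(rng.standard_normal((m, m)))   # columns: orthonormal w_1..w_m
V, _ = np.linalg.qr(rng.standard_normal((n, n)))   # columns: orthonormal v_1..v_n

box = lambda w, v: np.outer(w, v)                  # matrix of w □ v (real case)
frob = lambda X, Y: np.trace(X @ Y.T)              # <X|Y> = trace(X Y^*)

# Gram matrix of the mn operators w_i □ v_j is the identity
G = np.array([[frob(box(W[:, i], V[:, j]), box(W[:, p], V[:, q]))
               for p in range(m) for q in range(n)]
              for i in range(m) for j in range(n)])
assert np.allclose(G, np.eye(m * n))
```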
Observe that the inner product $\langle\cdot|\cdot\rangle$ on $L(V,W)$ depends in principle on the choice of orthonormal bases $\mathcal{V}$ and $\mathcal{W}$. This dependence, however, is only illusory: any other choice of orthonormal bases $\mathcal{V}'$ and $\mathcal{W}'$ yields the same inner product. To see this, let $Q$ and $P$ be the matrices on $\mathbb{F}^n$ and $\mathbb{F}^m$ that change bases $\mathcal{V}'$ to $\mathcal{V}$ and $\mathcal{W}'$ to $\mathcal{W}$, that is,
if $v=\alpha_1v_1+\ldots +\alpha_nv_n=\beta_1v'_1+\ldots+\beta_nv'_n$ and
$w=\delta_1w_1+\ldots +\delta_mw_m=\epsilon_1w'_1+\ldots+\epsilon_mw'_m$, then
\begin{align}
Q(v_{\mathcal{V}'})&=Q[\beta_1,\ldots,\beta_n]^\intercal=[\alpha_1,\ldots,\alpha_n]^\intercal=v_\mathcal{V}\\
P(w_{\mathcal{W}'})&=P[\epsilon_1,\ldots,\epsilon_m]^\intercal =[\delta_1,\ldots,\delta_m]^\intercal=w_{\mathcal{W}}
\end{align}
Since $\mathcal{V},\mathcal{V}'$ and $\mathcal{W},\mathcal{W}'$ are orthonormal bases of $V$ and $W$ respectively, the matrices $Q$ and $P$ are unitary (orthogonal in the real case), that is, $P^*P=PP^*=I_m$ and $Q^*Q=QQ^*=I_n$. Let $M_A$ and $M'_A$ be the matrix representations of $A$ in the bases $\mathcal{V}/\mathcal{W}$ and $\mathcal{V}'/\mathcal{W}'$ respectively. Then
\begin{align}
\langle Av'_j,w'_i\rangle&=(w'_i)^*_{\mathcal{W}}M_A(v'_j)_{\mathcal{V}}=\big(P\big(w'_i\big)_{\mathcal{W}'}\big)^* M_A\big(Q\big(v'_j\big)_{\mathcal{V}'}\big)\\
&=\big(Pu_i\big)^*M_A\big(Qe_j\big)=u^*_iP^*M_AQe_j
\end{align}
where $u_i$, $1\leq i\leq m$, and $e_j$, $1\leq j\leq n$, are the standard unit vectors in $\mathbb{F}^m$ and $\mathbb{F}^n$ respectively. Consequently,
$$M'_A=P^*M_AQ$$
Putting things together yields
$$\operatorname{trace}\big(M'_A(M'_B)^*\big)=\operatorname{trace}\big(P^*M_AQ\,(P^*M_BQ)^*\big)=\operatorname{trace}\big(P^*M_AQQ^*M^*_BP\big)=\operatorname{trace}\big(P^*M_AM^*_BP\big)=\operatorname{trace}\big(M_AM^*_B\big)$$
where the last equality uses the cyclic property of the trace together with $PP^*=I_m$.
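This invariance is easy to confirm numerically (a sketch with random unitary change-of-basis matrices $P$ and $Q$):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 3, 4
MA = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
MB = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
P, _ = np.linalg.qr(rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m)))
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

MA_p = P.conj().T @ MA @ Q      # M'_A = P^* M_A Q
MB_p = P.conj().T @ MB @ Q      # M'_B = P^* M_B Q

# the trace inner product is unchanged under the unitary change of bases
assert np.allclose(np.trace(MA_p @ MB_p.conj().T), np.trace(MA @ MB.conj().T))
```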
Now we make a connection to the Kronecker product of linear operators (tensor operator). Let $V_1$, $V_2$, $W_1$ and $W_2$ be inner product spaces over $\mathbb{F}$. Given $A\in L(V_1,V_2)$ and $B\in L(W_1,W_2)$, the Kronecker product $A\otimes B$ is an operator in $L\big(L(V_1,W_1),L(V_2,W_2)\big)$ defined as
$$(A\otimes B)(X):=B\circ X\circ A^*$$
where $A^*\in L(V_2,V_1)$ is the adjoint operator related to $A$, that is
$$\langle Ax,y\rangle_{V_2}=\langle x, A^*y\rangle_{V_1},\qquad x\in V_1,\, y\in V_2$$
Suppose $w\in W_1$ and $v\in V_1$. Then, for any $y\in V_2$
\begin{align}
(A\otimes B)(w\square v)(y)&=B\big((w\square v)(A^*y)\big)=B\big(\langle A^*y,v\rangle_{V_1}w\big)\\
&=\langle A^*y,v\rangle_{V_1}Bw= \langle y,Av\rangle_{V_2}Bw\\
&=\big((Bw)\square (Av)\big)(y)
\end{align}
where the second-to-last equality is the adjoint identity after taking complex conjugates: $\langle A^*y,x\rangle_{V_1}=\langle y,Ax\rangle_{V_2}$.
Hence
$$(A\otimes B)(v\otimes w)= (A\otimes B)(w\square v)=(Bw)\square(Av)=(Av)\otimes(Bw)$$
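In matrix form (standard bases, so that $w\square v$ becomes the outer product $w\,\overline v^{\,\intercal}$ and $A^*$ the conjugate transpose), this identity reads $B\,(w\,\overline v^{\,\intercal})\,A^*=(Bw)\,\overline{(Av)}^{\,\intercal}$; a minimal NumPy sketch (illustrative dimensions and names only):

```python
import numpy as np

rng = np.random.default_rng(5)
m, k, n, ell = 4, 3, 5, 2       # dim V1, V2, W1, W2
A = rng.standard_normal((k, m)) + 1j * rng.standard_normal((k, m))      # A: V1 -> V2
B = rng.standard_normal((ell, n)) + 1j * rng.standard_normal((ell, n))  # B: W1 -> W2
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)                # w in W1
v = rng.standard_normal(m) + 1j * rng.standard_normal(m)                # v in V1

box = lambda x, y: np.outer(x, y.conj())        # matrix of x □ y

# (A ⊗ B)(w □ v) = B (w □ v) A^*  equals  (Bw) □ (Av)
assert np.allclose(B @ box(w, v) @ A.conj().T, box(B @ w, A @ v))
```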
The case where the $V_j$, $W_j$ ($j=1,2$) are Euclidean spaces with the standard dot product is of particular interest in applications. For simplicity, the rest of this posting is devoted to real Euclidean spaces. For any $n$, $\mathbb{R}^n$ is equipped with the standard inner product
$\mathbf{w}\cdot\mathbf{z}=\sum^n_{j=1}w_jz_j$
where $\mathbf{w}=[w_1,\ldots,w_n]^\intercal$ and $\mathbf{z}=[z_1,\ldots,z_n]^\intercal$.
Using the standard orthonormal bases, the space $L(\mathbb{R}^n,\mathbb{R}^m)$ can be identified with $\operatorname{Mat}_{\mathbb{R}}(m, n)$. Suppose $B\in \operatorname{Mat}_\mathbb{R}(\ell,n)$ and $A\in \operatorname{Mat}_\mathbb{R}(k,m)$. Then $A\otimes B$ maps $\operatorname{Mat}_\mathbb{R}(n,m)$ into $\operatorname{Mat}_\mathbb{R}(\ell, k)$:
$$(A\otimes B)(X)=BXA^\intercal$$
Define $\nu_{n,m}:\operatorname{Mat}_\mathbb{R}(n,m)\rightarrow\mathbb{R}^{nm}$ as the map that stacks the columns of a matrix:
$$\begin{pmatrix}
x_{11} &\cdots & x_{1m}\\
\vdots & \ddots & \vdots\\
x_{n1} & \cdots & x_{nm}
\end{pmatrix} \mapsto \begin{pmatrix}
x_{11}\\
\vdots\\
x_{n1}\\
x_{12}\\
\vdots\\
x_{n2}\\
\vdots\\
x_{1m}\\
\vdots\\
x_{nm}
\end{pmatrix}
$$
Clearly $\nu_{n,m}$ is a linear bijection (in fact, it defines an isometric isomorphism between the inner product spaces $\big(\operatorname{Mat}_{\mathbb{R}}(n, m),\langle\;|\;\rangle\big)$ and $(\mathbb{R}^{nm},\cdot)$).
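A sketch of this identification in NumPy, where column-stacking is `flatten(order="F")` and the isometry property $\langle X|Y\rangle=\nu_{n,m}(X)\cdot\nu_{n,m}(Y)$ is checked on random real matrices (the helper name `vec` is just for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
n, m = 4, 3
X = rng.standard_normal((n, m))
Y = rng.standard_normal((n, m))

vec = lambda M: M.flatten(order="F")    # nu_{n,m}: stack the columns of M

# <X|Y> = trace(X Y^T)  equals  vec(X) . vec(Y)
assert np.allclose(np.trace(X @ Y.T), vec(X) @ vec(Y))
```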
Let $(w_i:1\leq i\leq n)$ and $(v_j:1\leq j\leq m)$ be the standard orthonormal bases of $\mathbb{R}^n$ and $\mathbb{R}^m$ respectively.
To obtain a matrix representation of $A\otimes B$ in the basis $(w_i\square v_j)$, notice that there is a unique matrix $M_{A\otimes B}\in\operatorname{Mat}_{\mathbb{R}}(k\ell, nm)$ such that
$$(A\otimes B)(X)=\nu^{-1}_{\ell, k}\big(M_{A\otimes B}\,\nu_{n,m}(X)\big)$$
Recall that $(A\otimes B)(w_i\square v_j)=(Bw_i)\square (Av_j)$. In the standard bases, the matrix representation of $(Bw_i)\square (Av_j)$ is given by
$$M_{Bw_i\square Av_j}=\left(\begin{array}{c|c|c}
a_{1j}\begin{pmatrix}b_{1i}\\ \vdots\\ b_{\ell i}\end{pmatrix}& \cdots &
a_{kj}\begin{pmatrix}b_{1i}\\ \vdots\\ b_{\ell i}\end{pmatrix}
\end{array}\right)
$$
Hence
$$M_{A\otimes B}\,\nu_{n,m}(w_i\square v_j)=\nu_{\ell,k}\big(M_{Bw_i\square Av_j}\big)=\begin{pmatrix}
a_{1j}\begin{pmatrix}b_{1i}\\ \vdots\\ b_{\ell i}\end{pmatrix}\\
\vdots\\
a_{kj}\begin{pmatrix}b_{1i}\\ \vdots\\ b_{\ell i}\end{pmatrix}
\end{pmatrix}$$
It follows that in the standard bases, $A\otimes B$ has the matrix representation given by the block matrix
$$M_{A\otimes B}=\begin{pmatrix} a_{11}B &\ldots &a_{1m} B\\
\vdots & \ddots &\vdots\\
a_{k1}B &\ldots &a_{km} B
\end{pmatrix}$$
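This block matrix is exactly what `numpy.kron(A, B)` produces, so the identification can be summarized by the relation $\nu_{\ell,k}\big(BXA^\intercal\big)=M_{A\otimes B}\,\nu_{n,m}(X)$; a minimal numerical sketch:

```python
import numpy as np

rng = np.random.default_rng(8)
k, m, ell, n = 3, 4, 2, 5
A = rng.standard_normal((k, m))
B = rng.standard_normal((ell, n))
X = rng.standard_normal((n, m))

vec = lambda M: M.flatten(order="F")    # column-stacking map nu

# np.kron(A, B) is the block matrix (a_pq B) of size (k*ell) x (m*n)
assert np.kron(A, B).shape == (k * ell, m * n)

# vec(B X A^T) = (A kron B) vec(X)
assert np.allclose(vec(B @ X @ A.T), np.kron(A, B) @ vec(X))
```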
Let $T_{nm}:\operatorname{Mat}_{\mathbb{R}}(n,m)\rightarrow\operatorname{Mat}_{\mathbb{R}}(m,n)$ denote the transpose operator (acting on $\operatorname{Mat}_{\mathbb{R}}(n,m)$).