I hope I understand the question correctly. You have stated that:
So my intuition is that the tensor product is some kind of abstract
space which captures the composition of the individual basis states.
As Ben Grossmann pointed out in the comments, this property is not unique to the tensor product.
In fact, the defining property of the tensor product is the universal property, which determines the tensor product up to natural isomorphism.
Of course, when we want to do specific calculations we have to choose one of the many concrete realizations of the tensor product, which answers the question:
Computations are carried out, if you consider appropriate isomorphisms?
Which realization is best suited depends on the context.
There is a special realization of the tensor product of finite-dimensional vector spaces when one of them is an inner product space:
Let $V,W$ be real vector spaces of dimension $n,m$.
Consider the map
\begin{align*}
J: V \otimes W &\longrightarrow L(V^\prime , W) \\
\sum_{i=1}^{N} v_i \otimes w_i &\longmapsto \big( \varphi \mapsto \sum_{i=1}^N \varphi(v_i) w_i \big)
\end{align*}
which can be shown to be an isomorphism, e.g. by checking that it maps a basis of $V \otimes W$ to a basis of $L(V^\prime, W)$ (both spaces have dimension $nm$). Here $V^\prime$ is the dual space of $V$ and $L(V^\prime,W)$ denotes the vector space of linear maps from $V^\prime$ to $W$.
Now if $V$ is an inner product space, then we have a natural isomorphism $V \to V^\prime$, $v \mapsto \langle \cdot , v \rangle$, and we can compose this natural isomorphism with $J$ to obtain a natural isomorphism
\begin{align*}
J_2: V \otimes W &\longrightarrow L(V , W) \\
\sum_{i=1}^{N} v_i \otimes w_i &\longmapsto \big( x \mapsto \sum_{i=1}^N \langle x, v_i \rangle w_i \big),
\end{align*}
where $\langle \cdot , \cdot \rangle $ is the inner product.
Now suppose $V = \mathbb{R}^n$ and $W = \mathbb{R}^m$ with the natural inner product, and fix $v = \sum_{i=1}^n \alpha_i e_i$ and $w = \sum_{j=1}^m \beta_j e_j$ (where $\alpha_i , \beta_j \in \mathbb{R}$ and the $e_i$ denote the natural basis vectors).
Then we have
$$J_2(v \otimes w)(e_k) = \langle v, e_k \rangle w = \sum_{j=1}^m \alpha_k \beta_j e_j.$$
And so the matrix $(M_{jk}) \in \mathbb{R}^{m \times n}$ of $J_2(v\otimes w)$ in terms of the natural basis is given by $M_{jk}= \alpha_k \beta_j$.
Now applying the tensor to a vector is just plain old matrix multiplication (perfect for writing computer programs).
Note also that $J_2(e_i \otimes e_j)$ is the matrix with a $1$ in row $j$, column $i$ and zeros elsewhere, i.e. the matrix unit $E_{ji}$ (entries $M_{lk} = \delta_{lj}\delta_{ki}$).
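To make this concrete, here is a minimal NumPy sketch (the variable names are mine, not from the question) checking that $J_2(v \otimes w)$ is represented by the outer-product matrix $M = w v^T$ and acts by ordinary matrix multiplication:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2
v = rng.standard_normal(n)  # coefficients alpha_i of v in R^n
w = rng.standard_normal(m)  # coefficients beta_j of w in R^m
x = rng.standard_normal(n)  # an arbitrary test vector

# Matrix of J_2(v ⊗ w): M[j, k] = alpha_k * beta_j, i.e. the outer product w v^T.
M = np.outer(w, v)

# Applying the tensor to x is plain matrix multiplication: M x = <x, v> w.
assert np.allclose(M @ x, np.dot(x, v) * w)
```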
Of course we could also "flatten" the matrix $(M_{jk})$ to a vector as in your example, but this is less natural, which answers the following question (at least if we want to view the tensor as a linear map):
According to this post, the outer product between two vectors stacks the resulting basis coefficients as a matrix and not as a vector. Which representation is more suited?
I will supply below a case where we do want to flatten the matrix associated to a tensor.
This realization of the tensor product is also related to the outer product acting on vectors via
$$\otimes_{\mathrm{out}}: W \times V \overset{\otimes}{\to} W \otimes V \overset{tr}{\to} V \otimes W \overset{J_2}{\to} L(V,W) \cong \mathbb{R}^{m \times n},$$
where $tr$ is the transpose (swap) map $w \otimes v \mapsto v \otimes w$ and "$\cong$" denotes the natural isomorphism.
This is exactly the definition of the outer product of two vectors.
Note that the (linearization of the) outer product also just sends a tensor $\sum_{i,j}\alpha_{ij} e_i \otimes e_j$ to its coefficient matrix $(\alpha_{ij})$, which is perhaps the most obvious way to associate a matrix to a tensor.
The Kronecker product
One case where we do want to flatten the matrix associated to a tensor is the following case:
Let $V,W,X,Y$ be vector spaces and $T : V \to X$, $S: W\to Y$ linear maps. Then there exists a unique linear map $T \otimes S : V \otimes W \to X \otimes Y$ that satisfies $(T\otimes S)( v \otimes w) = (Tv) \otimes (Sw)$ for all $v \in V, w \in W$. Note that $T \otimes S$ is just notation here and (for the moment) not directly related to the actual tensor product map.
Now let $V= \mathbb{R}^n$, $W = \mathbb{R}^q$, $X = \mathbb{R}^m$, $Y = \mathbb{R}^p$, and let $(T_{ij})$, $(S_{ij})$ be the matrices of $T$ and $S$ with respect to the natural bases.
Our goal is to find a matrix multiplication that is equivalent to applying $T\otimes S$ to a tensor.
We know that the tensors of the form $e_i \otimes e_j$ are a basis of both $V \otimes W$ and $X \otimes Y$ (with $i,j$ ranging appropriately and $e_j$ the $j$-th natural basis vector as before).
We can compute
\begin{equation}
\tag{1}
T \otimes S (e_i \otimes e_j) = (T e_i ) \otimes (S e_j) =
\sum_{k} \sum_{l}T_{ki} S_{lj} e_k \otimes e_l,
\end{equation}
which uniquely determines a matrix $(T_{ij}) \otimes_{\mathrm{Kron}} (S_{ij}) \in \mathbb{R}^{pm \times qn}$ such that the following diagram commutes, where $T \otimes_{\mathrm{Kron}} S$ denotes multiplication by the matrix $(T_{ij}) \otimes_{\mathrm{Kron}} (S_{ij})$ and $\mathrm{vec}$ is the "flatten" map that stacks the columns of a matrix into one long vector:
$$\require{AMScd}
\begin{CD}
V \otimes W @>{T \otimes S}>> X \otimes Y\\
@VV \cong V @VV \cong V \\
\mathbb{R}^{q\times n} @. \mathbb{R}^{p\times m}\\
@VV \mathrm{vec} V @VV \mathrm{vec} V \\
\mathbb{R}^{q n} @>{T \otimes_{\mathrm{Kron}} S}>> \mathbb{R}^{p m}\\
\end{CD}$$
It follows from the definition and equation (1) that $(T_{ij}) \otimes_{\mathrm{Kron}} (S_{ij})$ is in fact the Kronecker product of $(T_{ij})$ and $(S_{ij})$.
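As a sanity check, the commuting diagram can be verified numerically. On the matrix side, $T \otimes S$ acts as $M \mapsto S M T^T$ (a simple tensor $v \otimes w$ has matrix $w v^T$, and $(Tv) \otimes (Sw)$ has matrix $S(wv^T)T^T$), and with the column-stacking $\mathrm{vec}$ this matches multiplication by the Kronecker product. A minimal NumPy sketch (all names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
n, q, m, p = 3, 4, 2, 5
T = rng.standard_normal((m, n))  # matrix of T : R^n -> R^m
S = rng.standard_normal((p, q))  # matrix of S : R^q -> R^p
M = rng.standard_normal((q, n))  # coefficient matrix of some tensor in V ⊗ W

# Across the top, then down: apply T ⊗ S on the matrix side (M -> S M T^T),
# then flatten column-major (vec).
lhs = (S @ M @ T.T).flatten(order="F")

# Down the left side, then across the bottom: flatten first, then multiply
# by the Kronecker product of the two matrices.
rhs = np.kron(T, S) @ M.flatten(order="F")

assert np.allclose(lhs, rhs)
```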
Edit to respond to comments
It can be shown that for any tensor $t \in V \otimes W$ there exist $N \in \mathbb{N}$ and $v_1, \dots , v_N \in V$ as well as $w_1, \dots , w_N \in W$ so that $t = \sum_{i=1}^{N} v_i \otimes w_i$.
This follows from the uniqueness assertion in the universal property and the fact that $\otimes$ is bilinear. Note that here $N$ is some natural number and in general $N \neq \dim V \dim W$.
Furthermore, the $v_i$ and $w_i$ do not in general form a basis of $V$ resp. $W$.
Of course, given $t \in V \otimes W$ there do not in general exist $v \in V, w \in W$ so that $t = v\otimes w$; for example, $e_1 \otimes e_1 + e_2 \otimes e_2$ is not of this form (see the rank argument sketched below). So in general the sum over different simple tensors is necessary.
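Under $J_2$ a simple tensor $v \otimes w$ corresponds to the rank-one (or zero) matrix $w v^T$, so a tensor whose coefficient matrix has rank $\geq 2$ cannot be simple. A minimal NumPy sketch:

```python
import numpy as np

# Coefficient matrix of e_1 ⊗ e_1 + e_2 ⊗ e_2 in R^2 ⊗ R^2: the identity matrix.
M = np.eye(2)

# Simple tensors v ⊗ w correspond to matrices np.outer(w, v) of rank <= 1,
# so rank 2 shows this tensor is not of the form v ⊗ w.
print(np.linalg.matrix_rank(M))  # prints 2
```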
Since any tensor can be written as such a sum, it makes sense to define a map via this representation (as was done for $J$). Note, however, that it is not obvious that a map defined in this way is a well-defined function.
To show that $J$ is well defined we actually use the universal property:
Define
\begin{align}
h: V \times W &\longrightarrow L(V^\prime, W )\\
(v,w) &\longmapsto ( \varphi \mapsto \varphi(v) w ).
\end{align}
Then $h$ is bilinear, and so the universal property gives us a unique linear map $\tilde{h}: V \otimes W \to L(V^\prime, W)$ with $h = \tilde{h} \circ \otimes$.
Then, as you already discovered, $\tilde{h}=J$, which shows that $J$ is well defined (but not that it is injective or surjective).
Now to answer the question about the definition of $J_2$: let $g: V \to V^\prime$, $g(v) = \langle \cdot , v \rangle$, be the natural isomorphism from above. Then we can define an isomorphism $F$ by
\begin{align}
F: L(V^\prime , W) &\longrightarrow L(V,W) \\
T &\longmapsto T \circ g.
\end{align}
And then we define $J_2 = F \circ J$.