Well, first, as Lie groups, it does not in full generality really make sense to take tensor products. Given two Lie groups, we take their Cartesian product (also known as the direct product), which is again a Lie group. If $G$ and $H$ are Lie groups, $G\times H$ is a smooth manifold with the product manifold structure, and a group with the product group structure $(g,h)\cdot(g',h')=(gg',hh')$, which is compatible with the product manifold structure, so it is indeed a Lie group.
Now, let $V$ be a vector space. Then a $\textbf{representation}$ of a Lie group $G$ on $V$ is a smooth homomorphism $\rho:G\rightarrow GL(V)$. OK, let's unpack that jargon for a second. The object $GL(V)$ is just the general linear group of $V$, that is, the group of invertible linear maps $V\rightarrow V$. If $V$ is a finite dimensional vector space, then after choosing a basis $\{e_i\}$, this just becomes the general linear group $GL(\mathbb R^n)$ or $GL(\mathbb C^n)$, depending on whether $V$ is a real or complex vector space. These are precisely the groups of invertible $n\times n$ matrices over $\mathbb R$ and $\mathbb C$ respectively. For $\rho$ to be smooth just means that the map is differentiable. For $\rho$ to be a homomorphism means that $\rho(g\cdot g')=\rho(g)\cdot \rho (g')$, so $\rho$ respects the group structures of $G$ and $GL(V)$.
As an example, the matrix Lie groups such as $GL(\mathbb R^n)$, $O(n)$, $SO(t,s)$, and $U(n)$ all have obvious representations on the vector spaces $\mathbb R^n$ and $\mathbb C^n$ (where $s+t=n$), because they are precisely defined as subgroups of the invertible $n\times n$ matrices over $\mathbb R$ or $\mathbb C$.
When we have a representation of $G$ on $V$, this means that we can take an element $g\in G$ and an element $v\in V$, and get a new element $\rho(g)\cdot v\in V$, which we often just denote by $g\cdot v$.
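For concreteness, here is the simplest instance of this (my own example, but completely standard): the defining representation of $SO(2)$ on $\mathbb R^2$ is just the inclusion $SO(2)\hookrightarrow GL(\mathbb R^2)$,
$$\rho(\theta)=\begin{pmatrix}\cos\theta&-\sin\theta\\ \sin\theta&\cos\theta\end{pmatrix},\qquad \rho(\theta)\cdot e_1=\cos\theta\, e_1+\sin\theta\, e_2,$$
so here $g\cdot v$ is literally matrix-vector multiplication.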
For finite products, the direct product and the direct sum are the same, but the tensor product is wholly different. Without getting into universal properties and whatnot (though I suggest you look into that, as it makes tensor products, direct sums, and direct products make way more sense), $V\times W$, or $V\oplus W$, is essentially just pairs $(v,w)$ for $v\in V$ and $w\in W$, where vector addition is defined by $(v,w)+(v',w')=(v+v',w+w')$. A tensor product, however, is only $\textit{generated}$ by such pairs. That means we have the pairs $v\otimes w$ and $v'\otimes w'$, but, without knowing more about what $v,v',w$ and $w'$ are, there is no general simplification of the expression $v\otimes w+v'\otimes w'$.
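To convince yourself that such sums really do not simplify, try it in the smallest case (a standard check, spelled out here): in $\mathbb R^2\otimes\mathbb R^2$, suppose we could write
$$e_1\otimes f_1+e_2\otimes f_2=(ae_1+be_2)\otimes(cf_1+df_2)=ac\, e_1\otimes f_1+ad\, e_1\otimes f_2+bc\, e_2\otimes f_1+bd\, e_2\otimes f_2.$$
Matching coefficients forces $ac=bd=1$ and $ad=bc=0$, which is impossible, so $e_1\otimes f_1+e_2\otimes f_2$ is not of the form $v\otimes w$ for any $v$ and $w$.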
If $V$ and $W$ have bases $\{e_i: 1\leq i\leq n\}$ and $\{f_j: 1\leq j\leq m\}$ for some $n$ and $m$ in $\mathbb N$, then a basis for $V\oplus W$ is the set $\{(e_i,0), (0,f_j): 1\leq i\leq n, 1\leq j\leq m\}$, while a basis for $V\otimes W$ is the set $\{e_i\otimes f_j: 1\leq i\leq n, 1\leq j\leq m\}$. In the direct sum we clearly have $n+m$ basis vectors, but in the tensor product we have $n\cdot m$ basis vectors. So direct sums create vector spaces of dimension $\dim V+\dim W$, while tensor products create vector spaces of dimension $\dim V\cdot \dim W$.
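For example, taking $V=\mathbb R^2$ and $W=\mathbb R^3$:
$$\dim(V\oplus W)=2+3=5,\qquad \dim(V\otimes W)=2\cdot 3=6.$$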
If we have representations of $G$ on $V$ and $W$, then we have induced representations on $V\oplus W$ and $V\otimes W$ given by $g\cdot (v,w)=(g\cdot v,g\cdot w)$, and $g\cdot (v\otimes w)=(g\cdot v)\otimes (g\cdot w)$.
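Concretely, in terms of matrices (just unpacking the definitions above): if $R(g)$ and $S(g)$ are the matrices of the representations on $V$ and $W$ in the bases $\{e_i\}$ and $\{f_j\}$, then in the induced bases from the previous paragraph,
$$g\mapsto \begin{pmatrix}R(g)&0\\ 0&S(g)\end{pmatrix}\ \text{on } V\oplus W,\qquad g\mapsto R(g)\otimes S(g)\ \text{on } V\otimes W,$$
where $R(g)\otimes S(g)$ is the Kronecker product, the $nm\times nm$ matrix sending $e_i\otimes f_j\mapsto R(g)e_i\otimes S(g)f_j$.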
Now that we have gotten the representation theory out of the way, it is time to address the actual statements (which are oftentimes completely incorrect) that physicists make about Lie groups and their Lie algebras. To every Lie group there is an associated Lie algebra, which is a vector space equipped with a commutator bracket; as a vector space, it is isomorphic to the tangent space at the identity of the Lie group. You can explicitly calculate the Lie algebras of your favorite Lie groups by taking their defining properties, differentiating at the identity, and then applying a dimension argument. For example, if $\gamma$ is a curve in $O(n)$ passing through the identity, such that its tangent vector at $\gamma(0)=I$ is $X$, then we have that:
$$\gamma(t)\cdot \gamma(t)^T=I$$
Taking a time derivative at $t=0$, and using the product rule together with $\gamma(0)=I$ and $\dot\gamma(0)=X$, we have that:
$$X+X^T=0$$
A dimension argument then demonstrates that the Lie algebra of $O(n)$ is precisely the vector subspace of antisymmetric $n\times n$ matrices: the computation above shows the Lie algebra is contained in the antisymmetric matrices, and both spaces have dimension $n(n-1)/2$.
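The same recipe works for your other favorite matrix groups. For instance, running the same argument for $U(n)$: a curve $\gamma$ through the identity satisfies $\gamma(t)\gamma(t)^\dagger=I$, and differentiating at $t=0$ gives
$$X+X^\dagger=0,$$
so $\mathfrak u(n)$ is the space of anti-Hermitian matrices, which as a real vector space has dimension $n^2=\dim U(n)$.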
So, first and foremost, no one means $SU(2)\otimes SU(2)$; I personally don't even know what such a statement would mean, unless we're using the group names to refer to a tensor product of representations, which I think is bad notational practice. What one actually means is $SU(2)\times SU(2)$.
Secondly, the statement:
$$\mathfrak{so}(1,3)\cong \mathfrak{su}(2)\oplus \mathfrak{su}(2)$$
is completely false. It is also not true that $SO(1,3)\cong SU(2)\times SU(2)$, and even if the statement about the Lie algebras were true, it would not imply the statement about the groups, as non-isomorphic Lie groups can have isomorphic Lie algebras, as we are about to see.
What is true is that the Lie algebra of $SO(t,s)$ is isomorphic to the Lie algebra of the spin group $\operatorname{Spin}(t,s)$ for all signatures of pseudo-Euclidean inner products. The spin group is the double cover of $SO^+(t,s)$ (the special orthogonal matrices that also preserve time orientation), and what we mean by that is that there is a $2$ to $1$ group homomorphism $\operatorname{Spin}(t,s)\rightarrow SO^+(t,s)$.
In special cases, we have "exceptional isomorphisms". For example, the spin group of $SO(3)$ is $SU(2)$, so the Lie algebras $\mathfrak{so}(3)$ and $\mathfrak{su}(2)$ are isomorphic. The spin group of $SO^+(1,3)$ is $SL(2,\mathbb C)$, so what is true is that $\mathfrak{so}(1,3)\cong \mathfrak{sl}(2,\mathbb C)$ (as real Lie algebras). Moreover, confusingly enough in your case, the spin group of $SO(4)$ is $SU(2)\times SU(2)$, so the Lie algebra $\mathfrak{so}(4)$ is isomorphic to $\mathfrak{su}(2)\oplus \mathfrak{su}(2)$. Note that none of these groups are isomorphic to one another, but the Lie algebras are.
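If you want a quick way to see that the groups themselves are not isomorphic, compare centers (one invariant among many):
$$Z(SU(2))=\{\pm I\}\cong\mathbb Z/2,\qquad Z(SO(3))=\{I\},$$
and similarly $Z(SL(2,\mathbb C))=\{\pm I\}$ while $SO^+(1,3)$ has trivial center.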
As a quick aside, I will sketch the $2$ to $1$ map $SL(2,\mathbb C)\rightarrow SO^+(1,3)$ for you. Identify $\mathbb R^{1,3}$ with the real subspace of Hermitian $2\times 2$ complex matrices via the map:
$$v=(t,x,y,z)\longmapsto X_v=\begin{pmatrix}
t+z&x-iy\\
x+iy&t-z
\end{pmatrix}$$
Then $\det X_v=(t+z)(t-z)-(x-iy)(x+iy)=t^2-x^2-y^2-z^2=\eta(v,v)$, so we define a map:
\begin{align}
\pi:SL(2,\mathbb C)&\longrightarrow SO^+(1,3)\\
A&\longmapsto \pi(A)
\end{align}
where $\pi(A)$ acts on $v\in \mathbb R^{1,3}$ via:
$$X_{\pi(A)\cdot v}= AX_vA^\dagger$$
(note that $AX_vA^\dagger$ is again Hermitian, so it equals $X_{v'}$ for a unique $v'\in\mathbb R^{1,3}$, and $\pi(A)\cdot v$ is defined to be this $v'$).
Since $\det A=1$, we have $\det(AX_vA^\dagger)=\det X_v$, so this clearly preserves $\eta(v,v)$ on $\mathbb R^{1,3}$, and $\pi(A)$ really is an element of $O(1,3)$; checking that it preserves orientation and time orientation I leave as an exercise to you, as the computations get a bit hairy. Note that $A$ and $-A$ define the same map, and in fact the kernel of $\pi$ is exactly $\{\pm I\}$, which is precisely the $2$ to $1$ behavior. The other $2$ to $1$ maps are defined similarly.
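As a sanity check (a standard computation you can redo by hand), take $A=\operatorname{diag}(e^{\lambda/2},e^{-\lambda/2})\in SL(2,\mathbb C)$. Then
$$AX_vA^\dagger=\begin{pmatrix}e^{\lambda}(t+z)&x-iy\\ x+iy&e^{-\lambda}(t-z)\end{pmatrix},$$
and reading off the new coordinates gives $t'=t\cosh\lambda+z\sinh\lambda$, $z'=t\sinh\lambda+z\cosh\lambda$, $x'=x$, $y'=y$: a boost along the $z$-axis with rapidity $\lambda$, which indeed lies in $SO^+(1,3)$.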
To end the long answer, my guess is that the author is incorrectly stating that $\mathfrak{su}(2)\oplus \mathfrak{su}(2)\cong \mathfrak{so}(1,3)$ because many problems in QFT can be solved in Euclidean signature via a Wick rotation, so maybe intuitively there is a nice identification going on. I am no expert on QFT, so I can't be sure, but I know for a fact that the statement about the Lie algebras, as currently formulated, is incorrect.