Let $V$ and $U$ be two finite-dimensional vector spaces of dimensions $n$ and $m$, respectively. For $f\in \mathrm{End}\,V$ we define a multilinear, anti-symmetric map $$V\times \dots\times V\to \Lambda^nV,\qquad (v_1, v_2,\dots, v_n)\mapsto fv_1\wedge fv_2\wedge\dots\wedge fv_n,$$ which corresponds to a linear map $\det f\colon \Lambda^nV\to\Lambda^nV$, i.e. scalar multiplication on the one-dimensional space $\Lambda^nV$.
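(As a concrete sanity check, not part of the question itself: take $n=2$, a basis $e_1, e_2$ of $V$, and $f$ with matrix $\begin{pmatrix}a&b\\c&d\end{pmatrix}$, so $fe_1 = ae_1+ce_2$ and $fe_2 = be_1+de_2$. Then

```latex
f e_1 \wedge f e_2
  = (a e_1 + c e_2) \wedge (b e_1 + d e_2)
  = ad\,(e_1 \wedge e_2) + cb\,(e_2 \wedge e_1)
  = (ad - cb)\, e_1 \wedge e_2,
```

so $\det f = ad-bc$, recovering the usual $2\times 2$ formula.)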
Now I would like to prove that $\det(f\oplus g) = \det f\cdot \det g$ (i.e. the determinant of a block-diagonal matrix is the product of the determinants of the blocks).
Naively, if $v_1, \dots, v_n$ and $u_1,\dots, u_m$ are bases of $V$ and $U$, we merge them into a basis of $V\oplus U$ and compute: $$ (f\oplus g)v_1\wedge \dots\wedge (f\oplus g)v_n\wedge (f\oplus g)u_1\wedge\dots\wedge (f\oplus g)u_m = \\=(fv_1\wedge \dots\wedge fv_n) \wedge (gu_1\wedge\dots\wedge g u_m) =\\ =\det f\cdot \det g \cdot v_1\wedge\dots\wedge v_n \wedge u_1\wedge\dots\wedge u_m$$ However, I can't bracket out these two terms without some kind of identification between $\Lambda^{n+m}(V\oplus U)$ and presumably $\Lambda^nV\otimes \Lambda^mU$. How does one fix this rigorously?
Edit: From Exterior power "commutes" with direct sum we know that $\Lambda^{k}(V\oplus U) \simeq \bigoplus_{i+j=k}\Lambda^iV\otimes \Lambda^jU$; for $k=n+m$ only the summand with $i=n$, $j=m$ is nonzero (since $\Lambda^iV=0$ for $i>n$ and $\Lambda^jU=0$ for $j>m$), so there is an isomorphism $\Lambda ^{n+m}(V\oplus U) \simeq \Lambda^nV\otimes \Lambda^mU$.
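(A sketch of how this isomorphism finishes the argument; the name $\Phi$ is just a label introduced here:

```latex
\Phi\colon \Lambda^{n+m}(V\oplus U)\xrightarrow{\ \sim\ }\Lambda^nV\otimes\Lambda^mU,
\qquad
v_1\wedge\dots\wedge v_n\wedge u_1\wedge\dots\wedge u_m
\ \longmapsto\ (v_1\wedge\dots\wedge v_n)\otimes(u_1\wedge\dots\wedge u_m).

% \Phi is natural, so it intertwines the induced maps:
\Phi\circ\Lambda^{n+m}(f\oplus g)
  = \bigl(\Lambda^n f\otimes\Lambda^m g\bigr)\circ\Phi
  = (\det f\cdot\det g)\,\Phi,

% and since \Phi is invertible,
\Lambda^{n+m}(f\oplus g) = \det f\cdot\det g \cdot \mathrm{id}
\quad\text{on }\Lambda^{n+m}(V\oplus U).
```

In other words, the "bracketing out" in the naive computation is exactly applying $\Phi$, acting by $\Lambda^n f\otimes\Lambda^m g$ on the tensor factors separately, and applying $\Phi^{-1}$.)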