Suppose we have a random variable $N$ taking values in the non-negative integers, together with independent and identically distributed random variables $X_1, X_2, \ldots$ (with $N$ independent of the $X_i$ as well), and consider the random sum $$ S = \sum_{i = 1}^{N} X_i. $$ To find the expectation $\textbf{E}\left[S\right]$, the approach I have seen is to condition on the value of $N$, which is what is done in this question, for example.
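(For reference, the conditioning argument, as I understand it, goes roughly like this: $$ \begin{align*} \textbf{E}\left[S\right] &= \sum_{n = 0}^{\infty} \textbf{E}\left[S \mid N = n\right] \textbf{P}\left(N = n\right)\\ &= \sum_{n = 0}^{\infty} \textbf{E}\left[\sum_{i = 1}^{n} X_i\right] \textbf{P}\left(N = n\right) && \text{using the independence of $N$ and the $X_i$}\\ &= \sum_{n = 0}^{\infty} n \, \textbf{E}\left[X_1\right] \textbf{P}\left(N = n\right)\\ &= \textbf{E}\left[N\right] \cdot \textbf{E}\left[X_1\right] \end{align*} $$ )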
However, my question is: doesn't the following simpler derivation suffice? $$ \begin{align*} \textbf{E}\left[S\right] &= \textbf{E}\left[\sum_{i = 1}^{N} X_i\right]\\ &= \textbf{E}\left[N \cdot X_i\right] && \text{Because all $X_i$ are i.i.d.}\\ &= \textbf{E}\left[N\right] \cdot \textbf{E}\left[X_i\right] && \text{Because expectation is multiplicative across independent r.v.'s} \end{align*} $$ This gives the correct answer but doesn't require conditioning on the value of $N$; it just applies the fact that expectation is multiplicative across independent random variables. Is this approach incorrect? If not, is there a reason to prefer conditioning over it?
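(As a numerical sanity check, not part of the argument: a quick simulation with the arbitrary choices $N \sim \text{Poisson}(3)$ and $X_i \sim \text{Exp}$ with mean $2$ does seem to land on $\textbf{E}\left[N\right] \cdot \textbf{E}\left[X_i\right]$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mean_x = 3.0, 2.0           # arbitrary: N ~ Poisson(lam), X_i ~ Exponential with mean mean_x
trials = 200_000

# Draw N for each trial, then sum N i.i.d. exponentials (empty sum is 0 when N = 0)
N = rng.poisson(lam, size=trials)
S = np.array([rng.exponential(mean_x, size=n).sum() for n in N])

print(S.mean())       # empirical E[S], roughly 6.0
print(lam * mean_x)   # E[N] * E[X_i] = 6.0
```

So the final value certainly matches; my question is only about whether the steps above are justified.)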