Let $u \in \mathcal{D}'(\mathbb{R})$, where $\mathcal{D}'(\mathbb{R})$ is the space of distributions on $\mathcal{D}(\mathbb{R}) = C_c^{\infty}(\mathbb{R})$ (not the tempered distributions, which act on the Schwartz space), and suppose $xu'+u=0$. Show that $u = A\, \mathbf{p.v.}\frac{1}{x} + B\delta$ for some constants $A, B$, where \begin{equation} \mathbf{p.v.}\frac{1}{x}(f(x)) = \lim_{\epsilon \to 0^+} \int_{|x|\ge\epsilon} \frac{f(x)}{x} \mathbf{d}x, \quad f \in \mathcal{D}(\mathbb{R}) \end{equation}
I have already shown that $\mathbf{p.v.}\frac{1}{x}$ is a tempered distribution. For $f \in \mathcal{D}(\mathbb{R})$, we have
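(As a numerical sanity check that the principal-value limit in the definition actually exists, here is a short sketch; the function name `pv_integral` and the particular test integrands are my own choices, and I use fast-decaying rather than compactly supported $f$, which is enough for illustration.)

```python
import math

def pv_integral(f, eps=1e-8, L=10.0, n=200_000):
    """Approximate p.v. ∫ f(x)/x dx.

    Fold the two half-lines together: for symmetric truncation,
        ∫_{|x|>=eps} f(x)/x dx = ∫_eps^∞ (f(x) - f(-x))/x dx,
    and the odd part of f near 0 makes this limit finite.
    Midpoint rule on [eps, L]; the tail beyond L is assumed negligible.
    """
    h = (L - eps) / n
    total = 0.0
    for k in range(n):
        x = eps + (k + 0.5) * h
        total += (f(x) - f(-x)) / x
    return total * h

# f(x) = x e^{-x^2}: here f(x)/x = e^{-x^2}, so the p.v. integral is sqrt(pi).
print(pv_integral(lambda x: x * math.exp(-x * x)))  # ≈ 1.772454 = sqrt(pi)

# For even f the integrand f(x)/x is odd, so the symmetric limit is 0.
print(pv_integral(lambda x: math.exp(-x * x)))
```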
\begin{align}
0 &= \langle xu', f \rangle + \langle u, f \rangle \\
&= \langle u', xf \rangle + \langle u, f \rangle \\
&= -\langle u, f+ xf' \rangle + \langle u, f \rangle \\
&= -\langle u, xf' \rangle \\
&= -\langle xu, f'\rangle \\
&= \langle (xu)', f\rangle
\end{align}
Since this holds for every $f \in \mathcal{D}(\mathbb{R})$, we get $(xu)' = 0$, so $xu$ is a constant distribution, $xu = C$ (I originally concluded $xu = 0$, but a vanishing derivative only forces $xu$ to be constant).
But I don't know where to go next.
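As a consistency check in the other direction (this is not the missing step of the proof, only a verification that distributions of the claimed form do solve the equation), one can use the identities
\begin{equation}
\langle x\, \mathbf{p.v.}\tfrac{1}{x}, f \rangle = \lim_{\epsilon \to 0^+} \int_{|x|\ge\epsilon} f(x)\, \mathbf{d}x = \langle 1, f \rangle, \qquad \langle x\delta, f \rangle = (xf)(0) = 0,
\end{equation}
i.e. $x\, \mathbf{p.v.}\tfrac{1}{x} = 1$ and $x\delta = 0$. Then, since $xu' + u = (xu)'$,
\begin{equation}
x u' + u = \Big( x\big(A\, \mathbf{p.v.}\tfrac{1}{x} + B\delta\big) \Big)' = (A \cdot 1 + 0)' = 0.
\end{equation}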