Let $\mu_1$ and $\mu_2$ be two finite measures on $(\Omega, \mathcal{F})$. Let $\mu_1 = \mu_{1a}+\mu_{1s}$ be the Lebesgue decomposition of $\mu_1$ w.r.t. $\mu_2$, that is, $\mu_{1a} \ll \mu_2$ and $\mu_{1s}\perp \mu_2$. Let $\mu = \mu_1 - \mu_2$. I'd like to show that for all $A\in \mathcal{F}$, $$ |\mu|(A) = \int_A |h-1|d\mu_2 + \mu_{1s}(A) $$ where $h = \frac{d\mu_{1a}}{d\mu_2}$ is the Radon-Nikodym derivative of $\mu_{1a}$ w.r.t. $\mu_2$.
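To fix ideas, here is a concrete case in which the identity can be checked by hand; the specific measures below (Lebesgue measure on $[0,2]$ and a point mass at $3/2$) are my own illustrative choice, not from the book. Take $\Omega = [0,2]$ with the Borel $\sigma$-algebra, let $\mu_2$ be Lebesgue measure on $[0,2]$, and set $$ \mu_1(A) = \int_A 2\cdot\mathbf{1}_{[0,1]}\,d\mu_2 + \delta_{3/2}(A), $$ so that $\mu_{1a}$ has density $h = 2\cdot\mathbf{1}_{[0,1]}$ and $\mu_{1s} = \delta_{3/2}$. Since $|h-1| = 1$ $\mu_2$-a.e., the right-hand side of the claimed identity equals $\mu_2(A) + \delta_{3/2}(A)$, and this is indeed $|\mu|(A)$, as the decomposition worked out below shows.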
I know that we have the Jordan decomposition $$|\mu| = \mu_+ + \mu_-,$$ where $\mu_+(A) = \mu(A \cap \Omega_+)$ and $\mu_-(A) = -\mu(A \cap \Omega_-)$, and $\Omega = \Omega_+ \cup \Omega_-$ is the Hahn decomposition of $\Omega$ w.r.t. $\mu$. Also, there exists a finite measure $\lambda$ such that $\mu_1 = \mu_+ + \lambda$ and $\mu_2 = \mu_- + \lambda$, with $\lambda = 0$ iff $\mu_1 \perp \mu_2$. By the definition of the Radon-Nikodym derivative, we have $\mu_{1a}(A) = \int_A h \, d\mu_2$ for all $A \in \mathcal{F}$.
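In the toy example above (again my own illustrative choice), these objects can be written down explicitly: $$ \Omega_+ = [0,1]\cup\{3/2\}, \qquad \Omega_- = (1,2]\setminus\{3/2\}, $$ $$ \mu_+(A) = \mu_2(A\cap[0,1]) + \delta_{3/2}(A), \qquad \mu_-(A) = \mu_2(A\cap(1,2]), $$ so that $|\mu|(A) = \mu_+(A) + \mu_-(A) = \mu_2(A) + \delta_{3/2}(A)$, matching the right-hand side computed above. Moreover $\lambda(A) = \mu_1(A) - \mu_+(A) = \mu_2(A\cap[0,1]) = \mu_2(A) - \mu_-(A)$, and $\lambda \neq 0$, consistent with $\mu_1 \not\perp \mu_2$ here.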
I am unable to use these facts to prove the desired result. Any hint as to how I should proceed would be highly appreciated.
Edit: This problem is exercise 4.13 from the book "Measure Theory and Probability Theory" by Krishna B. Athreya and Soumendra N. Lahiri.