I'm reading the book Foundations of Modern Analysis by Avner Friedman and am trying to solve Exercise 1.6.3, which asks to prove that the Lebesgue outer measure of $[a, b]$ is $b-a$.
My solution: for $0 < \epsilon < (b-a)/2$ we have $(a+\epsilon, b-\epsilon) \subset [a, b] \subset (a-\epsilon, b+\epsilon)$, so by monotonicity $\mu^\star((a+\epsilon, b-\epsilon)) \le \mu^\star([a, b]) \le \mu^\star((a-\epsilon, b+\epsilon))$, where $\mu^\star$ denotes the Lebesgue outer measure.
The outer measure of an open interval is its length, hence $\mu^\star((a+\epsilon, b-\epsilon)) = b-a-2\epsilon$ and $\mu^\star((a-\epsilon, b+\epsilon)) = b-a + 2\epsilon$.
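Combining the two estimates, the sandwich reads:

$$b - a - 2\epsilon \;\le\; \mu^\star([a,b]) \;\le\; b - a + 2\epsilon \qquad \text{for every } 0 < \epsilon < \tfrac{b-a}{2}.$$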
Letting $\epsilon \to 0$, we derive that $\mu^\star([a,b]) = b-a$.
My problem is: why does the author give a hint to apply the Heine–Borel theorem, when the monotonicity of outer measure already seems to suffice? Is there something wrong with my proof?
