I was able to solve this problem, but I made a logical leap that I'm not sure how to justify.
We have a segment of length $L$ and two points drawn independently and uniformly on that segment, so the joint density is $p_{X_{1}X_{2}}(x_{1}, x_{2}) = \frac{1}{L^2}$ on $[0, L]^2$. The observation is the distance between the two points, $d = |x_{1} - x_{2}|$, and the goal is the maximum-likelihood estimate of $L$. I don't know an easy way to deal with the absolute value, so I took a leap of faith and assumed that $x_{1}$ is the point farther from $0$ (i.e., $x_{1} \ge x_{2}$), so that $d = x_{1} - x_{2}$.
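For concreteness, here is a minimal simulation of the setup (the value of $L$, the seed, and the sample size are arbitrary choices of mine, not part of the problem): two independent uniform points on $[0, L]$ and the observed distance between them.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 3.0                                 # true segment length (arbitrary, for illustration only)
x1, x2 = rng.uniform(0.0, L, size=2)    # two independent uniform points on [0, L]
d = abs(x1 - x2)                        # the observed distance
print(x1, x2, d)
```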
Then I use the delta method:
$$\begin{align} p_{D}(d) &= \iint p_{X_{1}X_{2}}(x_{1}, x_{2})\,\delta(d - x_{1} + x_{2})\,dx_{1}\,dx_{2} \\ &= \int p_{X_{1}X_{2}}(d + x_{2}, x_{2})\, dx_{2} \\ &= \int_{-d}^{L-d}\frac{1}{L^2}\,\mathbb{1}\{0 < x_{2} < L\}\,dx_{2} \\ &= \int_{0}^{L-d}\frac{1}{L^2}\,dx_{2} \\ &= \frac{1}{L} - \frac{d}{L^2} \end{align}$$ Then, treating $p_{D}(d)$ as the likelihood of $L$ and maximising: $$\frac{\partial p_{D}(d)}{\partial L} = \frac{2d}{L^3} - \frac{1}{L^2} = 0 \iff \hat{L} = 2d$$
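As a quick sanity check (not part of the derivation), the sketch below treats the expression $\frac{1}{L} - \frac{d}{L^2}$ as the likelihood of $L$ for a fixed observed $d$ and confirms by grid search that it peaks at $L = 2d$. The observed value of $d$, the grid range, and the grid resolution are arbitrary choices of mine.

```python
import numpy as np

def likelihood(L, d):
    """Derived density p_D(d) = 1/L - d/L^2, viewed as a likelihood in L (valid for L > d)."""
    return 1.0 / L - d / L**2

d = 0.7                                       # an arbitrary observed distance
Ls = np.linspace(d + 1e-6, 10 * d, 200_000)   # candidate values of L (must exceed d)
L_hat = Ls[np.argmax(likelihood(Ls, d))]      # grid-search maximiser

print(L_hat)   # ~1.4, matching the analytic result
print(2 * d)   # analytic maximiser 2d
```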
According to my teacher this is right, but I feel like I ignored half of the possible cases by getting rid of the absolute value in $d$.
Why does this yield the same result as if I had solved the problem with $d = |x_{1} - x_{2}|$?
Thanks.