If $a\ge b\ge c > 0$ then
$$ \begin{split} \frac{(a-b)^2}{(a+b)} + \frac{(b-c)^2}{(b+c)} & \ge \sqrt{3(a^2+b^2+c^2)}- (a+b+c) \\&\ge \frac{(a-b)^2}{\frac{1+\sqrt{3}}{2}a+\frac{5-\sqrt{3}}{2} b} +\frac{(b-c)^2}{(1+\sqrt{\frac{3}{2}})b+(2-\sqrt{\frac{3}{2}}) c} \end{split} $$
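For what it's worth, here is a quick numerical sanity check of the double inequality (random sampling with NumPy; an illustrative sketch only, not a proof): it evaluates both gaps on random ordered triples and reports the worst case seen.

```python
# Numerical sanity check of the double inequality (not a proof):
# sample random a >= b >= c > 0 and track the smallest value of each gap.
import numpy as np

rng = np.random.default_rng(0)
worst_left, worst_right = np.inf, np.inf
for _ in range(100_000):
    a, b, c = np.sort(rng.uniform(1e-3, 1.0, size=3))[::-1]   # a >= b >= c > 0
    mid = np.sqrt(3 * (a**2 + b**2 + c**2)) - (a + b + c)
    lhs = (a - b)**2 / (a + b) + (b - c)**2 / (b + c)
    rhs = ((a - b)**2 / ((1 + np.sqrt(3)) / 2 * a + (5 - np.sqrt(3)) / 2 * b)
           + (b - c)**2 / ((1 + np.sqrt(1.5)) * b + (2 - np.sqrt(1.5)) * c))
    worst_left = min(worst_left, lhs - mid)
    worst_right = min(worst_right, mid - rhs)

print(worst_left, worst_right)   # both should be >= 0 (up to rounding)
```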
This is a strengthening of an inequality found here.
Notes:
I found the RHS inequality by choosing the denominator coefficients so that, after the substitution $a=u+v+w$, $b=u+v$, $c=u$ (with $u,v,w\ge 0$), every monomial in $u$, $v$, $w$ in the numerator of the difference of the squares of the two sides (after moving $a+b+c$ across so that both sides are nonnegative) has a nonnegative coefficient. This is also known as the Buffalo way, but it goes back at least to Adolf Hurwitz. The admissible denominator coefficients form a convex set of a fairly simple kind (a product, in fact), and the coefficients above correspond to a vertex of it. One can do this for every $n$; the formulas are not that complicated.
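A minimal SymPy sketch of this coefficient check (the exact way of clearing denominators here is my own choice and may differ in detail from the computation described above):

```python
# Sketch of the coefficient check for the RHS bound ("Buffalo way").
# Substitute a = u+v+w, b = u+v, c = u (u, v, w >= 0), move a+b+c across,
# square, and clear the positive denominators: the bound becomes N >= 0 for
# the polynomial N below, and the claim is that every monomial of N in
# u, v, w has a nonnegative coefficient.
import sympy as sp

u, v, w = sp.symbols('u v w', nonnegative=True)
a, b, c = u + v + w, u + v, u
p = a + b + c

d1 = (1 + sp.sqrt(3))/2 * a + (5 - sp.sqrt(3))/2 * b
d2 = (1 + sp.sqrt(sp.Rational(3, 2))) * b + (2 - sp.sqrt(sp.Rational(3, 2))) * c

# sqrt(3(a^2+b^2+c^2)) >= p + R, squared and multiplied by d1^2 * d2^2 > 0:
N = sp.expand(3*(a**2 + b**2 + c**2) * d1**2 * d2**2
              - (p*d1*d2 + (a - b)**2 * d2 + (b - c)**2 * d1)**2)

coeffs = sp.Poly(N, u, v, w).coeffs()
print(min(sp.N(cf) for cf in coeffs))   # expected to be >= 0 if the coefficients work as claimed
```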
I found the LHS inequality by numerical optimization; in fact, it cannot be obtained by the previous method.
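To illustrate the sort of numerical search involved (again only a sketch, not the original computation; the parametrization and the use of SciPy are illustrative choices), one can minimize the gap between the left-hand side and the middle term:

```python
# Sketch of a numerical check of the LHS bound.  The inequality is
# homogeneous of degree 1, so fix c = 1 and parametrize b = 1 + s^2,
# a = b + t^2, which enforces a >= b >= c automatically.
import numpy as np
from scipy.optimize import minimize

def gap(x):
    s, t = x
    c = 1.0
    b = c + s**2
    a = b + t**2
    lhs = (a - b)**2 / (a + b) + (b - c)**2 / (b + c)
    mid = np.sqrt(3 * (a**2 + b**2 + c**2)) - (a + b + c)
    return lhs - mid

# Try a few starting points; a minimum of (numerically) zero, attained near
# s = t = 0, i.e. a = b = c, is consistent with the stated bound.
results = [minimize(gap, x0, method='Nelder-Mead')
           for x0 in ([0.5, 0.5], [1.0, 0.1], [0.1, 1.0])]
best = min(results, key=lambda r: r.fun)
print(best.fun, best.x)
```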
A general comment: one can formulate many inequalities about the extremum of some algebraic function on some semialgebraic set. In fixed dimension the answer can in principle be given as a real algebraic number (if it is not infinite). For inequalities involving $n$ unknowns with $n$ arbitrary, things can be more complicated. Some (the mean inequalities, the Newton/Maclaurin inequalities, etc.) are established for all $n$, but that may not be feasible in general.
At this stage I don't see a neater way to prove it. Any feedback would be appreciated!
In a previous version the LHS was not correct; I thank @River Li for pointing that out.