Given a function $f(x)$ such that, for some $b > a$, the following holds: $f(b) - f(a) = 0$.
Is the following true?
For any $s$ with $0 < s < b - a$, there exist $c$ and $d$ (where $d > c$ and $d - c = s$) such that $f(d) - f(c) = 0$.
In my claim I'm assuming that $f(x)$ is a continuous function and that $c$ or $d$ (or both) may lie outside the interval $[a, b]$.
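Equivalently (introducing $g$ just as my own shorthand for restating the question): the claim says that for every $0 < s < b - a$, the function
$$g(x) := f(x + s) - f(x)$$
has a zero somewhere on $\mathbb{R}$; such a zero $x$ gives $c = x$ and $d = x + s$.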
Informal claim: If a stock’s total net growth over a 25-year period is zero, then there must be at least one interval of length 20 years (and likewise 15 years, 10 years, 5 years, etc.) over which the net growth is also zero.
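To play with this numerically, here is a rough Python sketch (the function `find_chord` and all of its parameters are my own, purely for illustration): it scans $g(x) = f(x+s) - f(x)$ over a window for a sign change and then bisects. Note that a sign change is only a sufficient condition, so a `None` result would not disprove the claim.

```python
import math

def find_chord(f, s, lo, hi, n=10_000, tol=1e-12):
    """Look for c in [lo, hi] with f(c + s) == f(c), i.e. a zero of
    g(x) = f(x + s) - f(x): scan for a sign change, then bisect.
    Returns c, or None if no sign change is found in the window."""
    g = lambda x: f(x + s) - f(x)
    xs = [lo + (hi - lo) * k / n for k in range(n + 1)]
    for x0, x1 in zip(xs, xs[1:]):
        if g(x0) == 0.0:
            return x0
        if g(x0) * g(x1) < 0.0:            # sign change => zero in (x0, x1)
            while x1 - x0 > tol:
                mid = 0.5 * (x0 + x1)
                if g(x0) * g(mid) <= 0.0:
                    x1 = mid
                else:
                    x0 = mid
            return 0.5 * (x0 + x1)
    return None                             # inconclusive, not a disproof

# Example: f(x) = sin(pi*x), so f(0) = f(2) = 0 (a = 0, b = 2).
f = lambda x: math.sin(math.pi * x)
c = find_chord(f, s=0.7, lo=-2.0, hi=4.0)  # window allows c, d outside [0, 2]
if c is not None:
    print(c, f(c + 0.7) - f(c))            # second value should be ~0
```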