While studying curve integrals, I came across an interesting problem. Consider a curve given by a continuous function $\gamma:[a,b]\rightarrow \mathbb{R}^n$. The length of $\gamma$ is defined as $$l(\gamma,a,b)=\sup \! \left\{ \sum_{i = 1}^{k} d(\gamma(t_{i}),\gamma(t_{i - 1})) ~ \Bigg| ~ k \in \mathbb{N} ~ \text{and} ~ a = t_{0} < t_{1} < \ldots < t_{k} = b \right\},$$ where $d(x,y)$ is the Euclidean distance between $x$ and $y$. (I use $k$ for the number of partition points, since $n$ is already taken by $\mathbb{R}^n$.)
Assume $\gamma$ is rectifiable, i.e. $l(\gamma,a,b)$ is finite, and define $f(x):=l(\gamma,a,x)$ for $x\in[a,b]$.
The question is whether I can conclude that $f$ is continuous. My approach is to prove right-continuity at each $x_0$ (left-continuity would be handled symmetrically): for any $\epsilon>0$, there exists $\delta>0$ such that whenever $0<x-x_0<\delta$, $$0\le f(x)-f(x_0)<\epsilon.$$ (I write $\le$ rather than $<$ on the left, since $f$ need not be strictly increasing, e.g. if $\gamma$ is constant on a subinterval.) The first inequality is easy to prove, as $f$ is nondecreasing by the definition of the supremum. I am stuck on the second inequality, which makes me doubt whether the proposition is actually true.
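For what it's worth, here is a quick numerical sketch supporting the claim. The curve $\gamma(t)=(t,\sin t)$ on $[0,\pi]$ and the refinement level are my own assumptions, not part of the question; the script approximates $f(x)=l(\gamma,0,x)$ by chord sums over a fine uniform partition and checks that its increments stay bounded by the step size times the curve's maximum speed, consistent with continuity:

```python
import math

# Numerical sketch: gamma(t) = (t, sin t) on [0, pi] is an assumed example
# curve (not from the question). We approximate f(x) = l(gamma, 0, x) by
# summing chord lengths over a fine uniform partition of [0, x].

def gamma(t):
    return (t, math.sin(t))

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def f(x, n=10000):
    """Chord-sum approximation of l(gamma, 0, x) using n subintervals."""
    ts = [x * i / n for i in range(n + 1)]
    return sum(dist(gamma(ts[i]), gamma(ts[i - 1])) for i in range(1, n + 1))

# Sample f on a grid: it is nondecreasing, and each increment is bounded by
# (step size) * (maximum speed of gamma), which suggests continuity.
xs = [i * math.pi / 20 for i in range(21)]
vals = [f(x) for x in xs]
increments = [vals[i] - vals[i - 1] for i in range(1, len(vals))]
print(max(increments))
```

Of course a numerical experiment proves nothing, but it does suggest that the proposition is true and that the obstacle lies in the proof technique rather than in the statement.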