Let $f\in L^2[0,1]$. $$\int_0^x f(t)dt =0 \quad \forall\, x\in(0,1) \Longrightarrow f=0 \text{ a.e.}$$
What is the easiest proof?
According to the "one-dimensional case" of the Lebesgue differentiation theorem, $$ F(x) = \int_0^x f(t) \, dt $$ is differentiable a.e. on $[0, 1]$ with $F'(x) = f(x)$ for a.e. $x$.
In your case $F(x) = 0$ for all $x \in (0, 1)$, so $f(x) = F'(x) = 0$ a.e.
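For concreteness, the differentiation step can be written out as a difference quotient (a short sketch, using only that $F \equiv 0$ on $(0,1)$): for a.e. $x \in (0,1)$,
$$ f(x) = F'(x) = \lim_{h \to 0} \frac{F(x+h) - F(x)}{h} = \lim_{h \to 0} \frac{0 - 0}{h} = 0. $$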
One only needs to assume that $f \in L^1[0,1]$. First of all, for any $0 < a \le b < 1$,
$$\tag{1} \int_a^b f(x) dx = \int_0^b f(x) dx - \int_0^a f(x) dx = 0.$$
So if you use the Lebesgue differentiation theorem, for a.e. $x\in [0,1]$,
$$f(x) = \lim_{\delta \to 0} \frac{1}{\delta} \int_x^{x+\delta} f(s) ds = 0.$$
However, the result you want to show is much more elementary than the Lebesgue differentiation theorem. Using (1) and Lebesgue's dominated convergence theorem, we have
$$\int_U f(x) dx = 0$$
for every open set $U \subset (0,1)$. Thus the same is true for every closed set (a closed set is a countable decreasing intersection of open sets), and using Lebesgue's dominated convergence theorem again it is true for every Borel set. Now consider
$$E_+ = \{x : f(x) > 0\}.$$
There is a Borel set $B \subset E_+$ such that $m(E_+\setminus B) = 0$, so
$$\int_{E_+} f(x) dx = \int_B f(x) dx = 0 \Rightarrow f \le 0 \text{ a.e.}$$
Doing the same for $E_- = \{x : f(x) < 0\}$, we conclude $f = 0$ a.e.
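To spell out the open-set step above (a sketch: write the open set as a countable disjoint union of intervals, $U = \bigcup_{n\ge 1}(a_n,b_n)$ with $(a_n,b_n) \subset (0,1)$):
$$\int_U f(x) dx = \lim_{N\to\infty} \sum_{n=1}^{N} \int_{a_n}^{b_n} f(x) dx = 0,$$
where the first equality follows from dominated convergence ($f\,\mathbf{1}_{\bigcup_{n\le N}(a_n,b_n)} \to f\,\mathbf{1}_U$ pointwise, dominated by $|f| \in L^1$) and the second from (1).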
Hint:
$$f(x)=\frac d{dx}\int_0^x f(t)dt =\frac{d}{dx}I(x) \quad \text{for a.e. } x.$$
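Spelling this out (a one-line sketch, using the a.e. version of the fundamental theorem of calculus for $L^1$ functions): by hypothesis $I(x) = \int_0^x f(t)dt = 0$ for every $x \in (0,1)$, so
$$f(x) = I'(x) = 0 \quad \text{for a.e. } x \in (0,1).$$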