I was reading the proof of the Cauchy-Goursat theorem (the version that works with triangles or rectangles) and I couldn't determine where it breaks down if we try to replicate it in the real case. In this proof, we write:
$$ f(z) = f(z_0) + f'(z_0)(z - z_0) + (z - z_0)\left[ \frac{f(z) - f(z_0)}{z - z_0} - f'(z_0) \right] $$
Then we split the integral into those three terms. The first two vanish because they have primitives:
$$ f(z_0) \quad \text{has as its primitive} \quad z f(z_0) $$
$$ f'(z_0)(z - z_0) \quad \text{has as its primitive} \quad \frac{f'(z_0)}{2}(z - z_0)^2 $$
Then, for the last term, if we define $h(z)$ as
$$ h(z) = \frac{f(z) - f(z_0)}{z - z_0} - f'(z_0), $$
we can prove that it is bounded and tends to zero as $z \to z_0$.
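For reference, this is how I understand the final estimate (my own sketch, where $d$ and $L$ denote the diameter and perimeter of the original triangle, $\Delta_n$ is the small triangle obtained after $n$ bisections, and $\varepsilon$ bounds $|h|$ near $z_0$):

```latex
\left| \oint_{\Delta_n} (z - z_0)\, h(z)\, dz \right|
\;\le\; \sup_{z \in \Delta_n} |h(z)| \cdot \operatorname{diam}(\Delta_n) \cdot \operatorname{per}(\Delta_n)
\;\le\; \varepsilon \cdot \frac{d}{2^n} \cdot \frac{L}{2^n},
```

and since the subdivision argument multiplies this by at most $4^n$, the integral over the original triangle is bounded by $\varepsilon\, d L$, which can be made arbitrarily small.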
I can't find where this fails in the real case. I have read some explanations, and the most convincing one was the fact that, in higher dimensions of ℝ, the derivative (Jacobian) matrix can be an arbitrary linear map, but I haven't read a rigorous proof. Can you guys help me with that?
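To convince myself that the real case genuinely differs, I sketched a quick numerical check (my own illustration, not from any text I read): $f(z) = \bar z$ is real-differentiable everywhere, with Jacobian $\operatorname{diag}(1,-1)$, yet its integral around the unit circle is $2\pi i \neq 0$, so no Cauchy-Goursat-type theorem can hold for it.

```python
import cmath

def contour_integral(f, n=4096):
    """Approximate the integral of f around the unit circle
    with a midpoint-rule Riemann sum over n segments."""
    total = 0j
    for k in range(n):
        t0 = 2 * cmath.pi * k / n
        t1 = 2 * cmath.pi * (k + 1) / n
        z0 = cmath.exp(1j * t0)
        z1 = cmath.exp(1j * t1)
        zm = cmath.exp(1j * (t0 + t1) / 2)  # midpoint of the arc
        total += f(zm) * (z1 - z0)
    return total

# f(z) = z is holomorphic (primitive z^2/2): the integral is ~0.
print(contour_integral(lambda z: z))
# f(z) = conj(z) is real-differentiable but not holomorphic:
# the integral is ~2*pi*i, not zero.
print(contour_integral(lambda z: z.conjugate()))
```

The first value comes out numerically zero, while the second is close to $2\pi i$, matching the hand computation $\oint_{|z|=1} \bar z\, dz = \int_0^{2\pi} e^{-it} \cdot i e^{it}\, dt = 2\pi i$.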