When applying Newton's algorithm, we need to pick an initial guess $x_0$ to start the process.
My question is ... if there is convergence to a root, is this always the root that was closest to $x_0$?
No.
Let $f(x)=x(x-1)(x+1)$. Newton's method starting at $x_0 = 0.55$ converges to $-1$, but the root closest to $x_0$ is $+1$.
The basins of attraction of the roots are likely to be complicated, as in the Newton fractal.
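The example above can be checked numerically. The sketch below is a minimal Newton iteration (the tolerance and iteration cap are illustrative choices, not part of the original answer):

```python
# Newton's method on f(x) = x(x-1)(x+1) = x^3 - x, which has roots -1, 0, +1.
def newton(f, fprime, x0, tol=1e-12, max_iter=100):
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)   # Newton step
        if abs(x_new - x) < tol:       # stop once the iterates settle
            return x_new
        x = x_new
    return x

f = lambda x: x**3 - x          # expanded form of x(x-1)(x+1)
fp = lambda x: 3 * x**2 - 1     # its derivative

root = newton(f, fp, 0.55)
print(root)  # converges to -1, even though +1 is the closest root to 0.55
```

The first step overshoots badly because $f'(0.55) \approx -0.09$ is close to zero, which is what throws the iteration toward $-1$.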
No. For instance, if the slope at a starting point $x^*$ is positive and $f(x^*) < 0$, then Newton's method moves in the positive $x$ direction, even though a closer zero may well lie in the negative $x$ direction.
The problem is that Newton's method only looks at the derivative at a given point, and this local information at $x^*$ may have nothing to do with the global question of where the closest zero is.
One nice thing, however, is that as long as the root you are looking for is a simple root and the function is smooth enough, there is always some region around the root in which Newton's method converges to that root. But we cannot say how big that region of convergence is without more information.
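The local-convergence guarantee can be illustrated on $f(x) = x^3 - x$ near its simple root at $+1$. The interval $[0.9, 1.1]$ below is an illustrative choice, not a computed convergence bound:

```python
# Sketch: starting guesses sufficiently close to the simple root +1 of
# f(x) = x^3 - x all converge to +1. The interval [0.9, 1.1] is assumed
# to lie inside the region of convergence; it is not a derived bound.
def newton(f, fprime, x0, max_iter=50):
    x = x0
    for _ in range(max_iter):
        x = x - f(x) / fprime(x)   # once converged, the step is ~0
    return x

f = lambda x: x**3 - x
fp = lambda x: 3 * x**2 - 1

for x0 in [0.9, 0.95, 1.0, 1.05, 1.1]:
    assert abs(newton(f, fp, x0) - 1.0) < 1e-9
print("all starting points in [0.9, 1.1] converged to +1")
```

Near a simple root the derivative is bounded away from zero, so each Newton step stays close to the root and the iteration converges quadratically; the guarantee fails at multiple roots or where $f'$ vanishes.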