
When applying Newton's algorithm, we need to pick an initial guess $x_0$ to start the process.

My question is: if the iteration converges to a root, is it always the root closest to $x_0$?

Chach

2 Answers


No.

Let $f(x)=x(x-1)(x+1)$. Newton's method starting at $x_0=0.55$ will converge to $-1$ but the root closest to $x_0$ is $+1$.
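This counterexample is easy to verify numerically. A minimal sketch with a hand-rolled Newton iteration (plain Python, no libraries):

```python
def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Plain Newton iteration: x -> x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

f = lambda x: x * (x - 1) * (x + 1)   # roots at -1, 0, +1
df = lambda x: 3 * x**2 - 1           # derivative of x^3 - x

root = newton(f, df, 0.55)
print(root)  # converges to -1, although +1 is the root closest to 0.55
```

The first step jumps far to the left because $f'(0.55)\approx-0.09$ is small, so the tangent line crosses the axis near $x\approx-3.6$; from there the iteration settles onto $-1$.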

The basins of attraction for the roots are likely to be complicated, as in the Newton fractal.

lhf

No. For instance, if the slope is positive at a starting point $x^*$ and $f(x^*)<0$, then Newton's method moves in the positive $x$ direction, even though it is entirely possible there is a closer zero in the negative $x$ direction.

The problem is that Newton's Method only looks at the derivative at a given point, but the local information at $x^*$ may have nothing to do with the global information of where the closest zero is.

One nice thing, however, is that as long as the root you are looking for is a simple root, and the function is smooth enough, there is always a certain region around the root in which Newton's Method will converge to that root. But we can't say how big that region of convergence will be in the absence of more information.
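A quick numerical illustration of this local convergence, using a hand-rolled Newton iteration on the same cubic $f(x)=x(x-1)(x+1)$ (the interval radius $0.1$ is just an illustrative choice for this function, not a sharp bound):

```python
def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Plain Newton iteration: x -> x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

f = lambda x: x * (x - 1) * (x + 1)   # simple roots at -1, 0, +1
df = lambda x: 3 * x**2 - 1

# Every start in [0.9, 1.1] converges back to the simple root +1.
starts = [1 + k / 100 for k in range(-10, 11)]
roots = [newton(f, df, x0) for x0 in starts]
print(all(abs(r - 1) < 1e-9 for r in roots))  # True
```

On $[0.9,1.1]$ the function is increasing and convex, so every tangent line from this interval crosses the axis on the $+1$ side of the root and the iterates descend monotonically to $+1$.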

Eric Auld
  • it also depends on the sign of $f(x^*)$, doesn't it? :D – user251257 Sep 09 '15 at 13:29
  • Is there a specific way to ensure that we find the $x$ closest to our initial guess? I'm thinking that may be important, since in many cases, we're not looking for any root, but a specific root, right .. ? – Chach Sep 09 '15 at 13:30
  • Newton's Method is not sufficient to accomplish this. It's just that there are so many functions to consider...some of them end up fooling our method!! It would be hard to think of a method that ensures this that looks only at derivatives at a point. But I can't say for certain that one can't exist. You should ask a numerical analyst. – Eric Auld Sep 09 '15 at 13:37
  • I don't think the simple root hypothesis is necessary. I think you only need that the zero set of the function is discrete. – Ian Sep 09 '15 at 13:40
  • @ian Interesting. I would have thought that a function $f$ such that $f'$ rapidly changes signs (though the zero of $f$ is still isolated) could mess up Newton's method. – Eric Auld Sep 09 '15 at 13:44
  • Unless you actually have some irregularity, I don't think it can be a problem in theory. But the interval could be very small, which could make the method useless in practice. For instance if $f(x)=\sin(1/x)$ and you want the answer of $\frac{1}{N \pi}$ for some huge integer $N$, you will need to start very close by. – Ian Sep 09 '15 at 13:50
  • But for instance, I don't see why there would be a theoretical problem with a function like $\prod_{i=1}^n (x-ih)^2$, even if $h$ is very small and $n$ is very large. – Ian Sep 09 '15 at 13:54
  • Indeed, lhf's link shows that in my preceding example, you will get convergence to $ih$ if you start in the interval $ \left ( ih-\frac{h}{2n},ih+\frac{h}{2n} \right )$. – Ian Sep 09 '15 at 14:20
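Ian's $f(x)=\sin(1/x)$ example can be checked numerically. In this sketch (again a hand-rolled Newton iteration; the offsets $10^{-4}$ and $10^{-2}$ are illustrative choices), a start very close to the root $\frac{1}{1000\pi}$ recovers that root, while a start only 1% away lands on a different root entirely:

```python
import math

def newton(f, df, x0, n_iter=60):
    """Plain Newton iteration with a fixed number of steps."""
    x = x0
    for _ in range(n_iter):
        x = x - f(x) / df(x)
    return x

f = lambda x: math.sin(1.0 / x)
df = lambda x: -math.cos(1.0 / x) / x**2   # chain rule

N = 1000
target = 1.0 / (N * math.pi)               # the root 1/(N*pi)

near = newton(f, df, target * (1 + 1e-4))  # start 0.01% away
far = newton(f, df, target * (1 + 1e-2))   # start 1% away

print(abs(near - target) / target)  # essentially zero: the intended root
print(abs(far - target) / target)   # not small: a neighboring root instead
```

The 1% offset looks tiny in $x$, but in the phase $1/x$ it is a shift of about $10\pi$, which is why the basin of the intended root is so narrow here.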