
Recently I was introduced to the Newton-Raphson method for finding roots of a polynomial function, and I looked up its proof.

I found a variation of the Newton-Raphson method by considering the first $3$ terms of the Taylor series. That is, in

$$f(\alpha) = f(x) + (\alpha - x)f'(x) + (\alpha-x)^2\frac{f''(x)}{2!} + \cdots$$

we keep only

$$f(\alpha) \approx f(x) + (\alpha - x)f'(x) + (\alpha-x)^2\frac{f''(x)}{2!}.$$

Since $f(\alpha) = 0$, this becomes

$$0 \approx f(x) + (\alpha - x)f'(x) + (\alpha-x)^2\frac{f''(x)}{2!},$$

which we then solve for $\alpha$ using the quadratic formula.

So my question is: will this variation have a worse time complexity for finding the roots of a polynomial function than the original Newton-Raphson method?
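As a concrete illustration of the iteration described above, here is a minimal Python sketch (my own code, not from the post). Each step solves the quadratic model for the step $h = \alpha - x$ and picks the root of smaller magnitude, falling back to the ordinary Newton step when the model degenerates; the test polynomial $x^3 - 2x - 5$ and all names are my own choices:

```python
import math

def quad_newton_step(f, df, d2f, x):
    """One step of the quadratic-Taylor variant: solve
    f(x) + h*f'(x) + h^2*f''(x)/2 = 0 for h, return x + h."""
    a, b, c = d2f(x) / 2.0, df(x), f(x)
    if a == 0:
        return x - c / b  # model is linear: ordinary Newton step
    disc = b * b - 4 * a * c
    if disc < 0:
        return x - c / b  # no real root of the model: fall back to Newton
    r = math.sqrt(disc)
    h1 = (-b + r) / (2 * a)
    h2 = (-b - r) / (2 * a)
    # keep the root closest to the current iterate
    return x + (h1 if abs(h1) < abs(h2) else h2)

# Example: f(x) = x^3 - 2x - 5, real root near 2.0945514815
f = lambda x: x**3 - 2 * x - 5
df = lambda x: 3 * x**2 - 2
d2f = lambda x: 6 * x

x = 2.0
for _ in range(6):
    x = quad_newton_step(f, df, d2f, x)
print(x)
```

Each iteration costs one evaluation of $f$, $f'$, and $f''$ plus a square root, versus $f$ and $f'$ for plain Newton, so the per-step cost is higher even though (as the comments below suggest) the local convergence order may improve.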

  • What do you mean by time complexity? Do you mean the $\min$ number of flops needed to solve the quadratic given the values of $x,f(x),f'(x),f''(x)$? – copper.hat Aug 23 '18 at 14:49
  • Without making additional assumptions about the time complexity of evaluating $f$ and its derivatives, along with even more significant assumptions about the function, it won't be possible to say anything about the time complexity of Newton's method. Except for a few fairly special cases, Newton's method doesn't have polynomial iteration count complexity. In practice we usually analyze the asymptotic convergence rate of Newton and Newton-like methods. – Brian Borchers Aug 23 '18 at 14:51
  • @copper.hat I have updated my question, please take a look at it. – Deepam Sarmah Aug 23 '18 at 15:27
  • Are you asking about the cost of finding a solution within some tolerance or the cost for a single iteration? – copper.hat Aug 23 '18 at 15:33
  • @copper.hat I am asking the cost of finding a solution within some tolerance. – Deepam Sarmah Aug 24 '18 at 14:55
  • I have come across the same idea independently: a "Quadratic-modified N-R method". The usual N-R method can be thought of - particularly when represented graphically- as a linear algorithm - meaning that it uses straight lines to get closer to the root. This usual linear N-R method has quadratic rate of convergence. I would imagine that if your starting value of $x_0$ is close enough to the root, then the rate of your modified algorithm is quartic. But I'm not 100% sure and I haven't proven this yet. And this assumes it is easy/inexpensive to find the second derivative of your function $f(x)$. – Adam Rubinson Jul 21 '20 at 14:53
  • I should have mentioned in my previous comment (although I ran out of characters): your modified algorithm uses quadratic approximations to the curve at $x_n$ to get to $x_{n+1}$, rather than the usual N-R method, which uses straight-line approximations to the curve at $x_n$ to get to $x_{n+1}$. I think that's what this question is about. – Adam Rubinson Jul 21 '20 at 15:01
  • Closely related- see: https://en.wikipedia.org/wiki/Muller%27s_method and https://en.wikipedia.org/wiki/Halley%27s_method and https://en.wikipedia.org/wiki/Householder%27s_method and maybe even https://en.wikipedia.org/wiki/Steffensen%27s_method and also https://math.stackexchange.com/questions/1464795/what-are-the-difference-between-some-basic-numerical-root-finding-methods – Adam Rubinson Jul 21 '20 at 15:59
  • Muller's method in particular is quite close to OP's suggested method, in that you're approximating the curve of f(x) by a parabola. – Adam Rubinson Jul 21 '20 at 16:12

0 Answers