Newton's Method is an algorithm for solving $g(x) = 0$. Given an initial guess $x_0$, we compute a sequence of points defined by
\begin{equation}
x_{n+1} = x_{n} - \dfrac {g(x_n)}{g'(x_n)}
\end{equation}
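Here is a minimal sketch of this iteration in Python. The function names `g`, `dg`, the tolerance, and the iteration cap are illustrative choices, not fixed by the text:

```python
def newton(g, dg, x0, tol=1e-12, max_iter=50):
    """Newton's Method: iterate x_{n+1} = x_n - g(x_n) / g'(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = g(x) / dg(x)   # assumes g'(x) != 0 near the root
        x = x - step
        if abs(step) < tol:   # stop once the update is negligible
            return x
    return x  # may not have converged within max_iter steps
```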
A first natural question: for which initial approximations $x_0$ does Newton's Method converge quadratically? The following convergence result answers this.
Let $g$ be twice continuously differentiable on the interval $(a, b)$, and let $r \in (a, b)$ be a root of $g$, i.e. $g(r) = 0$. If $g'(r) \neq 0$, then there exists $\delta > 0$ such that Newton's Method converges whenever it is started in the interval $[r - \delta, r + \delta]$. In this case, the sequence converges quadratically.
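A quick numerical check of the quadratic rate, using $g(x) = x^2 - 2$ as an example of my own choosing (its root $\sqrt{2}$ is simple, so the theorem applies). The error roughly squares at each step:

```python
import math

g  = lambda x: x * x - 2       # simple root at sqrt(2): g'(sqrt(2)) != 0
dg = lambda x: 2 * x

x, r = 1.5, math.sqrt(2)
for n in range(5):
    print(n, abs(x - r))       # error roughly squares each iteration
    x = x - g(x) / dg(x)
```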
A case where Newton's Method fails to converge quadratically: consider $g(x) = x^2$. Will Newton's Method converge quadratically to the root $x = 0$? The answer is no, because $x = 0$ is a multiple root. In general, if the root being sought has multiplicity greater than one, the convergence rate is merely linear (errors reduced by a constant factor at each step) unless special steps are taken (one such remedy is shown in the sketch below).
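Here the linear rate can be seen directly: the Newton step for $g(x) = x^2$ simplifies to $x_{n+1} = x_n - x_n^2 / (2x_n) = x_n / 2$, so the error is only halved each iteration. A small sketch makes this concrete; the modified step in the comments is one standard remedy, assuming the multiplicity $m$ of the root is known:

```python
# For g(x) = x**2 the Newton step is x - x**2 / (2*x) = x / 2:
# the error is merely halved each iteration (linear convergence).
x = 1.0
for n in range(5):
    print(n, x)          # 1.0, 0.5, 0.25, 0.125, ...
    x = x - x * x / (2 * x)

# One standard fix, assuming the multiplicity m is known, is the
# modified step x - m * g(x) / g'(x). With m = 2 here it reaches
# the root in a single step: x - 2 * x**2 / (2*x) = 0.
```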
If you believe your function has a simple root (that is, $g(r) = 0$ and $g'(r) \neq 0$) and you have a reasonable starting guess $x_0$, then Newton's Method is the method of choice. In practice, we may not know whether the root is simple, and we may not have a very good starting estimate. In that case, you can use the Bisection Method to get started, then switch to Newton's Method once you are close.
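One way this bisection-then-Newton hybrid might look in Python. This is a sketch under my own assumptions (the function and parameter names are illustrative, and $g(a)$, $g(b)$ are assumed to have opposite signs so that bisection applies):

```python
def hybrid_root(g, dg, a, b, bisect_steps=20, tol=1e-12, max_iter=50):
    """Bisection to localize the root, then Newton's Method to polish it.

    Assumes g(a) and g(b) have opposite signs.
    """
    # Phase 1: bisection shrinks [a, b] around a sign change of g.
    for _ in range(bisect_steps):
        m = 0.5 * (a + b)
        if g(a) * g(m) <= 0:
            b = m
        else:
            a = m
    # Phase 2: Newton's Method, started from the small bracket's midpoint.
    x = 0.5 * (a + b)
    for _ in range(max_iter):
        step = g(x) / dg(x)
        x -= step
        if abs(step) < tol:
            break
    return x
```

The design choice here is that bisection is slow but guaranteed to shrink the bracket, while Newton's Method is fast but only locally convergent; running a fixed number of bisection steps first is one simple way to land inside the $[r - \delta, r + \delta]$ neighborhood the theorem requires.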