I am reading *An Introduction to the Conjugate Gradient Method Without the Agonizing Pain*. On page 20, the author writes:
The convergence results for Steepest Descent are
$$ \lVert e_{(i)} \rVert_A \le \left(\frac{\kappa-1}{\kappa+1}\right)^i \lVert e_{(0)} \rVert_A $$
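(This bound is Equation 28 in the paper.) Before asking, I sanity-checked it numerically; here is a minimal steepest descent sketch of my own (random SPD test matrix, exact line search), not from the paper:

```python
import numpy as np

# Check the page-20 bound empirically: run steepest descent on a random
# SPD system and compare ||e_i||_A with ((kappa-1)/(kappa+1))^i ||e_0||_A.
rng = np.random.default_rng(0)
n = 50
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.linspace(1.0, 20.0, n)           # eigenvalues, so kappa = 20
A = Q @ np.diag(eigs) @ Q.T                # symmetric positive definite
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)

def a_norm(v):                             # ||v||_A = sqrt(v^T A v)
    return np.sqrt(v @ A @ v)

kappa = eigs[-1] / eigs[0]
x = np.zeros(n)
e0 = a_norm(x - x_star)
for i in range(1, 31):
    r = b - A @ x                          # residual (= -gradient)
    x = x + (r @ r) / (r @ A @ r) * r      # exact line-search step
    assert a_norm(x - x_star) <= ((kappa - 1) / (kappa + 1)) ** i * e0 + 1e-12
print("bound holds for all 30 iterations")
```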
On page 37, the author points out:
Suppose we wish to perform enough iterations to reduce the norm of the error by a factor of $\epsilon$; that is, $\lVert e_{(i)} \rVert_A \le \epsilon \lVert e_{(0)} \rVert_A$. Equation 28 can be used to show that the maximum number of iterations required to achieve this bound using Steepest Descent is $$ i \le \left\lceil\frac{1}{2} \kappa \ln\left(\frac{1}{\epsilon}\right) \right\rceil $$
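For concreteness, with $\kappa = 100$ and $\epsilon = 10^{-6}$ (my numbers, not the paper's) this gives $i \le \left\lceil 50 \ln 10^{6} \right\rceil = 691$ iterations.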
This seems like it should be trivial, but I cannot prove it. Here is my attempt. I want
$$ \lVert e_{(i)} \rVert_A \le \epsilon \lVert e_{(0)} \rVert_A, $$
and by Equation 28 it suffices that
$$ \begin{align} \left(\frac{\kappa-1}{\kappa+1}\right)^i &\le \epsilon \\ i \ln\left(\frac{\kappa-1}{\kappa+1}\right) &\le \ln \epsilon \\ i \ln\left(\frac{\kappa+1}{\kappa-1}\right) &\ge \ln \left(\frac{1}{\epsilon}\right) \\ i &\ge \left[\ln\left(\frac{\kappa+1}{\kappa-1}\right)\right]^{-1} \ln \left(\frac{1}{\epsilon}\right). \end{align} $$
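Numerically, this last line is consistent with the claimed bound. Here is a quick Python check of my own (not from the paper) comparing the exact threshold $\ln\left(\frac{1}{\epsilon}\right) / \ln\left(\frac{\kappa+1}{\kappa-1}\right)$ against $\left\lceil \frac{1}{2}\kappa \ln\left(\frac{1}{\epsilon}\right) \right\rceil$:

```python
import math

# Compare the exact iteration threshold from my last line with the
# paper's claimed bound ceil(0.5 * kappa * ln(1/eps)).
for kappa in [2.0, 10.0, 100.0, 1e4]:
    for eps in [1e-2, 1e-6, 1e-12]:
        exact = math.log(1 / eps) / math.log((kappa + 1) / (kappa - 1))
        bound = math.ceil(0.5 * kappa * math.log(1 / eps))
        assert math.ceil(exact) <= bound   # bound is never violated
        print(f"kappa={kappa:7g}  eps={eps:.0e}  exact={exact:11.2f}  bound={bound}")
```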
Comparing the last line with the claimed bound, it seems I only need $\left[\ln\left(\frac{\kappa+1}{\kappa-1}\right)\right]^{-1} \le \frac{\kappa}{2}$, but I do not see why that holds. How do I proceed?