Consider a series $S$, and ask: does $S$ converge? I would hypothesise that if the terms of a sum do not approach zero 'quickly enough,' its sequence of partial sums will grow with the increasing number of terms rather than settle as the magnitude of the terms decreases, and the series will not converge. Is this strictly correct?
Can someone provide a sketch of a proof of why this is or isn't the case using generic objects, preferably in terms of the arbitrary closeness of partial sums in Cauchy's criterion? As an example, can we prove rigorously that $$S=\sum_{n=1}^{\infty}\frac{1}{n^p},\qquad 0<p\leq1,$$ does not decrease fast enough for the differences between its partial sums to become arbitrarily small (see this page for further elaboration)? Can we do it without using the fact that $\ln(n)$ grows without bound? Or, otherwise, why is $p=1+\epsilon$, i.e. any $p>1$, the minimum 'speed' of decrease needed for convergence?
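To illustrate the sort of bound I'm after, here is a grouping argument I have seen hinted at, which avoids $\ln(n)$ entirely (a sketch; I may be applying it loosely). Writing $S_N$ for the $N$-th partial sum, take blocks of doubling length: for $0<p\leq1$ and every $k\geq0$,
$$S_{2^{k+1}}-S_{2^k}=\sum_{n=2^k+1}^{2^{k+1}}\frac{1}{n^p}\;\geq\;2^k\cdot\frac{1}{\left(2^{k+1}\right)^p}=2^{\,k(1-p)-p}\;\geq\;\frac{1}{2},$$
since $k(1-p)\geq0$ and $p\leq1$. So consecutive partial sums never get arbitrarily close, and the Cauchy criterion fails. Is this the kind of argument that generalises, and does its failure for $p>1$ (where $2^{\,k(1-p)-p}\to0$ geometrically) explain the 'minimum speed' of convergence?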