For what values of $\alpha >0$ does the series $$\sum_{k=1}^{\infty} \frac{1}{(k+1)\left(\ln(k+1)\right)^\alpha}$$ converge?
In the case that $\alpha = 1$, we have the series $$\sum_{k=1}^{\infty} \frac{1}{(k+1)\ln(k+1)}.$$ Now consider the function $$ f(x) = \frac{1}{(x+1)\ln(x+1)}$$ with $f:[1, \infty)\rightarrow \mathbb{R}$. This function is positive, continuous, and monotonically decreasing, so the integral test applies. For each index $n$, $$ \int_{1}^{n} f(x)\,dx = \int_{1}^{n} \frac{1}{(x+1)\ln(x+1)}\, dx = \ln[\ln(n+1)]-\ln(\ln 2)$$ by the First Fundamental Theorem (Integrating Derivatives); the substitution behind this is spelled out at the end of the post. The sequence $\ln[\ln(n+1)]-\ln(\ln 2)$ is unbounded, so by the integral test the series $$\sum_{k=1}^{\infty} \frac{1}{(k+1)\ln(k+1)}$$ diverges.

Two more cases remain: $0<\alpha<1$ and $\alpha>1$. This is where I get stuck. The ratio and root tests are inconclusive in my attempts to use them. I would like to use the comparison test, but I am not sure what to compare this series to. I was wondering if anyone could please help me out.
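
For reference, here is a minimal sketch of the computation behind the antiderivative quoted above, using the substitution $u = \ln(x+1)$, so that $du = \frac{dx}{x+1}$ and the limits $x=1$, $x=n$ become $u=\ln 2$, $u=\ln(n+1)$:
$$\int_{1}^{n} \frac{dx}{(x+1)\ln(x+1)} = \int_{\ln 2}^{\ln(n+1)} \frac{du}{u} = \Big[\ln u\Big]_{\ln 2}^{\ln(n+1)} = \ln[\ln(n+1)] - \ln(\ln 2).$$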