
I was doing exercises on convergence of sequences in real analysis, but I came across a problem I don't know how to prove. Note: the book has not covered Cauchy sequences yet, and I don't know what they are.

Prove whether the sequence $$x_0 = a, \quad a > 0,$$ $$x_{n+1} = x_n + \frac{1}{x_n}$$ is convergent or divergent.

I first wanted to prove that it is convergent by showing that it is monotonically increasing and bounded.
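For the increasing part, I think the following works: since $x_0 = a > 0$, an induction shows $x_n > 0$ for all $n$, and then $$x_{n+1} - x_n = \frac{1}{x_n} > 0,$$ so the sequence is strictly increasing.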

I didn't know how to prove that it is bounded, and in fact I believe the sequence is divergent. So instead I supposed that it is convergent; since it is strictly increasing, I could use the fact that the limit of a sequence depends only on its long-run behavior to say that

$$\lim_{n\to\infty} x_n = \lim_{n\to\infty} x_{n+1}$$ and $$x = x + \frac{1}{x} \implies 0 = \frac{1}{x},$$ which is only "true" if $x$ approaches infinity, i.e. no finite limit $x$ can satisfy it.
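Spelled out, my reasoning (if it is valid) is: suppose $x_n \to L$ for some real $L$. Since the sequence is strictly increasing and $x_0 = a > 0$, we would need $L \geq a > 0$, and taking limits on both sides of the recursion gives $$L = L + \frac{1}{L} \implies \frac{1}{L} = 0,$$ which no real $L$ satisfies. So the sequence cannot converge, and since it is increasing it should diverge to $+\infty$.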

I'm not sure if this works, but I would like to know what is wrong with it rather than having my question closed right away, please.
