Saturday 20 September 2014

convergence divergence - Prove that the given sequence converges

Prove that the given sequence $\{a_n\}$ converges:



$a_1 > 0, a_2 > 0$




$a_{n+1} = \frac{2}{a_n + a_{n-1}}$ for $n \geq 2$



From what I observed, the sequence does not seem to be monotonic, but it should be bounded: $a_1$ and $a_2$ are arbitrary positive numbers, and the recurrence keeps every subsequent term positive.



If the limit of the sequence exists, it must equal 1: let $x = \lim_{n\to\infty} a_n$. Taking limits on both sides of the recurrence gives $x = \frac{2}{x + x} = \frac{1}{x}$, so $x^2 = 1$ and $x = 1$ or $x = -1$. We choose $x = 1$ since $x$ must be positive.



The only idea that came to my mind is bounding the sequence using two other sequences that could be shown to converge to 1 (Let these sequences be $b_n$ and $c_n$):



$b_n \leq a_n \leq c_n$




If we could find such sequences and prove that they both converge to 1, the problem would be solved by the squeeze theorem. So I tried to bound the sequence from both sides and show that the bounding limits equal 1, but I failed to find such sequences. Sequences of the form presented in the problem are a little difficult to analyze, since the terms fluctuate a lot.
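As a quick sanity check (not a proof), one can iterate the recurrence numerically from a few arbitrary positive starting pairs and watch the terms settle toward 1. Here is a minimal sketch; the function name `iterate_sequence` is my own choice for illustration:

```python
def iterate_sequence(a1, a2, n_terms):
    """Return the first n_terms of the sequence a_{n+1} = 2 / (a_n + a_{n-1})."""
    terms = [a1, a2]
    while len(terms) < n_terms:
        terms.append(2.0 / (terms[-1] + terms[-2]))
    return terms

# Try a few arbitrary positive starting pairs and inspect the 50th term.
for a1, a2 in [(0.1, 5.0), (3.0, 0.2), (1.0, 1.0)]:
    tail = iterate_sequence(a1, a2, 50)[-1]
    print(f"a1={a1}, a2={a2}: a_50 = {tail:.10f}")
```

In every trial the terms oscillate around 1 with shrinking amplitude, which is consistent with the conjectured limit, though of course this does not replace a proof.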



I am not sure how to start off, any ideas or tricks for such problems would be appreciated.
