Question: Prove that if $u_n>0$ and $\lim_{n\to\infty} n\left(\frac{u_n}{u_{n+1}}-1\right)=l$, then $\sum_{n=1}^{\infty}u_n$ diverges if $l<1$.
My Attempt: We choose a positive number $\epsilon$ such that $l+\epsilon<1$.
Since $\lim_{n\to\infty} n\left(\frac{u_n}{u_{n+1}}-1\right)=l$, for any $\epsilon>0$ there exists a natural number $m$ such that $\left|\,n\left(\frac{u_n}{u_{n+1}}-1\right)-l\,\right|<\epsilon$ $\forall\, n\ge m$.
Therefore, we have
$$l-\epsilon < n\left(\frac{u_n}{u_{n+1}}-1\right) < l+\epsilon \quad \forall\, n\ge m.$$
Any suggestions on how to proceed from here?
P.S. I have read other questions here on MathsSE regarding this proof specifically but I am not satisfied with these answers as one uses big O notation and another has a different form of Raabe's test. I don't want answers with big O notation. Proofs with other methods are welcome but the ones following from where I have ended are most preferred.
Answer
Your first steps are all correct.
You have obtained the inequality
$$\frac{u_n}{u_{n+1}}<\frac{r}{n}+1$$
for $n\ge m$, where $r=l+\epsilon<1$, but have realized that, in this form, it is rather hard to do anything with it.
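If the step is not obvious, it is just the right-hand half of your two-sided bound, with $r=l+\epsilon$, divided through by $n$:
$$n\left(\frac{u_n}{u_{n+1}}-1\right)<l+\epsilon=r \;\Longrightarrow\; \frac{u_n}{u_{n+1}}<\frac{r}{n}+1.$$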
Here is the classical argument as to how to proceed.
Rewrite this as
$$n\,\frac{u_n}{u_{n+1}}-(n+1)<r-1\le 0.$$
Then observe that
$$n\,u_n\le (n+1)\,u_{n+1}$$
for all $n\ge m$. Thus the sequence $\{n\,u_n\}$ is increasing from the index $m$ on and, in particular, there is some positive constant $c$ (for instance $c=m\,u_m$) for which $n\,u_n\ge c$ for all $n\ge m$.
Thus for $n\ge m$ we have
$$u_n\ge \frac{c}{n},$$
and a comparison with the harmonic series establishes divergence, as you desired.
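As a quick illustration (an example of my own, not part of the argument above), take $u_n=1/\sqrt{n}$. Then
$$n\left(\frac{u_n}{u_{n+1}}-1\right)=n\left(\sqrt{1+\tfrac{1}{n}}-1\right)=\frac{1}{\sqrt{1+\tfrac{1}{n}}+1}\;\longrightarrow\;\frac{1}{2}=l<1,$$
and indeed $\sum 1/\sqrt{n}$ diverges, consistent with the conclusion (here $u_n\ge 1/n$ directly).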
There is a more general version of Raabe's test, known as Kummer's test, with very much the same proof.
For that you assume there is a sequence of positive numbers $D_n$ and you compute the limit
$$L=\lim_{n\to\infty}\left[D_n\,\frac{u_n}{u_{n+1}}-D_{n+1}\right].$$
Raabe's test is just Kummer's test with $D_n=n$.
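To spell this out (a remark of mine, not in the original answer): with $D_n=n$ the Kummer expression is the Raabe expression shifted by $1$,
$$n\,\frac{u_n}{u_{n+1}}-(n+1)=n\left(\frac{u_n}{u_{n+1}}-1\right)-1,$$
so $L=l-1$, and the divergence condition $l<1$ is exactly $L<0$.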
The divergence part of Kummer's test (and hence also Raabe's test) doesn't actually require limits. You simply need that
$$D_n\,\frac{u_n}{u_{n+1}}-D_{n+1}\le 0$$
for all sufficiently large $n$, together with the divergence of $\sum_{n=1}^{\infty}1/D_n$, and you can conclude divergence.
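For completeness, here is a sketch of the "very much the same proof" in this general setting (my paraphrase of the argument above with $n$ replaced by $D_n$): the assumed inequality rearranges, after multiplying by $u_{n+1}>0$, to
$$D_n\,u_n\le D_{n+1}\,u_{n+1}$$
for all sufficiently large $n$, so the sequence $\{D_n\,u_n\}$ is eventually nondecreasing and hence bounded below by some $c>0$ (its first term from that index on). Thus $u_n\ge c/D_n$ eventually, and comparison with the divergent series $\sum 1/D_n$ shows that $\sum u_n$ diverges.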