I am attempting to determine whether the series below converges or diverges:
$$\sum_{n=2}^{\infty} \frac{1}{n (\ln n)^2}$$
I decided to try the limit comparison test. I set $f(n) = \frac{1}{n (\ln n)^2}$ and $g(n) = \frac{1}{n}$, and since both tend to $0$, I evaluated the limit of the ratio with L'Hôpital's rule:
$$
\lim_{n \to \infty} \frac{f'(n)}{g'(n)} =
\lim_{n \to \infty} \frac{-\frac{\ln n + 2}{n^2 (\ln n)^3}}{-\frac{1}{n^2}} =
\lim_{n \to \infty} \frac{\ln n + 2}{(\ln n)^3} =
\lim_{n \to \infty} \left( \frac{1}{(\ln n)^2} + \frac{2}{(\ln n)^3} \right) = 0
$$
I know that $\sum \frac{1}{n}$ diverges, being the harmonic series, and so concluded that my original series $\sum \frac{1}{n (\ln n)^2}$ must also diverge.
However, according to Wolfram Alpha, and as suggested by this math.SE question, the series actually converges.
What did I do wrong?
Answer
In the limit comparison test, $\sum_n f(n)$ and $\sum_n g(n)$ are guaranteed to behave the same way (both converge or both diverge) only when
$$\lim_{n \to \infty} \frac{f(n)}{g(n)} = c \in (0,\infty).$$
In your case the limit is $0$, which is not enough to apply the test.
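A limit of $0$ does carry one-sided information: for series with positive terms, if $\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0$ and $\sum_n g(n)$ converges, then $\sum_n f(n)$ converges as well. But here $\sum_n g(n) = \sum_n \frac{1}{n}$ diverges, so the test tells you nothing. (Note also that no differentiation is needed: the test compares $f(n)/g(n)$ directly, and $\frac{f(n)}{g(n)} = \frac{1}{(\ln n)^2} \to 0$ immediately.)

To see that your series actually converges, one standard route is the integral test with the substitution $u = \ln x$:
$$\int_2^\infty \frac{dx}{x (\ln x)^2} = \int_{\ln 2}^\infty \frac{du}{u^2} = \left[ -\frac{1}{u} \right]_{\ln 2}^\infty = \frac{1}{\ln 2} < \infty,$$
so $\sum_{n=2}^\infty \frac{1}{n (\ln n)^2}$ converges.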
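If you want a numerical sanity check as well, here is a minimal Python sketch (the cutoff `N = 10**6` is an arbitrary choice of mine) that brackets the sum between the integral-test tail bounds:

```python
import math

# Bracket S = sum_{n=2}^inf 1/(n (ln n)^2) using the integral test.
# f(x) = 1/(x (ln x)^2) is positive and decreasing for x >= 2, so
#   1/ln(N+1) = int_{N+1}^inf f  <=  sum_{n>N} f(n)  <=  int_N^inf f = 1/ln(N).

def f(n):
    return 1.0 / (n * math.log(n) ** 2)

N = 10**6  # arbitrary cutoff; larger N gives a tighter bracket
partial = sum(f(n) for n in range(2, N + 1))

lower = partial + 1.0 / math.log(N + 1)  # lower bound on S
upper = partial + 1.0 / math.log(N)      # upper bound on S
print(f"{lower:.8f} <= S <= {upper:.8f}")
```

The two bounds differ by $\frac{1}{\ln N} - \frac{1}{\ln(N+1)}$, which is tiny for large $N$, so this pins the value of the sum down tightly.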