Wednesday, 10 September 2014

Series: 1 to infinity vs. 1 to n as n approaches infinity



$$
f(x) = \frac{\sum_{i=1}^{\infty}x^i}{1+\sum_{i=1}^{\infty}x^i} = \frac{\sum_{i=1}^{\infty}x^i}{\sum_{i=0}^{\infty}x^i} = x
$$



I haven't taken many math classes (so bear with me if I'm wrong), but I think this is correct, since $$ \frac{\sum \limits_{i=1}^{\infty} \ k^i}{k}=\sum_{i=0}^{\infty}k^i $$
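
Spelling out the index shift behind that identity (it only makes sense when the series converges, i.e. $|k|<1$):
$$
\frac{1}{k}\sum_{i=1}^{\infty}k^i=\sum_{i=1}^{\infty}k^{i-1}=\sum_{i=0}^{\infty}k^i .
$$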




Also, even though both series diverge when $x$ is big enough, I don't think the ratio diverges.



My question has to do with this:
$$
g(x)=\lim_{n\to\infty}{\frac{\sum \limits_{i=1}^{n}\ x^i}{1+\sum\limits_{i=1}^{n} \ x^i}}
$$



If $x>1$, then no matter how large $n$ is, the fraction is never above $1$, since the denominator is larger than the numerator. So $g(x)$ is not always equal to $x$.
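
For instance, taking $x=2$ and $n=10$, the partial sums give
$$
\frac{\sum_{i=1}^{10}2^i}{1+\sum_{i=1}^{10}2^i}=\frac{2046}{2047}\approx 0.9995,
$$
which is still below $1$ even though both sums are already large.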




Does a sum from one to infinity mean something different than a sum from $1$ to $n$ as $n$ approaches infinity?



If not, then where is my mistake, given that $g$ is not equal to $f$?


Answer



The function $g(x)$ poses no problem, except at $x=-1$. In that sense it is very different from $f(x)$, which is only defined for $|x|<1$, as explained by Arturo Magidin and Michael Hardy. The calculation you made at the beginning of the post is a bit informal, but it is essentially correct for $|x|<1$. (Please see the comment at the end.) Your observation that $g(x)$ cannot always be equal to $x$ is also well-founded. You correctly saw that $g(x)$ exists for any positive $x$; though you did not give a proof, with some work your observation about the fraction being less than $1$ can be turned into one. We now solve the problem in detail.



Note that
$$x+x^2+\cdots +x^n=x(1+x+\cdots +x^{n-1})=\frac{x(1-x^n)}{1-x}$$
(if $x\ne 1$).




So we can find an explicit formula for $g_n(x)$, where
$$g_n(x)=\frac{\sum_{i=1}^{n}x^i}{1+\sum_{i=1}^{n}x^i}.$$
The result (except when $x=\pm 1$) is
$$g_n(x)=\frac{x(1-x^n)}{1-x^{n+1}}.$$
If $|x|<1$, then $\displaystyle\lim_{n\to\infty} x^n=0$, and therefore $\displaystyle\lim_{n\to\infty} g_n(x)=x$. If $|x|>1$, the limit is $1$.
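
For completeness, here is the algebra that turns the geometric-sum formula into that closed form (still assuming $x\ne \pm 1$):
$$
g_n(x)=\frac{\dfrac{x(1-x^n)}{1-x}}{1+\dfrac{x(1-x^n)}{1-x}}
      =\frac{x(1-x^n)}{(1-x)+x(1-x^n)}
      =\frac{x(1-x^n)}{1-x^{n+1}}.
$$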



The case $x=-1$ is hopeless, since the denominator is $0$ for all odd $n$. In the case $x=1$, a separate calculation shows that the limit is $1$.
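
Indeed, when $x=1$ every term of the sum equals $1$, so
$$
g_n(1)=\frac{n}{1+n}\longrightarrow 1\quad\text{as } n\to\infty .
$$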



Comment: The case $|x|>1$ is obvious without calculation. If $|x|>1$, and $n$ is large, then the numerator and denominator of $g_n(x)$ each have very large absolute value. But they differ by $1$, so their ratio is nearly $1$. The case $|x|<1$ is also obvious without much calculation. The numerator of $g_n(x)$ is $x(1+x+\cdots +x^{n-1})$, and the denominator is $1+x+\cdots +x^n$. As long as we know that $\displaystyle\lim_{m\to\infty}(1+x+\cdots +x^m)$ exists, and is non-zero, and that $g_n(x)$ is always defined, we can conclude that $g_n(x)$ has limit $x$. We do not need to know a formula for $\displaystyle\lim_{m\to\infty}(1+x+\cdots +x^m)$.
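
In symbols, the $|x|<1$ argument says
$$
\lim_{n\to\infty}g_n(x)=\frac{x\displaystyle\lim_{n\to\infty}\left(1+x+\cdots+x^{n-1}\right)}{\displaystyle\lim_{n\to\infty}\left(1+x+\cdots+x^{n}\right)}=\frac{xS}{S}=x,\qquad\text{where } S=\sum_{i=0}^{\infty}x^i\ne 0 .
$$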

