Let's analyze this expression:
$\lim\limits_{n\to\infty}\left(1+\frac{1}{n}\right)^n$
This is the definition of $e$, which, as we know, is not equal to $1$. So what is wrong with the following "logic"?
As $\lim\limits_{n\to\infty}(a_n b_n) = \lim\limits_{n\to\infty}(a_n)\times\lim\limits_{n\to\infty}(b_n)$ and $\lim\limits_{n\to\infty}\left(1+\frac{1}{n}\right) = 1$, we can say that $\lim\limits_{n\to\infty}\left(1+\frac{1}{n}\right)^n = 1^n$, which is equal to $1$.
I know something's wrong there, but the question is: what?
Answer
What you actually proved is that
$$\lim_{k\to\infty} \left(\lim_{n\to\infty}\left(1+\frac{1}{n}\right)\right)^k = 1,$$
which is correct, but the LHS is not equal to $e$.
The problem is most apparent at the end, where you are left with $1^n$ (really $\lim_{k\to\infty} 1^k$) after having already discarded the limit $\lim_{n\to\infty}$. The product rule for limits lets you split a product of a fixed, finite number of sequences; here the number of factors is $n$ itself, the very variable being sent to infinity, so the rule does not apply. In effect, you silently replaced a single limit with two nested ones, and nested limits need not agree with the original.
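For intuition, here is a quick numerical check (a minimal Python sketch added for illustration, not part of the proof): the diagonal sequence $(1+1/n)^n$ creeps up toward $e \approx 2.718$, while replacing the base by its limit first and only then raising to a power stays pinned at $1$.

```python
import math

# Compare the diagonal limit (1 + 1/n)^n, where base and exponent share
# the same n, with the "inner limit first" version 1^n.
for n in (10, 100, 1000, 10_000, 100_000):
    diagonal = (1 + 1 / n) ** n   # exponent grows together with the base's n
    inner_first = 1 ** n          # base already replaced by its limit, 1
    print(f"n = {n:>6}:  (1+1/n)^n = {diagonal:.6f},  1^n = {inner_first}")

print(f"e          = {math.e:.6f}")
```

Running this shows the first column approaching $e$ while the second stays identically $1$, which is exactly the gap between the original limit and the nested-limit expression above.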