Monday, 10 October 2016

calculus - Is e^x=\exp(x) and why?




In the comments to this question a discussion came up whether we have e^x=\exp(x) by definition and what the "correct" definition of \exp(x) is. Building on that, I want to lay out the problem with this question and give one way to prove e^x=\exp(x) for x\in\mathbb R.



(Be warned: this is a long post)


Answer



If one first defines e:=\lim\limits_{n\to\infty}\left(1+\frac{1}{n}\right)^n and then goes on to define a real-valued function \exp:\mathbb R\rightarrow\mathbb R,~x\mapsto\exp(x):=e^x, one has e^x=\exp(x) by definition but then has to show e^x=\sum\limits_{k=0}^{\infty}\frac{x^k}{k!}. This involves finding the derivative of \exp without the (easy) approach using power series. One also needs to define what e^x means for x\in\mathbb R\setminus\mathbb Q.



If one instead defines e:=\lim\limits_{n\to\infty}\left(1+\frac{1}{n}\right)^n and then chooses the approach of defining \exp(x):=\sum\limits_{k=0}^{\infty}\frac{x^k}{k!} via a power series, we don't get e^x=\exp(x) by definition and need to prove it. So either way there is a result that needs to be shown.



Of course there is no such thing as right or wrong when it comes to definitions, and there is not only one way to define e=2.7182\ldots: another approach uses the nested intervals \left[\left(1+\frac{1}{n}\right)^n,\left(1+\frac{1}{n}\right)^{n+1}\right],~n\in\mathbb N, whereas \exp can also be defined as the unique solution of y'=y with y(0)=1, and there are more. The trouble with different definitions for the same thing is that one has to prove they are equivalent.
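As a quick numerical sanity check (not part of the argument), the limit definition and the series definition of e can be compared directly. This is a minimal Python sketch; the cut-off values n=10^6 and 20 series terms are arbitrary choices of mine.

```python
from math import factorial

# e via the limit definition: (1 + 1/n)^n for one large n
n = 10**6
limit_estimate = (1 + 1/n) ** n

# e via the power series definition: a partial sum of sum_{k>=0} 1/k!
series_estimate = sum(1 / factorial(k) for k in range(20))

print(limit_estimate)   # roughly 2.71828..., the limit converges rather slowly
print(series_estimate)  # 2.718281828459045, the series converges very fast
```

Both numbers agree with e to the printed precision, which is of course no substitute for the proof below.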




For this answer I assume that one defines e as the limit of the sequence a_n=\left(1+\frac{1}{n}\right)^n and \exp(x)=\sum\limits_{k=0}^{\infty}\frac{x^k}{k!}. I choose this way because, for one, that's how I learned it, but I also think it has advantages over defining \exp(x)=e^x, as we can directly apply the theory of power series to get properties of the exponential function (monotonicity, derivative, etc.). In the following proof I will use several properties of \exp, such as \exp(-x)=\frac{1}{\exp(x)}, without proving them.
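To make such unproved properties at least plausible, here is a small sketch checking \exp(-x)\cdot\exp(x)=1 numerically via partial sums of the series; the helper name exp_series and the sample values of x are my own choices.

```python
from math import factorial

def exp_series(x, terms=60):
    """Partial sum of the power series sum_{k>=0} x^k / k!."""
    return sum(x**k / factorial(k) for k in range(terms))

for x in (0.5, 1.0, 4.0):
    # exp(-x) * exp(x) should equal 1, i.e. exp(-x) = 1 / exp(x)
    print(x, exp_series(-x) * exp_series(x))
```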






We have e:=\lim\limits_{n\to\infty}\left(1+\frac{1}{n}\right)^n and further define \exp:\mathbb R\rightarrow\mathbb R,~x\mapsto\exp(x):=\sum\limits_{k=0}^{\infty}\frac{x^k}{k!}.



\exp(x) is well-defined because the series \sum\limits_{k=0}^{\infty}\frac{x^k}{k!} is absolutely convergent for every x\in\mathbb R (this can be shown via the ratio test: \frac{|x|^{k+1}/(k+1)!}{|x|^k/k!}=\frac{|x|}{k+1}\to 0<1). We want to show: \lim\limits_{x\to\infty}\left(1+\frac{a}{x}\right)^x=e^a=\exp(a) for all a\in\mathbb R.
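For illustration only, the partial sums of this series can be compared with a library exponential; this is a sketch with an arbitrary number of terms and arbitrary sample points, again using a helper name exp_series of my own.

```python
from math import exp, factorial

def exp_series(x, terms=40):
    """Partial sum of the power series sum_{k>=0} x^k / k!."""
    return sum(x**k / factorial(k) for k in range(terms))

for x in (-3.0, 0.5, 2.0, 10.0):
    # the partial sums stabilise quickly and agree with math.exp
    print(x, exp_series(x), exp(x))
```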



First we take a look at \exp(n) with n\in\mathbb Z and show:





\exp(n)=e^n.




Using the binomial theorem we first have \left(1+\frac 1n\right)^n=\sum\limits_{k=0}^{n}\binom{n}{k}\left(\frac{1}{n}\right)^k=\sum\limits_{k=0}^n \frac{1}{k!}\underbrace{\frac{n\cdot (n-1)\cdot \dots \cdot(n-k+1)}{n\cdot n\cdot \dots \cdot n}}_{\leq 1}\leq\sum\limits_{k=0}^n\frac{1}{k!}\leq\exp(1) and thus we have e=\lim\limits_{n\to\infty}\left(1+\frac{1}{n}\right)^n\leq\exp(1).



Now for n>m we get \left(1+\frac{1}{n}\right)^n=\sum\limits_{k=0}^n\binom{n}{k}\frac{1}{n^k}>\sum\limits_{k=0}^m\binom{n}{k}\frac{1}{n^k}=\sum\limits_{k=0}^m\frac{1}{k!}\cdot 1 \cdot \left(1-\frac{1}{n}\right)\cdot\dots\cdot\left(1-\frac{k-1}{n}\right).



On the RHS we have m+1 terms with not more than m+1 factors, thus we can take the limit n\to\infty on both sides and get: e\geq \sum\limits_{k=0}^m\frac{1}{k!} and therefore e\geq\lim\limits_{m\to\infty}\sum\limits_{k=0}^m\frac{1}{k!}=\exp(1). As we have e\leq\exp(1) and e\geq\exp(1) we conclude: e=\exp(1).
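The two inequalities used in this squeeze argument can also be checked numerically; the following sketch uses arbitrary values of n and math.e as a stand-in for the limit.

```python
from math import e, factorial

for n in (1, 2, 5, 10, 100, 1000):
    a_n = (1 + 1/n) ** n
    partial_sum = sum(1 / factorial(k) for k in range(n + 1))
    # (1 + 1/n)^n <= sum_{k=0}^n 1/k! <= e, up to floating point error
    print(n, a_n <= partial_sum <= e + 1e-12)
```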




Via induction we now get e^n=\exp(n) for all n\in\mathbb N; in the inductive step we use \exp(n+1)=\exp(n)\cdot\exp(1)=e^n\cdot e^1=e^{n+1}. For n\in\mathbb Z,~n<0 we then use \exp(n)=\left(\exp(-n)\right)^{-1}=\left(e^{-n}\right)^{-1}=e^n. This proves our first statement. \square



I included this statement and its proof as it only needs the functional equation \exp(x+y)=\exp(x)\exp(y); using more properties of \exp would of course make this much easier.
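Purely as a numerical cross-check of the integer case, one can compare the series with integer powers of e; the helper exp_series, the tolerance and the range of n below are arbitrary choices of mine.

```python
from math import factorial

def exp_series(x, terms=60):
    """Partial sum of the power series sum_{k>=0} x^k / k!."""
    return sum(x**k / factorial(k) for k in range(terms))

e = exp_series(1)
for n in range(-5, 6):
    # exp(n) should agree with e**n up to floating point error
    print(n, abs(exp_series(n) - e**n) < 1e-9)
```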




For all x\in\mathbb R we have: \displaystyle e^x=\exp(x).




For this we use that \exp:\mathbb R\rightarrow (0,\infty) is bijective with \ln as its inverse function. We further use the definition of an arbitrary exponential function: for a>0 the function \exp_a:\mathbb R\rightarrow (0,\infty),~x\mapsto a^x:=\exp(x\cdot\ln(a)) is well-defined.




Thus we get: e^x=\exp(x\cdot\ln(e))=\exp(x). This proves our statement. \square
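The definition a^x:=\exp(x\cdot\ln(a)) translates directly into code; in this sketch the function name exp_a is mine, and math.exp and math.log play the roles of \exp and \ln.

```python
from math import e, exp, log

def exp_a(a, x):
    """General exponential a^x, defined here as exp(x * ln(a)) for a > 0."""
    return exp(x * log(a))

print(exp_a(2.0, 10.0), 2.0 ** 10.0)  # both approximately 1024.0
print(exp_a(e, 3.0), exp(3.0))        # e^x via the definition vs. exp(x) directly
```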



We have now shown that \exp(x)=e^x, but this only gives us \exp(x)=e^x=\left(\lim\limits_{n\to\infty}\left(1+\frac{1}{n}\right)^n\right)^x, so we still have to prove that \lim\limits_{x\to\infty}\left(1+\frac{a}{x}\right)^x=e^a,~a\in\mathbb R.
For a\in\mathbb R we define a function F_a via F_a: D\rightarrow \mathbb R,~F_a(x)=x\ln\left(1+\frac{a}{x}\right)=\ln\left(\left(1+\frac{a}{x}\right)^x\right). As \frac{a}{x}\rightarrow 0 for x\to\infty we can choose D=(\alpha,\infty)\subseteq (0,\infty) with \alpha>|a|, so that 1+\frac{a}{x}>0 and F_a is well-defined. Thus we get: \left(1+\frac{a}{x}\right)^{x}=e^{F_a(x)}.



We write \displaystyle F_a(x)=\frac{\ln\left(1+\frac{a}{x}\right)}{\frac{1}{x}} and with \lim\limits_{x\to\infty}\ln\left(1+\frac{a}{x}\right)=\ln(1)=0=\lim\limits_{x\to\infty}\frac{1}{x} and \displaystyle\frac{d}{dx}\frac 1x=-\frac{1}{x^2}\neq 0 for x\in D we apply L'Hospital's rule:
\lim\limits_{x\to\infty} F_a(x)=\lim\limits_{x\to\infty}\frac{\frac{1}{1+\frac{a}{x}}\cdot\left(-\frac{a}{x^2}\right)}{-\frac{1}{x^2}}=\lim\limits_{x\to\infty} \frac{a}{1+\frac{a}{x}}=a.
As \exp is continuous we finally get: \lim\limits_{x\to\infty}\left(1+\frac{a}{x}\right)^x=\exp\left(\lim\limits_{x\to\infty}F_a(x)\right)=\exp(a)=e^a. \blacksquare
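A numerical illustration of this final limit, with arbitrary sample values of a and x:

```python
from math import exp

for a in (-2.0, 1.0, 3.5):
    for x in (10.0, 1000.0, 10.0**6):
        # (1 + a/x)^x approaches e^a as x grows
        print(a, x, (1 + a/x) ** x, exp(a))
```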







A few more words on the definition of \exp as the unique solution of y'=y,~y(0)=1:




Let I\subseteq\mathbb R be an interval and f:I\rightarrow\mathbb R. Then the following holds:



f is differentiable with f'=\alpha f,~\alpha\in\mathbb R if and only if there exists c\in\mathbb R with f(x)=ce^{\alpha x} for all x\in I.




"\Leftarrow" If f(x)=ce^{\alpha x} we obviously have f'(x)=\alpha ce^{\alpha x}=\alpha f(x).




"\Rightarrow" Let g:I\rightarrow\mathbb R,~x\mapsto e^{-\alpha x}f(x), then g is differentiable with g'(x)=-\alpha e^{-\alpha x}f(x)+e^{-\alpha x} f'(x)=0, thus there exists c\in\mathbb R with g(x)=c=e^{-\alpha x}f(x) \Leftrightarrow f(x)=ce^{\alpha x}.



Now if \alpha=1 and f(0)=1 we get c=1 from f(0)=c\cdot e^0=c, so f=\exp; thus we have proven that this definition of \exp is equivalent as well.
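Finally, the initial value problem y'=y,~y(0)=1 can be explored numerically with the forward Euler method; the step size and interval in this sketch are arbitrary, and it is only an illustration, not part of the proof.

```python
from math import exp

def euler_solve(alpha=1.0, y0=1.0, t_end=1.0, steps=100000):
    """Forward Euler for y' = alpha * y, y(0) = y0, on [0, t_end]."""
    h = t_end / steps
    y = y0
    for _ in range(steps):
        y += h * alpha * y  # y_{k+1} = y_k * (1 + alpha * h)
    return y

print(euler_solve())  # about 2.71827..., close to
print(exp(1.0))       # 2.718281828459045
```

Note that for \alpha=1, y_0=1 and step size h=\frac{1}{n} the Euler iteration produces exactly \left(1+\frac{1}{n}\right)^n after n steps, which brings us back to the limit definition of e.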

