Monday, 9 September 2019

calculus - Can you take the derivative of a function at infinity?




Exactly the title: can you take the derivative of a function at infinity?



I asked my maths teacher, and while she thought it was an original question, she didn't know the answer, and I couldn't find anything online about this.



Maybe this is just me completely misunderstanding derivatives and functions at infinity, but to me, a high schooler, it makes sense that you can. For example, I'd imagine that a function with a horizontal asymptote would have a derivative of zero at infinity.


Answer



In a very natural sense, you can! If \lim_{x \to \infty} f(x) = \lim_{x \to -\infty} f(x) = L is some real number, then it makes sense to define f(\infty) = L, where we identify \infty and -\infty in something called the one-point compactification of the real numbers (making it look like a circle).



In that case, f'(\infty) can be defined as

f'(\infty) = \lim_{x \to \infty} x \big(f(x) - f(\infty)\big).
When you learn something about analytic functions and Taylor series, it will be helpful to notice that this is the same as differentiating f(1/x) at zero.
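As a concrete sanity check, here is a short SymPy sketch; the function f(x) = x/(x+1) is just a toy example of mine, and I am assuming SymPy's limit and diff behave as documented. Both the limit definition above and differentiating f(1/x) at zero give the same value.

```python
# Minimal SymPy sketch; f(x) = x/(x+1) is an illustrative choice, not from the answer.
import sympy as sp

x = sp.symbols('x', positive=True)
f = x / (x + 1)

# f(oo): the limit of f at infinity.
f_inf = sp.limit(f, x, sp.oo)                     # -> 1

# f'(oo) via the definition  lim_{x -> oo} x * (f(x) - f(oo)).
fprime_inf = sp.limit(x * (f - f_inf), x, sp.oo)  # -> -1

# The same number obtained by differentiating g(x) = f(1/x) at zero.
g = f.subs(x, 1 / x)
gprime_0 = sp.limit(sp.diff(g, x), x, 0)          # -> -1

print(f_inf, fprime_inf, gprime_0)
```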



Notice that this is actually not the same as \lim_{x \to \infty} f'(x).
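For instance, with f(x) = 1 + 1/x the definition gives f'(\infty) = \lim_{x \to \infty} x \cdot \tfrac{1}{x} = 1, even though \lim_{x \to \infty} f'(x) = \lim_{x \to \infty} \big( -\tfrac{1}{x^2} \big) = 0.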



These ideas actually show up quite a bit in the study of analytic capacity, so this is a rather useful notion to have.






I wanted to expand this answer a bit to explain why this is the "correct" generalization of differentiation at infinity, and hopefully to address some points raised in the comments.




Although \lim_{x \to \infty} f'(x) might feel like the natural object to study, it is quite badly behaved. There are functions which decay very quickly to zero and have horizontal asymptotes, but where f' is unbounded as we tend to infinity; consider something like \sin(x^a) / x^b for various a, b. Furthermore, \lim_{x \to \infty} f'(x) = 0 is not sufficient to guarantee a horizontal asymptote, as \sqrt{x} shows.
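To make this concrete, here is a quick numerical sketch in plain Python (my own illustration, taking a = 3 and b = 1 for the first family): f(x) = \sin(x^3)/x decays to zero, yet its derivative oscillates with amplitude of order x, while \sqrt{x} has a derivative tending to zero and no horizontal asymptote at all.

```python
# Quick numerical sketch (plain Python; the exponents a = 3, b = 1 are my own choice).
import math

f  = lambda x: math.sin(x**3) / x                               # f(x) = sin(x^3)/x -> 0
fp = lambda x: 3 * x * math.cos(x**3) - math.sin(x**3) / x**2   # f'(x), oscillates with amplitude ~ 3x

g  = lambda x: math.sqrt(x)                                     # g(x) = sqrt(x): no horizontal asymptote
gp = lambda x: 1 / (2 * math.sqrt(x))                           # g'(x) -> 0 anyway

for x in (10.0, 100.0, 1000.0):
    print(f"x={x:7.1f}  f={f(x): .1e}  f'={fp(x): .1e}  g={g(x): .1e}  g'={gp(x): .1e}")
```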



So why should we consider the definition I proposed above? Consider the natural change of variables interchanging zero and infinity*, swapping x and 1/x. Then if g(x) := f(1/x) we have the relationship



\lim_{x \to 0} \frac{g(x) - g(0)}{x} = \lim_{x \to \infty} x \big(f(x) - f(\infty)\big).



That is to say, g'(0) = f'(\infty). Now via this change of variables, neighborhoods of zero for g correspond to neighborhoods of \infty for f. So if we think of the derivative as a measure of local variation, we now have something that actually plays the correct role.
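For what it's worth, here is a small SymPy sketch checking that both sides of this identity agree; the rational function is a made-up example of mine.

```python
# Sketch checking g'(0) = f'(oo) for an illustrative rational function.
import sympy as sp

x = sp.symbols('x', positive=True)
f = (x**2 + 3*x) / (x**2 + 1)
g = f.subs(x, 1 / x)          # g(x) = f(1/x)

g0 = sp.limit(g, x, 0)        # g(0) = f(oo) = 1

# Left-hand side: the difference quotient for g at zero.
lhs = sp.limit((g - g0) / x, x, 0)      # -> 3

# Right-hand side: x * (f(x) - f(oo)) as x -> oo.
rhs = sp.limit(x * (f - g0), x, sp.oo)  # -> 3

print(lhs, rhs, lhs == rhs)
```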



Finally, we can see from this that this definition of f'(\infty) gives the coefficient a_1 in the Laurent series \sum_{i \ge 0} a_i x^{-i} of f. Again, this corresponds to our idea of what the derivative really is.
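Continuing with the toy example f(x) = x/(x+1) from the sketch above (and assuming SymPy's series expansion at infinity behaves as documented): the expansion is 1 - 1/x + 1/x^2 - \cdots, so a_0 = 1 and a_1 = -1, and the limit definition indeed recovers a_1.

```python
# Sketch relating f'(oo) to the Laurent coefficient a_1 for the same toy example.
import sympy as sp

x = sp.symbols('x', positive=True)
f = x / (x + 1)

# Expansion at infinity: 1 - 1/x + 1/x**2 - ...  so a_0 = 1, a_1 = -1.
print(sp.series(f, x, sp.oo, 3))

# f'(oo) from the limit definition picks out a_1 = -1.
a0 = sp.limit(f, x, sp.oo)
print(sp.limit(x * (f - a0), x, sp.oo))
```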




* This is one of the reasons why I used the one-point compactification above. Otherwise, everything that follows must be a one-sided limit or a one-sided derivative.

