I'm given a function $f\colon(a,\infty)\to\mathbb{R}$ (with $a>0$) which has a limit at infinity, i.e., $\lim_{x\to\infty} f(x)$ exists; call it $L$. I want to show that the function $g(x):=f(1/x)$, which is defined on $(0,1/a)$, has a limit at $0$ if and only if the limit of $f$ as $x$ tends to infinity exists.
I know I have to use the $\epsilon$-$\delta$ definition, but before that I think the following is an equivalent formulation:
$$\lim_{x\to\infty} f(x) = \lim_{x\to 0} f(1/x).$$
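To pin down what the right-hand side means (my reading; since $g$ is only defined on $(0,1/a)$, the limit at $0$ is necessarily the one-sided limit from the right), the claim in $\epsilon$-$\delta$ form would be
$$\bigl(\forall \epsilon>0\ \exists M>a\ \forall x\ge M:\ |f(x)-L|<\epsilon\bigr)\iff\bigl(\forall \epsilon>0\ \exists \delta\in(0,1/a)\ \forall x\in(0,\delta):\ |g(x)-L|<\epsilon\bigr),$$
with the common value of the two limits being $L$.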
I know this is just an exercise in chasing the $\epsilon$-$\delta$ definitions, but I think the "trick" here is to use the fact that if $f$ has the limit $L$ at infinity, then for all $\epsilon>0$ there exists $M>a$ such that for all $x\ge M$ we have $|f(x)-L|<\epsilon$. So I think the idea here is to pick my $\delta$ as $1/M$, since we have that
$$x\ge M \implies \frac{1}{x}\le\frac{1}{M}, \qquad\text{or equivalently}\qquad 0<x\le\frac{1}{M}\implies \frac{1}{x}\ge M,$$
and we know that if $x\ge M$ then $|f(x)-L|<\epsilon$. So, given $\epsilon_0>0$, if I choose $M$ for this $\epsilon_0$ and set $\delta_0=1/M$, does $0<x<\delta_0$ imply $|f(1/x)-L|<\epsilon_0$? My intuition says yes, but I am not sure how to formulate this rigorously.
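Here is my attempt at writing the forward direction out in full (just a sketch; I would like to know whether this is rigorous and whether the choice $\delta_0=1/M$ really works). Let $\epsilon_0>0$. Since $\lim_{x\to\infty} f(x)=L$, there exists $M>a>0$ such that
$$y\ge M \implies |f(y)-L|<\epsilon_0.$$
Set $\delta_0:=1/M$. Note that $\delta_0=1/M<1/a$, so $(0,\delta_0)$ lies inside the domain of $g$. Now if $0<x<\delta_0$, then $1/x>M$, and therefore
$$|g(x)-L|=|f(1/x)-L|<\epsilon_0.$$
Since $\epsilon_0>0$ was arbitrary, this would show $\lim_{x\to 0^+} g(x)=L$. For the converse I would argue symmetrically: given $\epsilon>0$, pick $\delta\in(0,1/a)$ with $0<x<\delta\implies|g(x)-L|<\epsilon$, set $M:=2/\delta>a$, and observe that $y\ge M\implies 0<1/y\le\delta/2<\delta$, hence $|f(y)-L|=|g(1/y)-L|<\epsilon$.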