I know that:
f=f_{1}+f_{2}:[0,1[\rightarrow\mathbb{R}
g=g_{1}+g_{2}:[0,1[\rightarrow\mathbb{R}
where the limits \lim_{x\rightarrow 1} f_{1}(x) and \lim_{x\rightarrow 1} g_{1}(x) exist, and \lim_{x\rightarrow 1} f_{2}(x)=\lim_{x\rightarrow 1} g_{2}(x)=\infty.
I now want to prove that \lim_{x\rightarrow 1}\frac{f(x)}{g(x)}=\mu \text{ exists }\Leftrightarrow \lim_{x\rightarrow 1} \frac{f_{2}(x)}{g_{2}(x)}=\nu \text{ exists}, and that in that case \mu=\nu.
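As a concrete sanity check (my own illustration, not part of the problem statement), take f_{1}(x)=3, g_{1}(x)=1, f_{2}(x)=\frac{2}{1-x} and g_{2}(x)=\frac{1}{1-x} on [0,1[. Then \frac{f_{2}(x)}{g_{2}(x)}=2=\nu, and
\frac{f(x)}{g(x)}=\frac{3+\frac{2}{1-x}}{1+\frac{1}{1-x}}=\frac{3(1-x)+2}{(1-x)+1}\rightarrow\frac{0+2}{0+1}=2,
so \mu=\nu=2, consistent with the claim.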
I'm having trouble proving this. I tried using the definition:
\forall \epsilon>0\ \exists\delta>0 \text{ s.t. } |x-1|<\delta\Rightarrow \left|\frac{f(x)}{g(x)}-\mu\right|=\left|\frac{f_{1}(x)+f_{2}(x)}{g_{1}(x)+g_{2}(x)}-\mu\right|=\left|\frac{f_{1}(1)+f_{2}(x)}{g_{1}(1)+g_{2}(x)}-\mu\right|<\epsilon, but that didn't really work. Can anyone help me with this?
Answer
Hint: Write \frac{f_{1}(x)+f_{2}(x)}{g_{1}(x)+g_{2}(x)}=\frac{\frac{f_{1}(x)}{g_{2}(x)}+\frac{f_{2}(x)}{g_{2}(x)}}{1+\frac{g_{1}(x)}{g_{2}(x)}}.
Since f_{1} and g_{1} have finite limits at 1 while g_{2}(x)\rightarrow\infty, we get \lim_{x\rightarrow 1}\frac{f_{1}(x)}{g_{2}(x)}=\lim_{x\rightarrow 1}\frac{g_{1}(x)}{g_{2}(x)}=0.
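Filling in the details (a sketch based on the hint; the algebra assumes g_{2}(x)\neq 0 and g(x)\neq 0 near 1, which holds since g_{2}(x)\rightarrow\infty and g_{1} is bounded near 1): if \nu=\lim_{x\rightarrow 1}\frac{f_{2}(x)}{g_{2}(x)} exists, then by the algebra of limits
\lim_{x\rightarrow 1}\frac{f(x)}{g(x)}=\frac{0+\nu}{1+0}=\nu,
so \mu exists and \mu=\nu. Conversely, solving the identity in the hint for \frac{f_{2}(x)}{g_{2}(x)} gives
\frac{f_{2}(x)}{g_{2}(x)}=\frac{f(x)}{g(x)}\left(1+\frac{g_{1}(x)}{g_{2}(x)}\right)-\frac{f_{1}(x)}{g_{2}(x)}\rightarrow\mu(1+0)-0=\mu,
so \nu exists and \nu=\mu.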