I've got a question about mathematical analysis of one-variable functions. Assume that we have a function $f$ defined for $x \neq x_0$ as a composition/sum/product of differentiable functions, and additionally $f(x_0) = a \in \mathbb{R}$. The task is to determine the differentiability of $f$ on its domain (here: $\mathbb{R}$).
This is obvious for $x \neq x_0$. At $x_0$ we could check continuity of $f$ and then compute the one-sided derivatives of $f$ at $x_0$ from the definition (provided $f$ is continuous there, of course). If the left-hand derivative equals the right-hand derivative, then $f$ is differentiable at $x_0$ as well.
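For concreteness, a worked example of this basic method on a function of my own choosing (not from the question): take $f(x) = \frac{\sin x}{x}$ for $x \neq 0$ and $f(0) = 1$. Then
$$\lim_{x \to 0} \frac{\sin x}{x} = 1 = f(0), \qquad \lim_{h \to 0^\pm} \frac{f(h) - f(0)}{h} = \lim_{h \to 0^\pm} \frac{\sin h - h}{h^2} = 0,$$
so $f$ is continuous at $0$, both one-sided derivatives exist and coincide, and hence $f'(0) = 0$.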
Today I heard about an "extension" of this method, which determines differentiability without computing the one-sided derivatives at $x_0$ from the definition. Let $f'$ be the derivative function computed for $x \neq x_0$. If $\lim_{x \to x_0^-} f'(x) = \lim_{x \to x_0^+} f'(x) = g \in \mathbb{R}$, then $f'(x_0)$ exists and $f'(x_0) = g$. Could anyone sketch a proof of this for me?
If we knew that $f'$ were continuous at $x_0$, we could obviously conclude that $f'(x_0)$ exists and equals the limit of $f'$ at $x_0$, but the condition above is not a continuity condition (we do not check whether $f'(x_0) = \lim_{x \to x_0} f'(x)$; indeed, we do not yet know that $f'(x_0)$ exists).
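A standard example (my own, not from the question) showing that the criterion is sufficient but not necessary: take $f(x) = x^2 \sin(1/x)$ for $x \neq 0$ and $f(0) = 0$. Then
$$f'(0) = \lim_{h \to 0} \frac{h^2 \sin(1/h)}{h} = 0, \qquad f'(x) = 2x \sin\frac{1}{x} - \cos\frac{1}{x} \quad (x \neq 0),$$
so $f'(0)$ exists even though $\lim_{x \to 0} f'(x)$ does not.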
Answer
Assuming that $f$ is continuous at $x_0$ (otherwise this is false), you can use the mean value theorem to get, for $h > 0$, $\frac{f(x_0 + h) - f(x_0)}{h} = f'(y_h)$ for some $x_0 < y_h < x_0 + h$. Similarly for the left-hand difference quotient.
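A slightly fuller write-up of this sketch, under the assumptions that $f$ is continuous at $x_0$ and differentiable on a punctured neighbourhood of $x_0$: for small $h > 0$, $f$ is continuous on $[x_0, x_0 + h]$ and differentiable on $(x_0, x_0 + h)$, so the mean value theorem gives some $y_h \in (x_0, x_0 + h)$ with
$$\frac{f(x_0 + h) - f(x_0)}{h} = f'(y_h).$$
Since $x_0 < y_h < x_0 + h$, we have $y_h \to x_0^+$ as $h \to 0^+$, hence $f'(y_h) \to g$, so the right-hand derivative of $f$ at $x_0$ equals $g$. The same argument with $h < 0$ gives the left-hand derivative, and therefore $f'(x_0) = g$.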