Is the following true or false?:
Let f : [0,1) → ℝ be a function differentiable on [0,1) (where the derivative at 0 means the right derivative) such that both f and f′ are uniformly continuous on (0,1). Then f′ is continuous.
Note that the mystery lies at x = 0. So the question is: can we conclude from these hypotheses that f′(0) = lim_{x→0⁺} f′(x)? (That limit exists thanks to the uniform continuity of f′ restricted to (0,1).) Note also that the uniform continuity of f′ on (0,1) makes the analogous requirement for f redundant (indeed, f will even be Lipschitz).
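For context, the standard mean-value-theorem argument sketches why the answer should be affirmative (this sketch is not part of the original question; it assumes L denotes the one-sided limit of f′ at 0):

```latex
% Sketch: let L := \lim_{x \to 0^+} f'(x), which exists because f' is
% uniformly continuous on (0,1). For each h \in (0,1), the mean value
% theorem applied to f on [0,h] gives some c_h \in (0,h) with
\[
  \frac{f(h) - f(0)}{h} = f'(c_h).
\]
% As h \to 0^+ we have c_h \to 0^+, hence f'(c_h) \to L, and therefore
\[
  f'(0) = \lim_{h \to 0^+} \frac{f(h) - f(0)}{h} = L,
\]
% i.e. f' is continuous at 0. This is the classical derivative-limit
% theorem; only continuity of f at 0 and differentiability near 0 are used.
```

If this argument holds, the full strength of uniform continuity of f′ is only needed to guarantee that the limit L exists in the first place.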