Ok, so looking through some questions I found this answer: https://math.stackexchange.com/a/1667/102636, containing a proof that a continuous, nowhere differentiable function has either infinitely many local extrema or none at all. I would love for someone to explain it to me in greater detail (I can't comment on that answer, unfortunately...).
To recap, the proof goes like this:
Let's take $f\colon[a,b]\to\mathbb{R}$ continuous and nowhere differentiable, and assume $f$ has a finite number of local extrema, namely at $c_1 < c_2 < \dots < c_n$.
Now take some consecutive $c_i$ and $c_{i+1}$. Let $c$ be a point where $f$ attains its global maximum on $[c_i, c_{i+1}]$, which occurs at an interior point. Therefore $c$ is a local extremum of $f$.
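For clarity, here is the claim and that step written out a bit more explicitly (my own restatement, not a quote of the linked answer):

$$f\colon[a,b]\to\mathbb{R}\ \text{continuous and nowhere differentiable}\ \Longrightarrow\ f\ \text{has either no local extrema or infinitely many.}$$

The recapped step uses the extreme value theorem: $f$ is continuous on the compact interval $[c_i,c_{i+1}]$, so it attains a global maximum there at some point $c$; the answer asserts that $c$ is an interior point, and an interior global maximum of $f$ on $[c_i,c_{i+1}]$ is in particular a local maximum of $f$.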
I don't see why this global extremum on $[c_i, c_{i+1}]$ needs to be an interior point.
If someone could explain this to me I'd be really grateful.
Answer
I think the answer was hastily written, and in particular "$c_1<c_2<\dots f(c_2)$" needs to be fixed. Hopefully someone with commenting privilege will draw Akhil's attention to this.
Here is another way to phrase it. It is a theorem of Lebesgue that a function that is monotone on $[c,d]$ is differentiable at almost every point of $(c,d)$. Without getting into the technical meaning of "almost every", let's just say that the theorem guarantees a monotone function to be differentiable at some point of $(c,d)$. Therefore, a nowhere differentiable function cannot be monotone on any subinterval of its domain.
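For concreteness, here is the statement being invoked, written out (a paraphrase from memory, with $\lambda$ denoting Lebesgue measure):

$$f\colon[c,d]\to\mathbb{R}\ \text{monotone}\ \Longrightarrow\ \lambda\bigl(\{x\in(c,d): f'(x)\ \text{does not exist}\}\bigr)=0.$$

Since $(c,d)$ has positive measure, the exceptional set cannot be all of $(c,d)$, so a monotone function is differentiable at at least one point of $(c,d)$, which is all the argument needs.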
If such a function had finitely many local extrema, there would be an interval $(c,d)\subset[a,b]$ on which it has no local extrema (you can order all the extrema and pick an interval between two consecutive ones). On such an interval $f$ is monotone (this takes a proof), which contradicts the above.
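Since "this takes a proof" is where the details hide, here is a sketch of one standard argument (my own fill-in, not part of the quoted answer). Suppose $f$ is continuous on $(c,d)$ and has no local extrema there. First, $f$ is injective on $(c,d)$: if $f(x_1)=f(x_2)$ with $x_1<x_2$, then either $f$ is constant on $[x_1,x_2]$, in which case every point of $(x_1,x_2)$ is a local extremum, or the maximum or the minimum of $f$ on $[x_1,x_2]$ differs from the common endpoint value and is attained at an interior point, which is again a local extremum; either way, a contradiction. Second, a continuous injective function on an interval is strictly monotone: if not, there are points $x<y<z$ in $(c,d)$ with $f(y)>\max\{f(x),f(z)\}$ or $f(y)<\min\{f(x),f(z)\}$; in the first case, any value $v$ with $\max\{f(x),f(z)\}<v<f(y)$ is attained, by the intermediate value theorem, both in $(x,y)$ and in $(y,z)$, contradicting injectivity (the second case is symmetric). So $f$ is monotone on $(c,d)$, and Lebesgue's theorem applies.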