I'm having a bit of trouble understanding uniform continuity in "plain English" (for lack of a better term).
The definition of continuity I'm working with is this:
Let $f: A\ \subseteq \mathbb{R} \rightarrow \mathbb{R}$.
We say that $f$ is continuous at $x_0 \in A$ if, for every $\epsilon > 0$, there is some $\delta > 0$ s.t. if $|x - x_0| < \delta$, then $|f(x) - f(x_0)| < \epsilon$.
If I'm understanding this correctly, continuity at $x_0$ means that for any "box" of height $2\epsilon$ centered at $(x_0, f(x_0))$, we can shrink its width to some $2\delta$ so that the graph of $f$ over $(x_0 - \delta, x_0 + \delta)$ stays inside the box, and the height $2\epsilon$ can be made arbitrarily small.
$f$ is continuous on $(a, b)$ if the above definition holds at every point of that interval.
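To test my reading of the pointwise definition, here is a concrete case I tried to work through myself (my own example, so the estimates may be off): take $f(x) = x^2$ at $x_0 = 1$. Then

$$
|f(x) - f(1)| = |x^2 - 1| = |x - 1|\,|x + 1| < 3\delta
\quad\text{whenever}\quad |x - 1| < \delta \le 1,
$$

so choosing $\delta = \min(1, \epsilon/3)$ forces $|f(x) - f(1)| < \epsilon$. What strikes me is that this $\delta$ depends on the point $x_0 = 1$ as well as on $\epsilon$; near a larger $x_0$ I would apparently need a smaller $\delta$ for the same $\epsilon$.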
But the definition of uniform continuity is confusing to me. The definition I'm working with is this:
Let $f: A \subseteq \mathbb{R} \rightarrow \mathbb{R}$.
We say that $f$ is uniformly continuous if for all $\epsilon > 0$, there is some $\delta = \delta (\epsilon) > 0$ s.t. if $x, y \in A$ with $|x - y| < \delta$, then $|f(x) - f(y)| < \epsilon$.
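For contrast, here is the one example where I think I can see this definition working (again my own attempt, so please correct me if it's wrong): for $f(x) = 2x$ on $\mathbb{R}$,

$$
|f(x) - f(y)| = 2|x - y| < \epsilon
\quad\text{whenever}\quad |x - y| < \delta(\epsilon) = \frac{\epsilon}{2},
$$

and this single choice $\delta = \epsilon/2$ seems to work for every pair $x, y$ at once, with no reference to any particular point.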
I'm having a lot of trouble understanding what this definition is saying. First, what does the notation $\delta = \delta(\epsilon)$ mean? (At first I read it as a product $\delta \cdot \epsilon$, but I suspect it means $\delta$ is a function of $\epsilon$ alone.) Secondly, if possible, how do we put the above definition in terms of the "box" analogy (or some better analogy)?
In what ways is uniform continuity different from ordinary (pointwise) continuity? Any help understanding uniform continuity and the difference between these two definitions would be greatly appreciated. Thank you.
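For what it's worth, I've seen $f(x) = x^2$ on all of $\mathbb{R}$ offered as a function that is continuous but not uniformly continuous, and I tried to see why (my own computation, so it may be mistaken): taking $x = n$ and $y = n + \frac{1}{n}$,

$$
|x - y| = \frac{1}{n} \to 0
\qquad\text{while}\qquad
|f(x) - f(y)| = \left| n^2 - \left(n + \tfrac{1}{n}\right)^2 \right| = 2 + \frac{1}{n^2} > 2,
$$

so apparently no single $\delta$ can work for, say, $\epsilon = 2$ across the whole line. If that computation is right, is this exactly the distinction the two definitions are drawing?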