Definition (Uniform Integrability): A family \mathcal{F} of integrable functions is uniformly integrable if for all \epsilon>0 there is an M_\epsilon>0 such that
\int_{\{|f|>M_\epsilon\}}|f|d\mu<\epsilon \quad \forall f\in\mathcal{F}.
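As a quick illustration (not part of the original post, just a sanity check on the definition): a family consisting of a single integrable function f is always uniformly integrable, since |f|\mathbf{1}_{\{|f|>M\}} is dominated by the integrable function |f| and tends to 0 almost everywhere as M\to\infty, so dominated convergence gives
\int_{\{|f|>M\}}|f|d\mu \longrightarrow 0 \quad \text{as } M\to\infty.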
Let (X,\mathcal{A},\mu) be a finite measure space. A family of integrable functions \mathcal{F} is uniformly integrable if and only if for all \epsilon>0 there exists a \delta>0 such that for all A\in\mathcal{A} we have that
\mu(A)<\delta \;\Rightarrow\; \int_A|f|d\mu<\epsilon \quad \forall f\in\mathcal{F}.
Here is my proof; I would like to know whether it contains errors, and I would be thankful for any improvements.
"⇒":
Because of uniform integrability we can apply the definition with \epsilon/2>0 and obtain an M_{\epsilon/2} such that for arbitrary A\in\mathcal{A}
\int_A|f|d\mu=\int_{\{|f|>M_{\epsilon/2}\}\cap A}|f|d\mu+\int_{\{|f|\le M_{\epsilon/2}\}\cap A}|f|d\mu\le\mu(A)M_{\epsilon/2}+\int_{\{|f|>M_{\epsilon/2}\}\cap A}|f|d\mu.
Note that we have \mu(A)<\infty since the measure space is finite. Now we can pick \delta:=\frac{\epsilon}{2}\cdot\frac{1}{M_{\epsilon/2}}, from which it follows by the previous inequality that for all A satisfying \mu(A)<\delta
\int_A|f|d\mu<\frac{\epsilon}{2}+\frac{\epsilon}{2}=\epsilon,
which is what we wanted to show.
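For completeness, here are the two estimates being combined in the last step, spelled out (this detail is implicit in the proof above):
\int_{\{|f|>M_{\epsilon/2}\}\cap A}|f|d\mu \le \int_{\{|f|>M_{\epsilon/2}\}}|f|d\mu < \frac{\epsilon}{2}, \qquad \mu(A)M_{\epsilon/2} < \delta M_{\epsilon/2} = \frac{\epsilon}{2}.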
"⇐":
We have, since each member of \mathcal{F} is integrable, that
\mu(\{|f|>m\})\to 0 as m\to\infty for arbitrary f\in\mathcal{F}. Any suggestions on how to prove this statement rigorously?
This is equivalent to (just use the definition of a limit and exchange \delta for \epsilon)
\forall\delta>0\;\exists M_\delta\in\mathbb{N}\;\forall n\ge M_\delta:\;\mu(\{|f|>n\})\le|\mu(\{|f|>n\})|<\delta,
which proves this direction.
Answer
The Falrach link identifies an additional requirement, \sup_{f \in \mathcal{F}} \int |f|d\mu < \infty, for the posted condition to imply UI. This additional requirement immediately enables the Markov inequality approach from my hint in the comments above (a sketch is given after the counter-example). Here is a simple counter-example that shows what can go wrong without that additional requirement:
Counter-example:
Define X=\{0\} (a 1-element set) with \mu(X)=1. For n \in \{1, 2, 3, ...\} define f_n:X\rightarrow\mathbb{R} by f_n(x)=n. Clearly \int_X |f_n|d\mu = n for all n \in \{1, 2, 3, ...\} and so \{f_n\}_{n=1}^{\infty} is not UI. But the functions \{f_n\}_{n=1}^{\infty} satisfy the condition of the above post trivially: For all \epsilon>0 we can choose \delta=1/2 and indeed for any set A \subseteq X that satisfies \mu(A)<\delta we immediately have \int_A |f_n|d\mu<\epsilon. This is because the only subset of X with measure less than 1/2 is the empty set!
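To make the failure of UI explicit (this computation is only asserted as "clearly" above): for any candidate bound M>0 and any n>M we have \{|f_n|>M\}=X, so
\int_{\{|f_n|>M\}}|f_n|d\mu = n\mu(X) = n > M,
and hence no single M_\epsilon can make these tail integrals smaller than, say, \epsilon=1.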
So the additional requirement \sup_{f \in \mathcal{F}} \int |f|d\mu < \infty is indeed needed in general.
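Here is a sketch of the Markov inequality approach mentioned above, under the extra hypothesis C:=\sup_{f \in \mathcal{F}} \int_X |f|d\mu < \infty (the constant C and the wording are mine; the original hint is not reproduced here). Given \epsilon>0, take the \delta>0 from the posted condition. By Markov's inequality,
\mu(\{|f|>M\}) \le \frac{1}{M}\int_X |f|d\mu \le \frac{C}{M} \quad \forall f \in \mathcal{F},
so choosing M>C/\delta gives \mu(\{|f|>M\})<\delta uniformly in f \in \mathcal{F}, and applying the posted condition with A=\{|f|>M\} yields \int_{\{|f|>M\}}|f|d\mu<\epsilon for all f \in \mathcal{F}, which is exactly UI.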
A "Covering Property" that implies the additional requirement:
Suppose \mathcal{F} is a family of integrable functions f:X\rightarrow\mathbb{R} such that for all \epsilon>0 there is a \delta>0 such that A \subseteq X with \mu(A)<\delta implies \int_A |f|d\mu < \epsilon for all f \in \mathcal{F}. Now fix \epsilon=1 and choose the corresponding \delta so that \mu(A)<\delta implies \int_A |f|d\mu < 1 for all f \in \mathcal{F}.
If there exists a finite sequence of sets \{A_1, ...,A_m\} (for some positive integer m) such that \cup_{i=1}^m A_i = X, A_i\subseteq X for all i \in \{1, ...,m\}, and \mu(A_i)<\delta for all i \in \{1, ..., m\} then for all f \in \mathcal{F}:
\int_X |f|d\mu \leq \sum_{i=1}^m \int_{A_i}|f|d\mu \leq m
So the additional requirement always holds in this case.
Such a finite covering by sets of measure less than \delta is always possible when X is a compact subset of \mathbb{R}^k for some positive integer k (and when we use the standard Lebesgue measure on \mathbb{R}^k).
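For concreteness, here is one way to produce such a covering under the stated assumptions (compact X \subseteq \mathbb{R}^k with Lebesgue measure); the construction below is only a sketch of mine, not part of the original argument. Pick a side length s>0 with s^k<\delta and tile \mathbb{R}^k by the half-open cubes
Q_z = \prod_{j=1}^k [sz_j, s(z_j+1)), \qquad z=(z_1,\dots,z_k)\in\mathbb{Z}^k,
each of measure s^k<\delta. Since X is compact it is bounded, so it meets only finitely many of these cubes, say Q_{z_1},\dots,Q_{z_m}; the sets A_i:=Q_{z_i}\cap X then form a finite cover of X with \mu(A_i)\le s^k<\delta for each i.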