Thursday, 4 August 2016

calculus - Reversing the Order of Integration and Summation



I am trying to understand when we can interchange the order of integration and summation. I am increasingly encountering integrals, some of which are solved by interchanging the order of summation and integration, and some of which (for no stated reason) cannot be solved this way.
Despite looking at a variety of sites, I was unable to understand when we can do so.
I came up with the following two requirements here on MSE:

If $f_n(x)\ge 0$ for all $x$ and $n$, then

$$\sum \int f_n(x) \, dx = \int \sum f_n(x) \,dx$$
Also if $\sum \int |f_n| < \infty$ or $\int \sum |f_n| < \infty$, then
$$\int \sum f_n = \sum \int f_n$$
Unfortunately, I couldn't even understand the notation $f_n(x)$.
I cannot tell you how grateful I would be if somebody could please explain this to me. Thanks very, very much in advance for any help given.


Answer



The more general question is about interchanging limits and integration; infinite sums are a special case, because by definition $\sum_{n=1}^\infty f_n(x) = \lim_{N \to \infty} \sum_{n=1}^N f_n(x)$. Since one can always interchange finite sums and integration (by linearity of the integral), the only question is whether the limit and the integration can be interchanged.



Writing what I just said in symbols, we want conditions such that




$$\sum_{n=1}^\infty \int_X f_n(x) dx = \int_X \sum_{n=1}^\infty f_n(x) dx.$$



Expanding the definition:



$$\lim_{N \to \infty} \sum_{n=1}^N \int_X f_n(x) dx = \int_X \lim_{N \to \infty} \sum_{n=1}^N f_n(x) dx.$$



Now one interchange is free:



$$\lim_{N \to \infty} \sum_{n=1}^N \int_X f_n(x) dx = \lim_{N \to \infty} \int_X \sum_{n=1}^N f_n(x) dx.$$




The issue is with the last interchange, which is what most of the rest of this answer is about.
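(As a quick numerical sanity check of the "free" interchange, which is just linearity of the integral, here is a minimal Python sketch; the choice $f_n(x) = x^n/n!$ on $[0,1]$ and the crude Riemann-sum integrator are illustrative assumptions of mine, not part of the original argument.)

```python
# Sanity check (illustrative): for a FINITE sum, integration and summation
# always interchange, by linearity of the integral.
# Example choice: f_n(x) = x**n / n! on [0, 1], N = 5 terms.
import math
import numpy as np

dx = 1e-5
x = np.arange(dx / 2, 1.0, dx)          # midpoints of a uniform grid on [0, 1]
N = 5
f = [x**n / math.factorial(n) for n in range(1, N + 1)]

sum_of_integrals = sum(float(np.sum(fn) * dx) for fn in f)
integral_of_sum = float(np.sum(sum(f)) * dx)

print(sum_of_integrals, integral_of_sum)  # both ≈ 0.71806 = sum_{n=1}^{5} 1/(n+1)!
```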



The most general result of this type is the Vitali convergence theorem. It says that if $f_n$ is a sequence of measurable functions, $f_n \to f$ pointwise, $f_n$ is uniformly integrable, and $f_n$ is tight, then $\int_X f_n(x) dx \to \int_X f(x) dx.$ (Here $X$ is the set over which we integrate.) You can look up the formal definitions of "uniformly integrable" and "tight" yourself. Roughly speaking they mean that you cannot "compress mass into a point" and that you can't "move mass to infinity". These intuitions are illustrated by the failure of the conclusion of the theorem for the sequences $f_n(x)=\max \{ n-n^2x,0 \}$ on $[0,1]$ and $g_n(x)=\begin{cases} 1 & x \in [n,n+1] \\ 0 & \text{otherwise} \end{cases}$ on the whole line.
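For concreteness, here is a small numerical sketch of both failure modes (my own illustration, not part of the original answer): in each case $f_n \to 0$ pointwise almost everywhere, so the integral of the limit is $0$, yet the integrals stay at $1/2$ and $1$ respectively.

```python
# Two classic failures of "limit of integrals = integral of limit" (illustrative):
#   f_n(x) = max(n - n**2 * x, 0) on [0, 1]  -- mass compressed into a point
#   g_n(x) = 1 on [n, n+1], 0 elsewhere      -- mass escaping to infinity
import numpy as np

dx = 1e-6
x = np.arange(dx / 2, 1.0, dx)               # midpoints of a uniform grid on [0, 1]

for n in (1, 10, 100):
    fn = np.maximum(n - n**2 * x, 0.0)
    print(n, float(np.sum(fn) * dx))         # ≈ 0.5 for every n, not 0

# For g_n no computation is needed: its integral is exactly 1 for every n,
# even though g_n(x) -> 0 for each fixed x.
```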



The Vitali convergence theorem is general but it is not convenient. The result with perhaps the best balance between generality and convenience to check is the dominated convergence theorem. This says that if $f_n \to f$ pointwise and there is a fixed integrable function $g$ such that $|f_n(x)| \leq g(x)$ for all $n$ and $x$, then $\int_X f_n(x) dx \to \int_X f(x) dx.$
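A toy illustration (my own choice of example): $f_n(x) = x^n$ on $[0,1]$ is dominated by the integrable function $g(x) = 1$ and converges to $0$ for $x < 1$, so the theorem predicts $\int_0^1 x^n\,dx = \tfrac{1}{n+1} \to 0$, which is easy to confirm numerically.

```python
# Dominated convergence, toy check (illustrative example):
# f_n(x) = x**n on [0, 1], dominated by g(x) = 1, f_n -> 0 pointwise on [0, 1).
import numpy as np

dx = 1e-5
x = np.arange(dx / 2, 1.0, dx)

for n in (1, 10, 100, 1000):
    print(n, float(np.sum(x**n) * dx))   # ≈ 1/(n+1), tending to 0 = integral of the limit
```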



One relatively basic result is the monotone convergence theorem, which says that if $f_n$ is an increasing sequence of nonnegative functions and $f_n \to f$ pointwise, then $\int_X f_n(x) dx \to \int_X f(x) dx$. In particular this holds whether or not $f$ is actually integrable (if it isn't, then the limit of the integrals is $+\infty$). This is also applicable to the case when $f_n$ are nonpositive and decrease to $f$ (this is easy to prove, since $\int_X -g(x) dx = -\int_X g(x) dx$). This is useful for summation, because if $f_n(x) \geq 0$ then $g_N(x)=\sum_{n=1}^N f_n(x)$ is an increasing sequence of nonnegative functions.
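Here is how that looks numerically for a concrete nonnegative series (the choice $f_n(x) = x^n/n!$, so that $\sum_{n\ge 1} f_n(x) = e^x - 1$, is my own illustrative assumption): the integrals of the increasing partial sums climb towards $\int_0^1 (e^x - 1)\,dx = e - 2$.

```python
# Monotone convergence applied to partial sums of nonnegative terms (illustrative):
# f_n(x) = x**n / n! >= 0 on [0, 1], so g_N = f_1 + ... + f_N increases to e**x - 1.
import math
import numpy as np

dx = 1e-5
x = np.arange(dx / 2, 1.0, dx)

for N in (1, 2, 5, 10):
    g_N = sum(x**n / math.factorial(n) for n in range(1, N + 1))
    print(N, float(np.sum(g_N) * dx))     # increases towards e - 2 ≈ 0.71828

print(float(np.sum(np.exp(x) - 1.0) * dx))  # integral of the pointwise limit, also ≈ e - 2
```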



Finally, in the special case of interchanging summation and integration, one can apply the abstract version of the Fubini-Tonelli theorem. This is because summation can be identified with integration with respect to the counting measure. As a result, if either




$$\sum_{n=1}^\infty \int_X |f_n(x)| dx < \infty$$



or



$$\int_X \sum_{n=1}^\infty |f_n(x)| dx < \infty$$



then one may interchange summation and integration. (This requires a hypothesis on $X$; since it holds for $\mathbb{R}^n$, I won't state it here, as this is already a more advanced writeup than you wanted.)
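To see the criterion in action, here is a small numerical sketch (my own example, not the author's): with the nonnegative functions $f_n(x) = x^{n-1}/2^n$ on $[0,1]$, both sides are finite and both orderings give $\ln 2$, since $\sum_{n\ge1} x^{n-1}/2^n = 1/(2-x)$ and $\int_0^1 x^{n-1}/2^n\,dx = 1/(n\,2^n)$.

```python
# Tonelli in action (illustrative example): f_n(x) = x**(n-1) / 2**n >= 0 on [0, 1].
#   sum_n  ∫_0^1 f_n(x) dx = sum_n 1/(n * 2**n)   = ln 2
#   ∫_0^1 sum_n f_n(x) dx  = ∫_0^1 dx / (2 - x)   = ln 2
import numpy as np

dx = 1e-5
x = np.arange(dx / 2, 1.0, dx)

sum_of_integrals = sum(1.0 / (n * 2**n) for n in range(1, 60))    # series, truncated at 59 terms
integral_of_sum = float(np.sum(1.0 / (2.0 - x)) * dx)             # uses the closed-form inner sum

print(sum_of_integrals, integral_of_sum, float(np.log(2.0)))      # all ≈ 0.6931
```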

