Friday 29 January 2016

integration - Leibniz's Rule for differentiation under the integral.



If we have




$$F\left( \alpha \right) = \int\limits_a^b {f\left( {\alpha ,x} \right)dx} $$



Then



$$\frac{{F\left( {\alpha + \Delta \alpha } \right) - F\left( \alpha \right)}}{{\Delta \alpha }} = \frac{{\Delta F}}{{\Delta \alpha }} = \int\limits_a^b {\frac{{f\left( {\alpha + \Delta \alpha ,x} \right) - f\left( {\alpha ,x} \right)}}{{\Delta \alpha }}dx} $$



and



$$\mathop {\lim }\limits_{\Delta \alpha \to 0} \frac{{\Delta F}}{{\Delta \alpha }} = \frac{{dF}}{{d\alpha }} = \mathop {\lim }\limits_{\Delta \alpha \to 0} \int\limits_a^b {\frac{{f\left( {\alpha + \Delta \alpha ,x} \right) - f\left( {\alpha ,x} \right)}}{{\Delta \alpha }}dx} $$




However, this doesn't always mean



$$\mathop {\lim }\limits_{\Delta \alpha \to 0} \frac{{\Delta F}}{{\Delta \alpha }} = \frac{{dF}}{{d\alpha }} = \int\limits_a^b {\mathop {\lim }\limits_{\Delta \alpha \to 0} \frac{{f\left( {\alpha + \Delta \alpha ,x} \right) - f\left( {\alpha ,x} \right)}}{{\Delta \alpha }}dx} $$



that is,

$$\mathop {\lim }\limits_{\Delta \alpha \to 0} \frac{{\Delta F}}{{\Delta \alpha }} = \frac{{dF}}{{d\alpha }} = \int\limits_a^b {\frac{{\partial f\left( {\alpha ,x} \right)}}{{\partial \alpha }}dx} $$



I know that in other cases, for example when integrating a series of functions or a sequence of functions, if $s_n(x) \to s(x)$ or $f_n(x) \to f(x)$ uniformly, then we can integrate the series term by term or interchange the integral and the limit for the sequence, i.e.:



If




$${s_n}\left( x \right) = \sum\limits_{k = 0}^n {{f_k}\left( x \right)} $$



then



$$\mathop {\lim }\limits_{n \to \infty } \int\limits_a^b {{s_n}\left( x \right)dx} = \int\limits_a^b {s\left( x \right)dx} $$



and for the other case:



$$\mathop {\lim }\limits_{n \to \infty } \int\limits_a^b {{f_n}\left( x \right)dx} = \int\limits_a^b {\mathop {\lim }\limits_{n \to \infty } {f_n}\left( x \right)dx} $$
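To recall why uniformity is needed there, a standard example where only pointwise convergence holds and the interchange fails is $f_n(x) = nx(1 - x^2)^n$ on $[0,1]$: it converges pointwise to $0$, yet

$$\mathop {\lim }\limits_{n \to \infty } \int\limits_0^1 {{f_n}\left( x \right)dx} = \mathop {\lim }\limits_{n \to \infty } \frac{n}{{2\left( {n + 1} \right)}} = \frac{1}{2} \ne 0 = \int\limits_0^1 {\mathop {\lim }\limits_{n \to \infty } {f_n}\left( x \right)dx} $$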




However, Leibniz's rule is used in cases such as:



$$\int\limits_0^1 {\frac{{{x^\alpha } - 1}}{{\log x}}dx} $$



whose integrand isn't even continuous on $[0,1]$. How can we then justify this procedure?
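For concreteness, the computation this procedure is meant to produce (assuming $\alpha > -1$, so that the integral converges) is

$$\frac{d}{{d\alpha }}\int\limits_0^1 {\frac{{{x^\alpha } - 1}}{{\log x}}dx} = \int\limits_0^1 {\frac{\partial }{{\partial \alpha }}\left( {\frac{{{x^\alpha } - 1}}{{\log x}}} \right)dx} = \int\limits_0^1 {{x^\alpha }dx} = \frac{1}{{\alpha + 1}}$$

and hence, since the integral vanishes at $\alpha = 0$, its value should be $\log \left( {\alpha + 1} \right)$.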



ADD:



One particular example (for $t > 0$) is




$$f(t) = \int\limits_0^\infty {\frac{{\sin \left( {xt} \right)}}{x}} dx =\frac{\pi}{2}$$



which wrongly yields:



$$f'\left( t \right) = \int\limits_0^\infty {\cos \left( {xt} \right)dx} = 0$$


Answer



Take a look at http://en.wikipedia.org/wiki/Differentiation_under_the_integral_sign



For your integral

$$
\int_0^1 {\frac{{{x^\alpha } - 1}}{{\log x}}dx},
$$
I guess you need $\alpha>1$ (at least to apply the theorem the way it appears in the Wikipedia article). Be careful that $x$ in the article is your $\alpha$.
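Roughly, the version of the theorem I mean is: if $f\left( {\alpha ,x} \right)$ and $\frac{{\partial f}}{{\partial \alpha }}$ are continuous in both variables on $\left[ {{\alpha _1},{\alpha _2}} \right] \times \left[ {a,b} \right]$, then for $\alpha \in \left( {{\alpha _1},{\alpha _2}} \right)$

$$\frac{d}{{d\alpha }}\int\limits_a^b {f\left( {\alpha ,x} \right)dx} = \int\limits_a^b {\frac{{\partial f\left( {\alpha ,x} \right)}}{{\partial \alpha }}dx} $$

(check the article for the precise hypotheses).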



A more general result is Lebesgue's Dominated Convergence Theorem, where you can replace the continuity assumption with boundedness (since $(x,\alpha)$ stays within a rectangle).
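Roughly, the dominated-convergence version: if there is an integrable $g$ with $\left| {\frac{{\partial f}}{{\partial \alpha }}\left( {\beta ,x} \right)} \right| \le g\left( x \right)$ for all $\beta$ near $\alpha$, then the difference quotients are dominated as well (by the mean value theorem) and

$$\frac{d}{{d\alpha }}\int\limits_a^b {f\left( {\alpha ,x} \right)dx} = \int\limits_a^b {\frac{{\partial f\left( {\alpha ,x} \right)}}{{\partial \alpha }}dx} $$

This also shows what goes wrong in your $\frac{{\sin \left( {xt} \right)}}{x}$ example: there $\frac{{\partial f}}{{\partial t}} = \cos \left( {xt} \right)$, which has no integrable dominating function on $\left( {0,\infty } \right)$; indeed $\int_0^\infty {\cos \left( {xt} \right)dx}$ does not even converge.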

