Sunday 18 May 2014

probability - Expectation of a random variable in terms of its distribution function

Here is a theorem on the expectation of a random variable in terms of its distribution function:




Theorem: Let $X$ be a (continuous or discrete) non-negative random variable with distribution function $F$. Then $E(|X|) < \infty$ if and only if $\displaystyle \int_0^\infty \big(1-F(x)\big)\,dx <\infty$, and in that case
$$E(X) = \int_0^\infty \big(1-F(x)\big)\,dx.$$
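As a quick sanity check of the Theorem (my own example, not part of the original statement), take $X$ exponential with rate $\lambda > 0$, so $F(x) = 1 - e^{-\lambda x}$ for $x \ge 0$. Then
$$\int_0^\infty \big(1-F(x)\big)\,dx = \int_0^\infty e^{-\lambda x}\,dx = \frac{1}{\lambda} = E(X),$$
as expected.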





Then, a corollary of the Theorem is given as:




Corollary: For any random variable $X$, $E(|X|) <\infty$ if and only if the integrals $\displaystyle \int_0^\infty \big(1-F(x)\big)\,dx$ and $\displaystyle \int_{-\infty}^0 F(x)\,dx$ both converge, and in that case
$$E(X) = \int_0^\infty \big(1-F(x)\big)\,dx - \int_{-\infty}^0 F(x)\,dx.$$
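To see the Corollary in action (again my own example, not from the source), let $X$ take the values $-1$ and $2$ with probability $\tfrac12$ each, so $E(X) = \tfrac12$. Its distribution function is $F(x) = 0$ for $x < -1$, $F(x) = \tfrac12$ for $-1 \le x < 2$, and $F(x) = 1$ for $x \ge 2$, hence
$$\int_0^\infty \big(1-F(x)\big)\,dx - \int_{-\infty}^0 F(x)\,dx = \int_0^2 \tfrac12\,dx - \int_{-1}^0 \tfrac12\,dx = 1 - \tfrac12 = \tfrac12 = E(X).$$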




I understand the Theorem, but I do not see how the Corollary follows from it. I follow the first claim of the Corollary, but I do not see why

$$E(X) = \int_0^\infty \big(1-F(x)\big)\,dx - \int_{-\infty}^0 F(x)\,dx \tag{1}$$
holds in that case.



I have that:



$$E(|X|) = \int_0^\infty P\{|X| > x\}\,dx \\
= \int_0^\infty P\{X > x\}\,dx + \int_0^\infty P\{X < -x\}\,dx \\
= \int_0^\infty P\{X > x\}\,dx + \int_{-\infty}^0 P\{X < x\}\,dx, \tag{2}$$
where the last equality substitutes $y = -x$ in the second integral. From this, however, I cannot conclude (1), because the integrand of the second integral in the last line of (2) is $P\{X < x\}$, which equals $F(x)$ when $X$ is a continuous random variable but need not equal $F(x)$ when $X$ is discrete.
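To make the concern concrete (an illustration of my own): if $P\{X = 1\} = 1$, then $F(1) = P\{X \le 1\} = 1$ while $P\{X < 1\} = 0$, so the two functions genuinely differ at the atoms of $X$.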




What am I missing?
