Friday 3 January 2020

probability - Showing $\int_0^{\infty}(1-F_X(x))\,dx=E(X)$ in both discrete and continuous cases



OK, according to some notes I have, the following is true for a random variable $X$ that can only take on non-negative values, i.e. $P(X<0)=0$:



$\int_0^{\infty}(1-F_X(x))dx=\int_0^{\infty}P(X>x)dx$



$=\int_0^{\infty}\int_x^{\infty}f_X(y)dydx$




$=\int_0^{\infty}\int_0^{y}dx\,f_X(y)\,dy$



$=\int_0^{\infty}yf_X(y)dy=E(X)$
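As a quick sanity check of the identity (not part of the original notes), one can compare $\int_0^{\infty}(1-F_X(x))\,dx$ with the known mean for a concrete distribution; the exponential below is just an illustrative choice.

```python
# Numeric sanity check of E(X) = ∫_0^∞ (1 - F_X(x)) dx
# Example distribution: Exponential with rate 2, chosen only for illustration.
import numpy as np
from scipy import integrate, stats

rate = 2.0
dist = stats.expon(scale=1.0 / rate)  # F_X(x) = 1 - exp(-rate * x) for x >= 0

# Integrate the survival function 1 - F_X(x) = P(X > x) over [0, ∞).
tail_integral, _ = integrate.quad(lambda x: dist.sf(x), 0, np.inf)

print(tail_integral)  # ≈ 0.5
print(dist.mean())    # exactly 1/rate = 0.5
```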



I'm not seeing the steps here clearly. The first line is obvious, and the second makes sense to me, since the probability of a random variable exceeding a given value is just the density integrated from that value to infinity.



Where I'm lost is why
$\int_0^{\infty}\int_x^{\infty}f_X(y)\,dy\,dx=\int_0^{\infty}\int_0^{y}f_X(y)\,dx\,dy$



Also, doesn't the last line equal $E(Y)$ and not $E(X)$?




How would we extend this to the discrete case, where the pmf is defined only for values of X in the non-negative integers?



Thank you


Answer



The region of integration for the double integral is $x,y \geq 0$ and $y \geq x$. If you express this integral by first integrating with respect to $y$, then the region of integration for $y$ is $[x, \infty)$. However if you exchange the order of integration and first integrate with respect to $x$, then the region of integration for $x$ is $[0,y]$. The reason why you get $E(X)$ and not something like $E(Y)$ is that $y$ is just a dummy variable of integration, whereas $X$ is the actual random variable that defines $f_X$.
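To address the discrete case raised in the question: for $X$ taking values in the non-negative integers, writing $P(X>k)=\sum_{j>k}p_X(j)$ and exchanging the order of summation (the discrete analogue of swapping the double integral above) gives $E(X)=\sum_{k=0}^{\infty}P(X>k)$. A minimal numeric sketch of this, using a Poisson pmf purely as an example (not part of the original answer):

```python
# Discrete analogue: for X in {0, 1, 2, ...},
#   E(X) = sum_{k=0}^∞ P(X > k),
# obtained by exchanging the order of summation in sum_k sum_{j>k} p_X(j).
# Example pmf: Poisson with mean 3 (illustrative choice only).
from scipy import stats

dist = stats.poisson(mu=3.0)

# Truncate the infinite sum where the tail probability is negligible.
tail_sum = sum(dist.sf(k) for k in range(200))  # sf(k) = P(X > k)

print(tail_sum)     # ≈ 3.0
print(dist.mean())  # exactly 3.0
```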

