Saturday, 21 May 2016

probability - Expectation of a continuous random variable explained in terms of the CDF



Problem:



Let $F_X(x)$ be the CDF of a continuous random variable $X$. Show that:



$$E[X]= \int_0^\infty(1-F_X(x)) \, dx -\int_{-\infty}^0F_X(x) \, dx.$$



Attempt:




A comprehensible explanation of the intuition regarding the expectation $E[X]$ and the CDF for a non-negative random variable is found here: Intuition behind using complementary CDF to compute expectation for nonnegative random variables.



However, I am still at a loss about how to show the general case when $-\infty < x < \infty$.



I am solving this as an exercise in my probability course, and any help is greatly appreciated!


Answer



You can also see this by interchanging the order of integration. We have



\begin{eqnarray*}
\int_{0}^{\infty}(1-F_{X}(x))\,dx=\int_{0}^{\infty}P(X>x)\,dx & = & \int_{0}^{\infty}\int_{x}^{\infty}dF_{X}(t)\,dx\\
 & = & \int_{0}^{\infty}\int_{0}^{t}dx\,dF_{X}(t)\\
 & = & \int_{0}^{\infty}t\,dF_{X}(t)
\end{eqnarray*}
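(A brief justification of the interchange above, not spelled out in the original answer: both iterated integrals run over the same region $\{(x,t):0\le x<t<\infty\}$ of the $(x,t)$-plane, and Tonelli's theorem applies since the integrand is nonnegative. As a sketch,
$$
\int_{0}^{\infty}\int_{x}^{\infty}dF_{X}(t)\,dx
=\iint_{\{0\le x<t\}}dx\,dF_{X}(t)
=\int_{0}^{\infty}\int_{0}^{t}dx\,dF_{X}(t).
$$
The same reasoning, applied to the region $\{(x,t):t\le x\le 0\}$, justifies the interchange in the next computation.)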
and,
\begin{eqnarray*}
\int_{-\infty}^{0}F_{X}(x)\,dx=\int_{-\infty}^{0}P(X\leq x)\,dx & = & \int_{-\infty}^{0}\int_{-\infty}^{x}dF_{X}(t)\,dx\\
 & = & \int_{-\infty}^{0}\int_{t}^{0}dx\,dF_{X}(t)\\
 & = & -\int_{-\infty}^{0}t\,dF_{X}(t)
\end{eqnarray*}
Since

$$
E[X]=\int_{-\infty}^{0}t\;dF_{X}(t)+\int_{0}^{\infty}t\;dF_{X}(t)
$$
the result follows.
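As a quick sanity check (not part of the original argument), the identity can be verified directly for a concrete case, say $X$ uniform on $[-1,2]$, where $E[X]=\tfrac{1}{2}$ and $F_X(x)=\tfrac{x+1}{3}$ for $-1\le x\le 2$:
$$
\int_{0}^{\infty}(1-F_X(x))\,dx=\int_{0}^{2}\frac{2-x}{3}\,dx=\frac{2}{3},
\qquad
\int_{-\infty}^{0}F_X(x)\,dx=\int_{-1}^{0}\frac{x+1}{3}\,dx=\frac{1}{6},
$$
and indeed $\tfrac{2}{3}-\tfrac{1}{6}=\tfrac{1}{2}=E[X]$.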


