Let $X$ be a non-negative random variable and $F_{X}$ the corresponding CDF. Show,
$$E(X) = \int_0^\infty (1-F_X (t)) \, dt$$
when $X$ has: (a) a discrete distribution, (b) a continuous distribution.
For the continuous case I noted that since $F_X (t) = \mathbb{P}(X\leq t)$, we have $1-F_X (t) = 1- \mathbb{P}(X\leq t) = \mathbb{P}(X> t)$, but I don't see how integrating that expression helps.
Answer
For every nonnegative random variable $X$, whether discrete or continuous or a mix of these,
$$
X=\int_0^X\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\gt t}\,\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\geqslant t}\,\mathrm dt,
$$
hence, taking expectations on both sides and interchanging $\mathrm E$ and the integral by Tonelli's theorem (the integrand is nonnegative),
$$
\mathrm E(X)=\int_0^{+\infty}\mathrm P(X\gt t)\,\mathrm dt=\int_0^{+\infty}\mathrm P(X\geqslant t)\,\mathrm dt.
$$
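As a quick sanity check of this identity (not part of the original argument), take $X\sim\operatorname{Exp}(\lambda)$, so that $\mathrm P(X\gt t)=e^{-\lambda t}$; then
$$
\int_0^{+\infty}\mathrm P(X\gt t)\,\mathrm dt=\int_0^{+\infty}e^{-\lambda t}\,\mathrm dt=\frac1\lambda=\mathrm E(X).
$$
For a discrete illustration, if $X$ takes values in $\{0,1,2,\dots\}$, then $\mathrm P(X\gt t)=\mathrm P(X\geqslant k+1)$ for $t\in[k,k+1)$, hence
$$
\int_0^{+\infty}\mathrm P(X\gt t)\,\mathrm dt=\sum_{k=1}^{+\infty}\mathrm P(X\geqslant k)=\sum_{k=1}^{+\infty}\sum_{j=k}^{+\infty}\mathrm P(X=j)=\sum_{j=1}^{+\infty}j\,\mathrm P(X=j)=\mathrm E(X).
$$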
Likewise, for every $p>0$, $$
X^p=\int_0^Xp\,t^{p-1}\,\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\gt t}\,p\,t^{p-1}\,\mathrm dt=\int_0^{+\infty}\mathbf 1_{X\geqslant t}\,p\,t^{p-1}\,\mathrm dt,
$$
hence
$$
\mathrm E(X^p)=\int_0^{+\infty}p\,t^{p-1}\,\mathrm P(X\gt t)\,\mathrm dt=\int_0^{+\infty}p\,t^{p-1}\,\mathrm P(X\geqslant t)\,\mathrm dt.
$$
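For instance (again only an illustration, not part of the original answer), with $X\sim\operatorname{Exp}(\lambda)$ and $p=2$,
$$
\mathrm E(X^2)=\int_0^{+\infty}2t\,e^{-\lambda t}\,\mathrm dt=\frac2{\lambda^2},
$$
which agrees with the usual computation via the density.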