I couldn't find a better title, but basically you are given some values $x_1,\ldots,x_n$ and some weights $p_1,\ldots,p_n$ (with $x_k\in\mathbb{R}$, $p_k\in[0,1]$, and $p_1+\ldots+p_n=1$).
You now calculate the weighted arithmetic mean of the squares of these values:
$$W_1=\sum^n_{k=1}x_k^2p_k$$
And also the square of the weighted mean of the values:
$$W_2=\left(\sum^n_{k=1}x_kp_k\right)^2$$
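Before looking for a proof, it is easy to check the claim $W_1 \geq W_2$ numerically on random data. Here is a minimal sketch in Python; the names `weighted_moments`, `x`, and `p` are mine, not from the question:

```python
# Numerical sanity check that W1 >= W2 for random weighted samples.
import random

def weighted_moments(x, p):
    """Return W1 = sum x_k^2 p_k and W2 = (sum x_k p_k)^2."""
    w1 = sum(xk * xk * pk for xk, pk in zip(x, p))
    w2 = sum(xk * pk for xk, pk in zip(x, p)) ** 2
    return w1, w2

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 6)
    x = [random.uniform(-10, 10) for _ in range(n)]
    raw = [random.random() for _ in range(n)]
    total = sum(raw)
    p = [r / total for r in raw]  # normalize so the weights sum to 1
    w1, w2 = weighted_moments(x, p)
    assert w1 >= w2 - 1e-9  # tiny slack for floating-point rounding
```

Of course this only gives evidence, not a proof, but it rules out a misstatement of the inequality.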
Now I know that $W_1\geq W_2$, but I am not able to prove this. I was only able to transform the inequality a bit, arriving at:
$$\sum^n_{k=1}x_k^2(p_k-p_k^2) \geq 2\sum^n_{k=1}\sum_{m=k+1}^nx_kx_mp_kp_m$$
I'm really stuck here and don't know how to proceed (or even if this inequality helps me or not).
The background of this question is the well-known formula for the variance of a discrete random variable $X$ (which boils down to the problem I described):
$$V(X)=E(X^2)-(E(X))^2$$
And because $V(X)\geq 0$, it follows that $E(X^2)\geq (E(X))^2$. But I tried to find a convincing proof of this statement without using the definition of variance and this formula. Yes, I used Google and Wikipedia, but neither could help me.
I hope someone can give me some hints on how to solve this, or maybe even a complete proof or a reference to one; I would really appreciate it. :)
Answer
A direct proof would be to observe that for any real number $a$,
$$
0 \le \sum^n_{k=1} (x_k - a)^2 p_k = \sum^n_{k=1} x_k^2 p_k - 2 a \sum^n_{k=1} x_k p_k + a^2 \, .
$$
Then set $a = \sum^n_{k=1} x_k p_k$ to obtain $0 \le W_1 - W_2$.
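The identity behind this proof can be checked directly: the quadratic $f(a) = \sum_k (x_k - a)^2 p_k$ is nonnegative for every $a$, and evaluating it at $a = \sum_k x_k p_k$ yields exactly $W_1 - W_2$. A small sketch with made-up sample data (the names `f`, `mean`, `w1`, `w2` are mine):

```python
# Check that f(a) = sum_k (x_k - a)^2 p_k, evaluated at the weighted
# mean a = sum_k x_k p_k, equals W1 - W2.
x = [1.0, 2.0, 5.0]
p = [0.2, 0.5, 0.3]

def f(a):
    return sum(pk * (xk - a) ** 2 for xk, pk in zip(x, p))

mean = sum(xk * pk for xk, pk in zip(x, p))     # a = E(X)
w1 = sum(xk * xk * pk for xk, pk in zip(x, p))  # E(X^2)
w2 = mean ** 2                                  # (E(X))^2

assert abs(f(mean) - (w1 - w2)) < 1e-9  # f(mean) = W1 - W2
assert f(mean) >= 0                     # hence W1 - W2 >= 0
```

Since $f$ is a sum of squares, $f(\text{mean}) \geq 0$ holds for any choice of data, which is the whole content of the proof.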
This is actually the same as the Cauchy–Schwarz inequality,
applied to the vectors
$$
(x_1 \sqrt{p_1}, \ldots, x_n \sqrt{p_n})
$$
and
$$
(\sqrt{p_1}, \ldots, \sqrt{p_n}) \, .
$$
Indeed, since $p_1 + \ldots + p_n = 1$, Cauchy–Schwarz gives
$$
W_2 = \left(\sum^n_{k=1} x_k\sqrt{p_k} \cdot \sqrt{p_k}\right)^2 \le \left(\sum^n_{k=1} x_k^2 p_k\right)\left(\sum^n_{k=1} p_k\right) = W_1 \, .
$$