Suppose $X:\Omega\to\mathbb{R}$ is a random variable on $(\Omega,\mathcal{F},P)$ and $X':\Omega'\to\mathbb{R}$ is a random variable on $(\Omega',\mathcal{F}',P')$. Assume that $P_X = P'_{X'}$ (the distribution of $X$ is equal to the distribution of $X'$). Is it true that
$$\int_\Omega X\,dP = \int_{\Omega'} X'\,dP'?$$
That is, is the $P$-expectation of $X$ equal to the $P'$-expectation of $X'$?
Intuitively, this ought to be true, but how can I show it formally?
I tried the approach where you first show this for indicator functions, then for positive functions, etc., but this doesn't work directly because we are working on different probability spaces.
Maybe I can argue in the following way, if $X \ge 0$:
$$\int_\Omega X\,dP = \int_0^\infty P(X\ge t)\,dt = \int_0^\infty P'(X'\ge t)\,dt = \int_{\Omega'} X'\,dP',$$
and in the general case, the result then follows if we can prove that $X^+ = X\,I_{\{X\ge 0\}}$ and $(X')^+ = X'\,I_{\{X'\ge 0\}}$ have equal distributions (and similarly for $X^-$ and $(X')^-$).
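One possible way to settle that last point, sketched here for completeness (the map $\varphi$ is introduced only for this sketch): both $X^+$ and $(X')^+$ are images of $X$ and $X'$ under the same Borel map $\varphi(x) = \max(x,0)$, and equality of distributions is preserved under such a map, since for every Borel set $B\subseteq\mathbb{R}$
$$P_{X^+}(B) = P\big(X \in \varphi^{-1}(B)\big) = P_X\big(\varphi^{-1}(B)\big) = P'_{X'}\big(\varphi^{-1}(B)\big) = P'\big(X' \in \varphi^{-1}(B)\big) = P'_{(X')^+}(B).$$
The same argument with $\varphi(x) = \max(-x,0)$ handles $X^-$ and $(X')^-$, and taking $B = [t,\infty)$ with $\varphi$ the identity gives the middle equality $P(X\ge t) = P'(X'\ge t)$ used in the display above.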
Any ideas?
Answer
By the change-of-variables theorem, $EX = \int_{\mathbb{R}} x\,dP_X(x)$ and $EX' = \int_{\mathbb{R}} x\,dP'_{X'}(x)$, and $P_X = P'_{X'}$ by assumption, so the answer is YES: $EX$ exists iff $EX'$ exists, and they are equal whenever they exist.
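For completeness, a sketch of the change-of-variables identity invoked here (the functions $g$ and sets $B$ below are introduced only for this sketch); note that the standard machine is run over functions of $X$, which is what sidesteps the issue of the two different underlying spaces. For an indicator,
$$\int_\Omega I_B(X)\,dP = P(X\in B) = P_X(B) = \int_{\mathbb{R}} I_B\,dP_X, \qquad B\in\mathcal{B}(\mathbb{R}),$$
and this extends by linearity to simple functions $g$, by monotone convergence to Borel $g\ge 0$, and via $g = g^+ - g^-$ to all Borel $g$ with $\int_\Omega |g(X)|\,dP<\infty$, giving $\int_\Omega g(X)\,dP = \int_{\mathbb{R}} g\,dP_X$. Taking $g(x)=x$ yields $EX = \int_{\mathbb{R}} x\,dP_X(x)$, and the same identity on $(\Omega',\mathcal{F}',P')$ yields the expression for $EX'$.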