Wednesday, 25 November 2015

probability - Let $X_n$ be the $n$-th partial sum of i.i.d. centered random variables and $\mathcal{F}_m:=\sigma(X_n,\,n\le m)$; then $\operatorname{E}[X_n\mid\mathcal{F}_m]=X_m$



Let





  • $(\Omega,\mathcal{F},\operatorname{P})$ be a probability space

  • $(Y_i)_{i\in\mathbb{N}}$ be a sequence of i.i.d. random variables $(\Omega,\mathcal{F})\to(\mathbb{R},\mathcal{B}(\mathbb{R}))$ with $\operatorname{E}[Y_i]=0$, and $X_n:=Y_1+\dots+Y_n$

  • $\mathcal{F}_m:=\sigma(X_n,\,n\le m)$ be the smallest $\sigma$-algebra such that $X_1,\dots,X_m$ are measurable with respect to $\mathcal{F}_m$

  • $\operatorname{E}[X_n\mid\mathcal{F}_m]$ denote the conditional expectation of $X_n$ given $\mathcal{F}_m$



Maybe it's because there are too many new concepts for me (conditional expectation, filtrations, ...), but I don't understand why we've got $$\operatorname{E}\left[X_n\mid\mathcal{F}_m\right]=X_m\;\;\;\text{for all }m.$$

Answer




As @aerdna91 pointed out, the identity



$$\operatorname{E}(X_n\mid\mathcal{F}_m)=X_m\tag{1}$$



holds only for $m\le n$. For $m>n$, $X_n$ is $\mathcal{F}_m$-measurable, so we have



$$\operatorname{E}(X_n\mid\mathcal{F}_m)=X_n.$$



To prove (1), we first consider the case $m=n-1$. Then, as $X_n=X_{n-1}+Y_n$,




$$\operatorname{E}(X_n\mid\mathcal{F}_{n-1})=\underbrace{\operatorname{E}(X_{n-1}\mid\mathcal{F}_{n-1})}_{=X_{n-1}}+\operatorname{E}(Y_n\mid\mathcal{F}_{n-1}).$$



Now, since $\mathcal{F}_{n-1}$ and $Y_n$ are independent, the second term equals



$$\operatorname{E}(Y_n\mid\mathcal{F}_{n-1})=\operatorname{E}(Y_n)=0.$$



Hence, we have shown that



$$\operatorname{E}(X_n\mid\mathcal{F}_{n-1})=X_{n-1}.$$




Now (1) follows by iterating this procedure.
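Explicitly, one iteration step uses the tower property of conditional expectation: since $\mathcal{F}_{n-2}\subseteq\mathcal{F}_{n-1}$,

$$\operatorname{E}(X_n\mid\mathcal{F}_{n-2})=\operatorname{E}\bigl(\operatorname{E}(X_n\mid\mathcal{F}_{n-1})\mid\mathcal{F}_{n-2}\bigr)=\operatorname{E}(X_{n-1}\mid\mathcal{F}_{n-2})=X_{n-2},$$

where the last equality is the case $m=n-1$ already proved, applied with $n$ replaced by $n-1$. Repeating this argument $n-m$ times gives (1) for every $m\le n$.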



Remark: The proof shows that $(X_n,\mathcal{F}_n)_{n\in\mathbb{N}}$ is a martingale.
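As a quick numerical sanity check (not part of the proof), one can approximate the conditional expectation by Monte Carlo: fix one realization of $Y_1,\dots,Y_m$ (which pins down $X_m$), then average $X_n$ over many independent continuations $Y_{m+1},\dots,Y_n$. A minimal sketch in Python, assuming $\pm 1$ increments (any centered distribution would do):

```python
import random

random.seed(0)

m, n = 3, 10
num_continuations = 200_000

# Centered i.i.d. increments: Y_i = +1 or -1 with equal probability,
# so E[Y_i] = 0 (an illustrative choice, not forced by the problem).
def increment():
    return random.choice([-1, 1])

# Fix one realization of Y_1, ..., Y_m; this determines X_m.
y_past = [increment() for _ in range(m)]
x_m = sum(y_past)

# Average X_n = X_m + Y_{m+1} + ... + Y_n over independent continuations.
total = 0.0
for _ in range(num_continuations):
    x_n = x_m + sum(increment() for _ in range(n - m))
    total += x_n
estimate = total / num_continuations

print(x_m, round(estimate, 3))  # estimate should be close to x_m
```

The standard error of the estimate is roughly $\sqrt{n-m}/\sqrt{200000}\approx 0.006$, so the printed average lands well within a few hundredths of $X_m$.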

