I'm trying to find two random variables X and Y that are not independent, i.e. there exist a and b such that
P(X=a ∧ Y=b) ≠ P(X=a)P(Y=b),
but for which E[XY] = E[X]E[Y].
I'm trying to understand some of the examples of this that I have seen, but they almost always define two random variables X and Y either in terms of each other, or in terms of a third random variable. This seems to be at odds with the formal definition of a random variable that I am accustomed to. That is, if we have a probability space (Ω,P), then a random variable X is a mapping X:Ω→R (or some other space, but let's use R for simplicity).
So to say something like: Ω={−1,0,1}, P is uniform, X(a)=a for all a∈Ω, and Y=X², doesn't seem like an example to me, since Y does not formally meet the definition of a random variable.
I'm also unsure of what values I would even sum over in computing E[XY] in this case. Any help would be great.
Answer
Your example is fine. Y is a random variable, because Y is a mapping from the probability space to R given by Y(ω)=X²(ω)=ω² for ω∈Ω. In general, composing a random variable with a (measurable) function yields another random variable, which is why most of the examples you have seen are legitimate. As for E[XY]: on a finite sample space you sum over the points of Ω, so E[XY] = Σ_{ω∈Ω} X(ω)Y(ω)P({ω}) = (1/3)((−1)·1 + 0·0 + 1·1) = 0, and since E[X] = 0 we indeed get E[XY] = 0 = E[X]E[Y], even though X and Y are dependent (e.g. P(X=1 ∧ Y=0) = 0 ≠ (1/3)(1/3) = P(X=1)P(Y=0)).
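Here is a small sanity check of that example in Python (standard library only), summing over the three points of Ω exactly as above; the dictionaries p, X, Y are just one way to encode the measure and the two mappings:

```python
from fractions import Fraction

# Sample space Omega = {-1, 0, 1} with the uniform measure
omega = [-1, 0, 1]
p = {w: Fraction(1, 3) for w in omega}

# X(w) = w and Y(w) = w^2, both mappings Omega -> R
X = {w: w for w in omega}
Y = {w: w * w for w in omega}

# Expectations are sums over the points of the sample space
E_X = sum(p[w] * X[w] for w in omega)
E_Y = sum(p[w] * Y[w] for w in omega)
E_XY = sum(p[w] * X[w] * Y[w] for w in omega)

print(E_XY == E_X * E_Y)  # True: X and Y are uncorrelated

# But X and Y are not independent:
# P(X=1 and Y=0) = 0, while P(X=1) * P(Y=0) = 1/9
P_X1_Y0 = sum(p[w] for w in omega if X[w] == 1 and Y[w] == 0)
P_X1 = sum(p[w] for w in omega if X[w] == 1)
P_Y0 = sum(p[w] for w in omega if Y[w] == 0)
print(P_X1_Y0 != P_X1 * P_Y0)  # True: not independent
```

Using Fraction keeps the arithmetic exact, so the equality E[XY] = E[X]E[Y] is checked without floating-point error.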