Monday 20 April 2015

probability theory - Does there exist a multivariate inverse?

Let $F:\mathbb{R}\rightarrow \mathbb{R}$ be a distribution function (CDF).



In this case, we can define the generalized inverse $F^{-1}(u)=\inf\{x\in\mathbb{R}:F(x)\ge u\}$, and $X=F^{-1}$ is a random variable on $(0,1)$ (with Lebesgue measure) such that $F_X=F$.




Hence, every distribution function (CDF) can be viewed as the CDF of a random variable on $(0,1)$.
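For reference, here is a sketch of why the one-dimensional construction works, with $F^{-1}$ the generalized inverse defined above and $\lambda$ Lebesgue measure on $(0,1)$. By the right-continuity and monotonicity of $F$,

$$F^{-1}(u)\le x \iff u\le F(x)\qquad\text{for all }u\in(0,1),\ x\in\mathbb{R},$$

and therefore

$$F_X(x)=\lambda\{u\in(0,1):F^{-1}(u)\le x\}=\lambda\{u\in(0,1):u\le F(x)\}=F(x).$$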



Is there an analogous result for joint distribution functions (CDFs)?



That is, for a fixed $n$, does there exist a probability space $(\Omega,\mathscr{F},P)$ such that every joint distribution function $F:\mathbb{R}^n\rightarrow \mathbb{R}$ is $F_X$ for some $n$-dimensional random vector $X$ on $(\Omega,\mathscr{F},P)$?

