Setting
Let $(X_i)_{i\leq N}$ be i.i.d. random variables taking values in some interval $[a,b]$.
Let $Y_{k:N}$ be the $k$th order statistic of this sample, and let $v\in[a,b]$.
Denote by $f_X,F_X$ the continuous pdf and CDF of $X_i$, and by $f_{Y_{k:N}}$ the pdf of $Y_{k:N}$.
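For reference, the pdf of the $k$th order statistic is given by the standard formula
$$
f_{Y_{k:N}}(y)=\frac{N!}{(k-1)!\,(N-k)!}\,F_X(y)^{k-1}\bigl(1-F_X(y)\bigr)^{N-k}f_X(y).
$$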
Quantity of interest
I am interested in the truncated expectation of the order statistic $$E[Y_{k:N}|Y_{k:N}>v].$$
This can be written as $$E[Y_{k:N}|Y_{k:N}>v]=\frac{\int_v^\infty yf_{Y_{k:N}}(y)dy}{\int_v^\infty f_{Y_{k:N}}(y)dy}.$$
Conjecture
Computing this quantity in MATLAB suggests that
$$E[Y_{k:N}|Y_{k:N}>v]\underset{N\rightarrow\infty}{\rightarrow}v.$$
Also, my intuition is in line with this conjecture: as $N$ grows, the mass of $f_{Y_{k:N}}$ concentrates in a small region, so $E[Y_{k:N}|Y_{k:N}>v]$ becomes easier to predict. Furthermore, conditionally on exceeding $v$, the value is likely to be close to $v$.
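For illustration, here is a small Python/SciPy sketch of such a numerical check (not the original MATLAB script; the choices $X_i\sim\mathrm{Uniform}(0,1)$, fixed $k=3$ and $v=0.3$ are mine, with $k$ held fixed as $N$ grows). It uses the identity $P(Y_{k:N}>y)=P(\mathrm{Bin}(N,F(y))\le k-1)$ and the tail formula $E[Y_{k:N}|Y_{k:N}>v]=v+\int_v^b P(Y_{k:N}>y)\,dy\,/\,P(Y_{k:N}>v)$:

```python
# Illustrative check (Python/SciPy), not the original MATLAB script.
# Assumptions: X_i ~ Uniform(0,1), so F(y) = y on [0,1]; fixed k = 3; v = 0.3.
# Uses P(Y_{k:N} > y) = P(Bin(N, F(y)) <= k-1) and
# E[Y | Y > v] = v + \int_v^1 P(Y > y) dy / P(Y > v).
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

k, v = 3, 0.3
ys = np.linspace(v, 1.0, 200_001)                  # integration grid on [v, 1]
for N in (10, 30, 100, 300, 1000):
    surv = stats.binom.cdf(k - 1, N, ys)           # P(Y_{k:N} > y) for each y
    cond_mean = v + trapezoid(surv, ys) / surv[0]  # E[Y_{k:N} | Y_{k:N} > v]
    print(N, cond_mean)
```

The printed conditional means should decrease toward $v=0.3$ as $N$ grows.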
However, I am missing a formal proof.
Any ideas?
Answer
We need to assume something. Assume $E|X| < \infty$ (automatic here, since the $X_i$ are bounded) and that $F$ is strictly increasing at $v$, i.e., for all $u>v$ we have $F(u) > F(v)$.
For $u > v$ we have,
$$
P(Y_{k:n} > u | Y_{k:n} > v) = \frac{P(Y_{k:n}>u)}{P(Y_{k:n}>v)}.
$$
Now $P(Y_{k:n}>x)$ is the probability that, out of $n$ tries, at most $k-1$ of the $X_i$ are at or below $x$. So if $N_{n,x} \sim \mathrm{Bin}(n,F(x))$ (binomially distributed), we have,
$$
P(Y_{k:n}>x) = P(N_{n,x} < k).
$$
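Explicitly, writing out this binomial probability,
$$
P(N_{n,x}<k)=\sum_{j=0}^{k-1}\binom{n}{j}F(x)^{j}\bigl(1-F(x)\bigr)^{n-j}.
$$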
Now this probability is nonincreasing in $x$, and it is not hard to see that, for a fixed $k$ (fixed as $n\to\infty$), we can write
$$
P(N_{n,x} < k) = C(x,n)\,n^{k-1}\bigl(1-F(x)\bigr)^{n-k},
$$ with $C(x,n)$ bounded uniformly in $x$ and $n$; in fact $0\le C(x,n)\le k$.
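To spell out the "not hard to see" step: for every $0\le j\le k-1$ we have $\binom{n}{j}\le n^{j}\le n^{k-1}$, $F(x)^{j}\le 1$ and $(1-F(x))^{n-j}\le(1-F(x))^{n-k}$, so each of the $k$ summands above is at most $n^{k-1}(1-F(x))^{n-k}$; summing them gives $0\le C(x,n)\le k$.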
Hence,
$$
\frac{P(Y_{k:n}>u)}{P(Y_{k:n}>v)} = \frac{P(N_{n,u}<k)}{P(N_{n,v}<k)} = \frac{C(u,n)}{C(v,n)}\,p^{\,n-k},
\qquad p:=\frac{1-F(u)}{1-F(v)},
$$ if $0<F(v)<1$ (the case $F(v)=1$ is vacuous, since then $P(Y_{k:n}>v)=0$), with $0\leq p <1$ due to the fact that $F$ is strictly increasing at $v$, so $F(u)>F(v)$. Moreover $C(u,n)\le k$ and $C(v,n)\ge (1-F(v))^{k}n^{-(k-1)}$ (keeping only the $j=0$ term), so the whole ratio is at most $k\,(1-F(v))^{-k}\,n^{k-1}p^{\,n-k}$. Hence, this goes to zero as $n$ goes to infinity, the exponential factor beating the polynomial one. If $F(v)=0$, then we note $E(Y_{k:n}|Y_{k:n}>v)=E(Y_{k:n})$, and it is enough to observe that we still have $C(u,n)\le k$, so that for every $u>v$ (for which $F(u)>0$ by the strict increase at $v$),
$$
P(Y_{k:n}>u) = P(N_{n,u}<k) \le k\,n^{k-1}\bigl(1-F(u)\bigr)^{n-k}\xrightarrow[n\to\infty]{}0.
$$
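As a quick numerical illustration of this ratio (again only a sketch, with the arbitrary choices $F(x)=x$ on $[0,1]$, $k=3$, $v=0.3$, $u=0.35$), one can compare $P(Y_{k:n}>u|Y_{k:n}>v)$ with the exponential factor $p^{\,n-k}$:

```python
# Sketch: P(Y_{k:n} > u | Y_{k:n} > v) = P(N_{n,u} < k) / P(N_{n,v} < k),
# with N_{n,x} ~ Bin(n, F(x)).  Illustrative choices: F(x) = x, k = 3,
# v = 0.3, u = 0.35, hence p = (1 - F(u)) / (1 - F(v)) = 0.65 / 0.7.
from scipy import stats

k, v, u = 3, 0.3, 0.35
p = (1 - u) / (1 - v)
for n in (10, 100, 1000):
    num = stats.binom.cdf(k - 1, n, u)   # P(N_{n,u} < k)
    den = stats.binom.cdf(k - 1, n, v)   # P(N_{n,v} < k)
    print(n, num / den, p ** (n - k))    # both tend to zero as n grows
```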
These estimates show that, conditionally on $\{Y_{k:n}>v\}$, the probability mass concentrates just above $v$: for every $u>v$ the conditional probability of exceeding $u$ goes to zero, and since $E|X|<\infty$ the contribution of the tail to the expectation also vanishes. We are left with essentially a point mass at $v$, and the conditional expectation indeed converges to $v$.
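To make this last step explicit (using here that the $X_i$, and hence $Y_{k:n}$, take values in $[a,b]$, which controls the tail term directly; with unbounded support one would instead use $E|X|<\infty$): for any fixed $u>v$,
$$
v \;\le\; E[Y_{k:n}|Y_{k:n}>v] \;\le\; u+(b-u)\,P(Y_{k:n}>u|Y_{k:n}>v) \xrightarrow[n\to\infty]{} u,
$$
and since $u>v$ was arbitrary, $E[Y_{k:n}|Y_{k:n}>v]\to v$.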