
probability - What is the intuition behind the Poisson distribution's function?



I'm trying to intuitively understand the Poisson distribution's probability mass function. When $X \sim \mathrm{Pois}(\lambda)$, then $P(X=k)=\frac{\lambda^k e^{-\lambda}}{k!}$, but I don't see the reasoning behind this formula. In other discrete distributions, namely the binomial, geometric, negative binomial, and hypergeometric distributions, I have an intuitive, combinatorics-based understanding of why each distribution's pmf is defined the way it is.




That is, if $Y \sim\mathrm{Bin}(n,p)$ then $P(Y=k)=\binom{n}{k}p^k(1-p)^{n-k}$, and this equation is clear - there are $\binom{n}{k}$ ways to choose the $k$ successful trials, and we need the trials to succeed $k$ times and fail $n-k$ times.
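As a quick sanity check of that combinatorial reading, here is a minimal Python sketch (the function name `binomial_pmf` is just an illustrative choice) that builds the binomial pmf directly from the counting argument:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(Y = k) for Y ~ Bin(n, p), built from the counting argument:
    choose which k of the n trials succeed, then multiply the
    probabilities of k successes and n - k failures."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Example: 10 trials with success probability 0.3
print(binomial_pmf(3, 10, 0.3))  # ~0.2668
```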



What is the corresponding intuition for the Poisson distribution?


Answer



Explanation based on DeGroot, second edition, page 256. Start from the binomial pmf
$$
P(X = k) = {n \choose k}p^k(1-p)^{n-k}
$$



Now define $\lambda = np$ and thus $p = \frac{\lambda}{n}$.




$$
\begin{align}
P(X = k) &= {n \choose k}p^k(1-p)^{n-k}\\
&=\frac{n(n-1)(n-2)\cdots(n-k+1)}{k!}\frac{\lambda^k}{n^k}\left(1-\frac{\lambda}{n}\right)^{n-k}\\
&=\frac{\lambda^k}{k!}\frac{n}{n}\cdot\frac{n-1}{n}\cdots\frac{n-k+1}{n}\left(1-\frac{\lambda}{n}\right)^n\left(1-\frac{\lambda}{n}\right)^{-k}
\end{align}
$$
Let $n \to \infty$ and $p \to 0$ so $np$ remains constant and equal to $\lambda$.




Now
$$
\lim_{n \to \infty}\frac{n}{n}\cdot\frac{n-1}{n}\cdots\frac{n-k+1}{n}\left(1-\frac{\lambda}{n}\right)^{-k} = 1
$$
since each fraction tends to $1$ ($n$ grows at the same rate in numerator and denominator), and in the last factor $\frac{\lambda}{n} \to 0$, so $\left(1-\frac{\lambda}{n}\right)^{-k} \to 1$ for fixed $k$. Furthermore
$$
\lim_{n \to \infty}\left(1-\frac{\lambda}{n}\right)^n = e^{-\lambda}
$$
so under our definitions
$$
\lim_{n \to \infty} {n \choose k}p^k(1-p)^{n-k} = \frac{\lambda^k}{k!}e^{-\lambda}
$$
In other words, as the probability of success becomes a rate applied to a continuum, as opposed to discrete selections, the binomial becomes the Poisson.
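A quick numerical illustration of this limit (a sketch, not part of the original answer): fix $\lambda$, set $p = \lambda/n$, and watch the binomial pmf at a single $k$ approach the Poisson pmf as $n$ grows.

```python
from math import comb, exp, factorial

lam, k = 4.0, 3

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return lam**k * exp(-lam) / factorial(k)

# As n grows with p = lam/n held so that np = lam, the binomial pmf
# approaches the Poisson pmf.
for n in (10, 100, 1000, 100000):
    print(n, binom_pmf(k, n, lam / n))
print("Poisson:", poisson_pmf(k, lam))
```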



Update with key point from comments



Think about a Poisson process. It really is, in a sense, looking at very, very small intervals of time and seeing whether something happened. The "very, very small" comes from the requirement that we see at most one occurrence per interval. So what we have is essentially an infinite number of Bernoulli trials, each with an infinitesimal success probability. A finite number of Bernoulli trials with a fixed success probability gives the binomial; letting the number of trials go to infinity while keeping $np=\lambda$ finite gives the Poisson.
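That picture can be simulated directly. The sketch below (all names are illustrative) chops one unit of time into $n$ tiny slots, flips a Bernoulli($\lambda/n$) coin in each, and compares the distribution of the resulting counts to $\mathrm{Pois}(\lambda)$.

```python
import random
from math import exp, factorial

lam = 4.0        # expected number of events per unit time
n = 1_000        # number of tiny intervals; success probability lam/n each
trials = 10_000  # Monte Carlo repetitions

def simulate_count():
    # One run of the process: at most one event per tiny interval.
    return sum(random.random() < lam / n for _ in range(n))

counts = [simulate_count() for _ in range(trials)]

for k in range(8):
    empirical = counts.count(k) / trials
    poisson = lam**k * exp(-lam) / factorial(k)
    print(f"k={k}: simulated {empirical:.4f}  Poisson {poisson:.4f}")
```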

