
Poisson Convergence

This post summarizes part of Durrett's textbook.


We begin with two lemmas that will be used repeatedly in the proofs below.

Lemma 3.4.3 Let $z_1,\dots,z_n$ and $w_1,\dots,w_n$ be complex numbers of modulus $\leq \theta$. Then

$$\left| \prod^n_{m=1}z_m - \prod^n_{m=1}w_m \right| \leq \theta^{n-1} \sum\limits^n_{m=1} |z_m - w_m|$$ 

Proof )

The proof is by induction on $n$; the case $n=1$ is immediate. For the inductive step,

$$\left| \prod^n_{m=1}z_m - \prod^n_{m=1}w_m \right| \leq \left| z_1 \prod^n_{m=2}z_m - z_1 \prod^n_{m=2}w_m \right| + \left| z_1 \prod^n_{m=2}w_m - w_1 \prod^n_{m=2}w_m \right| \leq \theta \left| \prod^n_{m=2}z_m - \prod^n_{m=2}w_m \right| + \theta^{n-1} | z_1 - w_1 |,$$

and applying the induction hypothesis to the first term gives the claim.

$$\tag*{$\blacksquare$}$$

Lemma 3.4.4 If $b$ is a complex number with $|b| \leq 1$ then $|e^b-1-b| \leq |b|^2$

Proof )

This follows from the Taylor expansion of $e^b$: since $|b| \leq 1$,

$$|e^b - 1 - b| = \left| \sum\limits_{k \geq 2} \frac{b^k}{k!} \right| \leq |b|^2 \sum\limits_{k \geq 2} \frac{1}{k!} = (e-2)|b|^2 \leq |b|^2$$

$$\tag*{$\blacksquare$}$$
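As a quick numerical sanity check of the two lemmas (not from the text; the random test cases and the script are my own illustration), the following Python sketch verifies both inequalities on randomly drawn complex numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

def check_lemmas(n=8, theta=1.0, trials=5_000):
    """Verify Lemmas 3.4.3 and 3.4.4 numerically on random complex inputs."""
    for _ in range(trials):
        # Lemma 3.4.3: draw z_m, w_m uniformly from the disk of radius theta.
        z = theta * np.sqrt(rng.uniform(size=n)) * np.exp(2j * np.pi * rng.uniform(size=n))
        w = theta * np.sqrt(rng.uniform(size=n)) * np.exp(2j * np.pi * rng.uniform(size=n))
        lhs = abs(np.prod(z) - np.prod(w))
        rhs = theta ** (n - 1) * np.sum(np.abs(z - w))
        assert lhs <= rhs + 1e-12

        # Lemma 3.4.4: draw b uniformly from the unit disk.
        b = np.sqrt(rng.uniform()) * np.exp(2j * np.pi * rng.uniform())
        assert abs(np.exp(b) - 1 - b) <= abs(b) ** 2 + 1e-12

check_lemmas()
print("both inequalities held on all random test cases")
```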

Theorem 3.6.1 For each $n$ let $X_{n,m},1\leq m \leq n $ be independent random variables with $P(X_{n,m} = 1)=p_{n,m}, P(X_{n,m}=0)=1-p_{n,m}$. Suppose

$ (i) \sum\limits^n_{m=1}p_{n,m} \rightarrow \lambda \in (0,\infty) $

$ (ii) \max_{1\leq m \leq n}p_{n,m} \rightarrow 0 $

If $S_n = X_{n,1} + \dots + X_{n,n}$, then $S_n \Rightarrow Z$ where $Z$ is $Poisson(\lambda)$.

Proof ) 

Let $\varphi_{n,m} (t) = E[\exp(itX_{n,m})] = 1-p_{n,m} + p_{n,m}e^{it}$ be the characteristic function of $X_{n,m}$.

Applying Lemma 3.4.3 with $z_m = \exp(p_{n,m}(e^{it}-1))$, $w_m = 1+p_{n,m}(e^{it}-1)$, and $\theta = 1$ (both have modulus $\leq 1$),

\begin{align} & \left|\exp\left(\sum\limits^n_{m=1} p_{n,m} (e^{it}-1)\right) - \prod^n_{m=1} \left(1+p_{n,m}(e^{it}-1)\right)\right|  \\ &\leq \sum\limits^n_{m=1}  \left|\exp\left(p_{n,m}(e^{it}-1)\right) - 1- p_{n,m}(e^{it}-1)\right| \\ &\leq \sum\limits^n_{m=1} p_{n,m}^2 |e^{it}-1|^2 \\ &\leq 4 \left( \max_{1\leq m \leq n}p_{n,m} \right) \sum\limits^n_{m=1}p_{n,m} \rightarrow 0 \end{align}

Here the second inequality uses Lemma 3.4.4, which applies because $|p_{n,m}(e^{it}-1)| \leq 2\max_{1\leq m \leq n}p_{n,m} \leq 1$ for all large $n$ by (ii). Since (i) gives $\sum\limits^n_{m=1} p_{n,m}(e^{it}-1) \rightarrow \lambda(e^{it}-1)$, it follows that

$$E[e^{itS_n}] = \prod^n_{m=1}\varphi_{n,m}(t) \rightarrow \exp(\lambda(e^{it}-1)),$$

which is the characteristic function of $Poisson(\lambda)$, so the continuity theorem gives $S_n \Rightarrow Z$.

Q.E.D
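To see the theorem in action, here is a small simulation (my own illustration, not from the text; the constants $\lambda = 3$, $n = 200$, the number of repetitions, and the unequal weights are arbitrary choices). It draws independent Bernoulli($p_{n,m}$) variables whose success probabilities are individually small but sum to $\lambda$, and compares the empirical distribution of $S_n$ with the $Poisson(\lambda)$ pmf.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(1)

lam, n, reps = 3.0, 200, 50_000

# Unequal success probabilities p_{n,m} that sum to lam; max(p) is O(1/n).
weights = rng.uniform(0.5, 1.5, size=n)
p = lam * weights / weights.sum()

# S_n = sum of independent Bernoulli(p_{n,m}), repeated `reps` times.
S = (rng.uniform(size=(reps, n)) < p).sum(axis=1)

for k in range(8):
    empirical = np.mean(S == k)
    poisson_pmf = exp(-lam) * lam ** k / factorial(k)
    print(f"k={k}: empirical {empirical:.4f} vs Poisson {poisson_pmf:.4f}")
```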

The result can be generalized from Bernoulli variables to non-negative integer valued random variables, as in the theorem below.

Theorem 3.7.1

Let $X_{n,m}, 1\leq m \leq n$ be independent non-negative integer valued random variables with $P(X_{n,m}=1)=p_{n,m}$, $P(X_{n,m} \geq 2) = \epsilon_{n,m}$. Suppose

$(i) \sum\limits^n_{m=1}p_{n,m} \rightarrow \lambda \in (0,\infty)$

$(ii) \max_{1\leq m \leq n } p_{n,m} \rightarrow 0$

$(iii) \sum\limits^n_{m=1} \epsilon_{n,m} \rightarrow 0$

If $S_n = X_{n,1} + \dots + X_{n,n}$, then $S_n \Rightarrow Z$ where $Z$ is $Poisson(\lambda)$.
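Condition (iii) is what allows a reduction to Theorem 3.6.1: the probability that any $X_{n,m}$ takes a value $\geq 2$ is at most $\sum_m \epsilon_{n,m} \rightarrow 0$, so $S_n$ agrees with the corresponding Bernoulli sum with probability tending to $1$. The following sketch checks this numerically (my own illustration; the choices $p_{n,m} = \lambda/n$ and $\epsilon_{n,m} = 1/n^2$ are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(2)

lam, reps = 2.0, 20_000

for n in (50, 200, 500):
    p, eps = lam / n, 1.0 / n ** 2      # P(X = 1) and P(X = 2); sum(eps) = 1/n -> 0
    u = rng.uniform(size=(reps, n))
    # Each X_{n,m} is 1 w.p. p, 2 w.p. eps, and 0 otherwise.
    X = np.where(u < p, 1, np.where(u < p + eps, 2, 0))
    S = X.sum(axis=1)
    S_bernoulli = (X == 1).sum(axis=1)  # the truncated sum covered by Theorem 3.6.1
    print(f"n={n}: P(S_n != Bernoulli sum) ~ {np.mean(S != S_bernoulli):.4f} "
          f"(bound sum eps = {n * eps:.4f})")
```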

Example

Let $N(s,t)$ be the number of arrivals in the interval $(s,t]$, and suppose:

$(i)$ the numbers of arrivals in disjoint intervals are independent

$(ii)$ the distribution of $N(s,t)$ only depends on $t-s$

$(iii)$ $P(N(0,h)=1) = \lambda h + o(h)$

$(iv) P(N(0,h) \geq 2) = o(h) $ 

Theorem 3.7.2 

If $(i)-(iv)$ hold then $N(0,t)$ has a Poisson distribution with mean $\lambda t$.

Proof)

It suffices to verify the hypotheses of Theorem 3.7.1 for the decomposition

$$N(0,t) = N\left(0,\tfrac{t}{n}\right) + N\left(\tfrac{t}{n},\tfrac{2t}{n}\right) + \dots + N\left(\tfrac{(n-1)t}{n},t\right),$$

whose summands are independent by (i) and identically distributed by (ii).

We check (i) of Theorem 3.7.1 first. By (iii), $p_{n,m} = P\left(N\left(0,\tfrac{t}{n}\right)=1\right) = \lambda\tfrac{t}{n} + o\!\left(\tfrac{1}{n}\right)$, so

$$\sum\limits_m p_{n,m} = \lambda t + n \cdot o\!\left(\tfrac{1}{n}\right) \rightarrow \lambda t.$$

The remaining conditions follow just as easily: $\max_m p_{n,m} = \lambda\tfrac{t}{n} + o\!\left(\tfrac{1}{n}\right) \rightarrow 0$, and by (iv), $\sum\limits_m \epsilon_{n,m} = n \cdot o\!\left(\tfrac{1}{n}\right) \rightarrow 0$.

Q.E.D
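Theorem 3.7.2 can also be checked by simulation: an arrival process with i.i.d. Exponential($\lambda$) interarrival times satisfies (i)-(iv), and the distribution of its count $N(0,t)$ should match $Poisson(\lambda t)$. The sketch below is my own illustration; the constants $\lambda = 1.5$, $t = 2$, and the cap of 30 interarrival gaps (far more than are ever needed here) are arbitrary.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)

lam, t, reps = 1.5, 2.0, 100_000

# Arrival times are cumulative sums of i.i.d. Exponential(lam) gaps;
# N(0, t) counts how many arrivals fall in (0, t].
gaps = rng.exponential(scale=1.0 / lam, size=(reps, 30))
arrival_times = gaps.cumsum(axis=1)
N = (arrival_times <= t).sum(axis=1)

mean = lam * t
for k in range(7):
    print(f"k={k}: empirical {np.mean(N == k):.4f} "
          f"vs Poisson(lam*t) {exp(-mean) * mean ** k / factorial(k):.4f}")
```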
