Poisson Process

1. Counting process

We say that $\{N(t), t \ge 0\}$ is a counting process if $N(t)$ represents the total number of "events" that occur by time $t$ and satisfies the following:

1. $N(t) \ge 0$
2. $N(t)$ is integer valued
3. $N(s) \le N(t)$ if $s \le t$
4. For $s < t$, $N(t) - N(s)$ equals the number of events that occur in the interval $(s, t]$.

A counting process is said to possess independent increments if the numbers of events that occur in disjoint time intervals are independent.

2. Poisson process

1st definition

The counting process $\{N(t), t \ge 0\}$ is said to be a Poisson process having rate $\lambda$, $\lambda > 0$, if

1. $N(0) = 0$

2. The process has independent increments

3. The number of events in any interval of length $t$ is Poisson distributed with mean $\lambda t$. That is, for any $s, t \ge 0$,

$$P\{N(s+t) - N(s) = n\} = e^{-\lambda t}\frac{(\lambda t)^n}{n!}, \qquad n = 0, 1, \ldots$$
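Condition 3 can be checked numerically: the pmf below sums to 1 and has mean $\lambda t$. This is a minimal sketch; the particular values of $\lambda$ and $t$ are arbitrary illustrative choices.

```python
import math

def poisson_pmf(n, lam, t):
    """P{N(s+t) - N(s) = n} for a Poisson process of rate lam."""
    mu = lam * t
    return (mu ** n) * math.exp(-mu) / math.factorial(n)

lam, t = 2.0, 3.0  # illustrative rate and interval length
probs = [poisson_pmf(n, lam, t) for n in range(100)]
total = sum(probs)                              # should be (essentially) 1
mean = sum(n * p for n, p in enumerate(probs))  # should be lam * t = 6
print(total, mean)
```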

2nd definition

The counting process $\{N(t), t \ge 0\}$ is said to be a Poisson process having rate $\lambda$, $\lambda > 0$, if

1. $N(0) = 0$
2. The process has stationary and independent increments
3. $P\{N(h) = 1\} = \lambda h + o(h)$
4. $P\{N(h) \ge 2\} = o(h)$
Proof of the equivalence of the two definitions

First we prove that the 2nd definition implies the 1st. To start, fix $u \ge 0$ and let

$$g(t) = E\left[\exp\{-u N(t)\}\right]$$

We derive a differential equation for $g(t)$ as follows:

$$\begin{aligned} g(t+h) &= E[\exp\{-u N(t+h)\}] \\ &= E[\exp\{-u(N(t+h) - N(t))\}\exp\{-u N(t)\}] \\ &= g(t)\,E[\exp\{-u(N(t+h) - N(t))\}] \qquad \text{(independent increments)} \\ &= g(t)\,E[\exp\{-u N(h)\}] \qquad \text{(stationary increments)} \end{aligned}$$

By the 3rd and 4th conditions of the 2nd definition,

$$E[\exp\{-u N(h)\}] = P\{N(h) = 0\} + e^{-u}P\{N(h) = 1\} + o(h) = 1 - \lambda h + e^{-u}\lambda h + o(h)$$

so that

$$g(t+h) = g(t)\left(1 - \lambda h + e^{-u}\lambda h + o(h)\right)$$

which implies that

$$\frac{g(t+h) - g(t)}{h} = \lambda(e^{-u} - 1)g(t) + \frac{o(h)}{h}$$

Letting $h \to 0$ gives

$$g'(t) = \lambda(e^{-u} - 1)g(t)$$

Solving this differential equation, together with the boundary condition $g(0) = E[e^{-u N(0)}] = 1$, we get

$$g(t) = \exp\{\lambda t(e^{-u} - 1)\}$$

That is, the Laplace transform of $N(t)$ evaluated at $u$ is $e^{\lambda t(e^{-u} - 1)}$. Since this is also the Laplace transform of a Poisson random variable with mean $\lambda t$, the result follows from the fact that the distribution of a nonnegative random variable is uniquely determined by its Laplace transform.

3. Interarrival and Waiting Time Distributions

Denote the time of the first event by $T_1$, and for $n > 1$ let $T_n$ denote the elapsed time between the $(n-1)$st event and the $n$th event.

Now we determine the distribution of the $T_n$, starting with $T_1$. Note that

$$P(T_1 \le t) = 1 - P(T_1 > t)$$

The event $\{T_1 > t\}$ takes place if and only if no events of the Poisson process occur in the interval $[0, t]$, and thus

$$P(T_1 > t) = P\{N(t) = 0\} = e^{-\lambda t}$$

Hence, $T_1$ has an exponential distribution with mean $1/\lambda$. Now,

$$P(T_2 > t) = E\left[P(T_2 > t \mid T_1)\right]$$

However,

$$\begin{aligned} P(T_2 > t \mid T_1 = s) &= P\{N(s+t) - N(s) = 0 \mid T_1 = s\} \\ &= P\{N(s+t) - N(s) = 0\} \\ &= e^{-\lambda t} \end{aligned}$$

where the last two equalities follow from independent and stationary increments, respectively. Therefore, we conclude that $T_2$ is also an exponential random variable with mean $1/\lambda$ and, furthermore, that $T_2$ is independent of $T_1$. Repeating the same argument yields the following.

Proposition 3.1 $T_n$, $n = 1, 2, \ldots$, are independent identically distributed exponential random variables having mean $1/\lambda$.

The arrival time of the $n$th event,

$$S_n = \sum_{i=1}^{n} T_i, \qquad n \ge 1$$

is also called the waiting time until the $n$th event. $S_n$ has a gamma distribution with parameters $n$ and $\lambda$; its probability density is given by

$$f_{S_n}(t) = \lambda e^{-\lambda t}\frac{(\lambda t)^{n-1}}{(n-1)!}, \qquad t \ge 0$$
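A crude numerical check of the gamma density: integrating it over $[0, \infty)$ gives 1, and its mean is $n/\lambda$ (the sum of $n$ exponential means). The parameters and integration grid below are arbitrary choices.

```python
import math

def gamma_density(t, n, lam):
    """Density of S_n, the waiting time until the n-th event."""
    return lam * math.exp(-lam * t) * (lam * t) ** (n - 1) / math.factorial(n - 1)

n, lam = 3, 2.0      # illustrative parameters
dt, T = 1e-3, 40.0   # crude Riemann grid; density is negligible beyond T
ts = [i * dt for i in range(1, int(T / dt))]
total = sum(gamma_density(t, n, lam) * dt for t in ts)      # should be about 1
mean = sum(t * gamma_density(t, n, lam) * dt for t in ts)   # should be about n/lam = 1.5
print(total, mean)
```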

4. Further Properties of Poisson Process

Suppose each event occurring in a Poisson process is classified as either type I or type II. Suppose further that each event is classified, independently of all other events, as a type I event with probability $p$ and as a type II event with probability $1 - p$.

Let $N_1(t)$ and $N_2(t)$ denote, respectively, the numbers of type I and type II events occurring in $[0, t]$.

Proposition 4.1 $\{N_1(t), t \ge 0\}$ and $\{N_2(t), t \ge 0\}$ are both Poisson processes having respective rates $\lambda p$ and $\lambda(1 - p)$. Furthermore, the two processes are independent.

Now we verify that $N_1(t)$ is a Poisson process:

- $$\begin{aligned} P\{N_1(h) = 1\} &= P\{N_1(h) = 1 \mid N(h) = 1\}P\{N(h) = 1\} \\ &\quad + P\{N_1(h) = 1 \mid N(h) \ge 2\}P\{N(h) \ge 2\} \\ &= p\lambda h + o(h) \end{aligned}$$

- $P\{N_1(h) \ge 2\} \le P\{N(h) \ge 2\} = o(h)$

Thus $\{N_1(t), t \ge 0\}$ satisfies the 2nd definition of a Poisson process with rate $\lambda p$; the remaining conditions ($N_1(0) = 0$ and stationary, independent increments) are inherited from $\{N(t)\}$ together with the independence of the classifications.


The Coupon Collecting Problem

Problem There are $m$ different types of coupons. Each time a person collects a coupon it is, independently of those previously obtained, a type $j$ coupon with probability $p_j$, $\sum_{j=1}^{m} p_j = 1$. Let $N$ denote the number of coupons one needs to collect in order to have a complete collection of at least one of each type. Find $E[N]$.

Solution Let $N_j$ be the number of coupons one must collect to obtain a type $j$ coupon; then we can express $N$ as

$$N = \max_{1 \le j \le m} N_j$$

Now suppose that coupons are collected at the event times of a Poisson process with rate 1, and say that an event is of type $j$ if the coupon obtained at that time is of type $j$. By Proposition 4.1 (extended to $m$ types), the type $j$ events form independent Poisson processes with rates $p_j$, so the waiting time $X_j$ until the first type $j$ coupon is exponentially distributed with rate $p_j$, independently across $j$. Letting $X = \max_{1 \le j \le m} X_j$ denote the time at which a complete collection is amassed, independence gives

$$P(X < t) = \prod_{j=1}^{m}\left(1 - e^{-p_j t}\right)$$

Therefore,

$$E[X] = \int_0^{\infty} P(X > t)\,dt = \int_0^{\infty} \left\{1 - \prod_{j=1}^{m}\left(1 - e^{-p_j t}\right)\right\} dt$$

Let $T_i$ denote the $i$th interarrival time of the rate-1 Poisson process that counts the number of coupons obtained. Then

$$X = \sum_{i=1}^{N} T_i \implies E[X \mid N] = N\,E[T_i] = N$$

since $E[T_i] = 1$. Therefore

$$E[X] = E[N]$$
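For equal probabilities $p_j = 1/m$, the classical answer is $E[N] = m\sum_{j=1}^{m} 1/j$, and the integral formula reproduces it. A rough numeric check, with an illustrative choice of $m$:

```python
import math

m = 4
p = [1.0 / m] * m  # equal coupon probabilities (illustrative case)

def tail(t):
    """P(X > t) = 1 - prod_j (1 - e^{-p_j t})."""
    prod = 1.0
    for pj in p:
        prod *= 1.0 - math.exp(-pj * t)
    return 1.0 - prod

dt, T = 1e-3, 200.0
e_x = sum(tail(i * dt) * dt for i in range(int(T / dt)))  # numeric E[X] = E[N]
harmonic = m * sum(1.0 / j for j in range(1, m + 1))      # m * H_m = 25/3
print(e_x, harmonic)
```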


5. Conditional Distribution of the Arrival Times

Suppose we are told that exactly one event of a Poisson process has taken place by time $t$, and we are asked to determine the distribution of the time at which the event occurred. For $s \le t$,

$$\begin{aligned} P\{T_1 < s \mid N(t) = 1\} &= \frac{P\{T_1 < s,\, N(t) = 1\}}{P\{N(t) = 1\}} \\ &= \frac{P\{N(s) = 1,\, N(t) - N(s) = 0\}}{P\{N(t) = 1\}} \\ &= \frac{\lambda s e^{-\lambda s}\, e^{-\lambda(t-s)}}{\lambda t e^{-\lambda t}} \\ &= \frac{s}{t} \end{aligned}$$

Theorem 5.1 Given that $N(t) = n$, the $n$ arrival times $S_1, S_2, \ldots, S_n$ have the same distribution as the order statistics corresponding to $n$ independent random variables uniformly distributed on the interval $(0, t)$.

Definition 5.1 We say that $Y_{(1)}, \ldots, Y_{(n)}$ are the order statistics corresponding to the $n$ random variables $Y_1, \ldots, Y_n$ if $Y_{(k)}$ is the $k$th smallest value among $Y_1, \ldots, Y_n$. If the $Y_i$, $i = 1, \ldots, n$, are independent identically distributed continuous random variables with density function $f$, the joint density of the order statistics $Y_{(1)}, \ldots, Y_{(n)}$ is given by

$$f(y_1, \ldots, y_n) = n! \prod_{i=1}^{n} f(y_i), \qquad y_1 < y_2 < \cdots < y_n$$

Proof of Theorem 5.1 Note that the event $S_1 = s_1, S_2 = s_2, \ldots, S_n = s_n, N(t) = n$ is equivalent to the event that the first $n+1$ interarrival times satisfy $T_1 = s_1, T_2 = s_2 - s_1, \ldots, T_n = s_n - s_{n-1}, T_{n+1} > t - s_n$. By Proposition 3.1, the conditional joint density of $S_1, \ldots, S_n$ given that $N(t) = n$ is as follows:

$$\begin{aligned} f(s_1, \ldots, s_n \mid n) &= \frac{f(s_1, \ldots, s_n, n)}{P\{N(t) = n\}} \\ &= \frac{\lambda e^{-\lambda s_1}\,\lambda e^{-\lambda(s_2 - s_1)} \cdots \lambda e^{-\lambda(s_n - s_{n-1})}\, e^{-\lambda(t - s_n)}}{e^{-\lambda t}(\lambda t)^n / n!} \\ &= \frac{n!}{t^n}, \qquad 0 < s_1 < \cdots < s_n < t \end{aligned}$$
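A quick seeded simulation illustrates Theorem 5.1: conditional on $N(t) = n$, the first arrival $S_1$ should behave like the minimum of $n$ uniforms on $(0, t)$, whose mean is $t/(n+1)$. The parameter choices ($\lambda = 1$, $t = 5$, conditioning on $n = 5$) are arbitrary.

```python
import random

random.seed(0)
lam, t, n_target = 1.0, 5.0, 5
first_arrivals = []
for _ in range(200_000):
    # Build one Poisson sample path on [0, t] from exponential interarrivals
    arrivals, s = [], random.expovariate(lam)
    while s <= t:
        arrivals.append(s)
        s += random.expovariate(lam)
    if len(arrivals) == n_target:  # condition on N(t) = 5
        first_arrivals.append(arrivals[0])

mean_s1 = sum(first_arrivals) / len(first_arrivals)
print(mean_s1)  # should be close to t / (n_target + 1) = 5/6
```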

Proposition 5.1 Suppose that each event of a Poisson process with rate $\lambda$ is classified as being one of $k$ types. An event occurring at time $s$ is, independently of anything that has previously occurred, a type $i$ event with probability $P_i(s)$, $i = 1, \ldots, k$, where $\sum_{i=1}^{k} P_i(s) = 1$. If $N_i(t)$, $i = 1, \ldots, k$, represents the number of type $i$ events occurring by time $t$, then the $N_i(t)$, $i = 1, \ldots, k$, are independent Poisson random variables having means

$$E[N_i(t)] = \lambda \int_0^t P_i(s)\,ds$$

Proof Consider an arbitrary event that occurred in the interval $[0, t]$. If it occurred at time $s$, then it would be a type $i$ event with probability $P_i(s)$. By Theorem 5.1, the time of this event is uniformly distributed on $(0, t)$, so the probability that it is a type $i$ event is

$$P_i = \frac{1}{t}\int_0^t P_i(s)\,ds$$

independently of the other events. Hence

$$P\left\{N_i(t) = n_i,\, i = 1, \ldots, k \,\Big|\, N(t) = \sum_{i=1}^{k} n_i\right\} = \frac{\left(\sum_{i=1}^{k} n_i\right)!}{n_1! \cdots n_k!}\, P_1^{n_1} \cdots P_k^{n_k}$$

Consequently,

$$\begin{aligned} P\{N_i(t) = n_i,\, i = 1, \ldots, k\} &= \frac{\left(\sum_i n_i\right)!}{n_1! \cdots n_k!}\, P_1^{n_1} \cdots P_k^{n_k}\; e^{-\lambda t}\frac{(\lambda t)^{\sum_i n_i}}{\left(\sum_i n_i\right)!} \\ &= \prod_{i=1}^{k} e^{-\lambda t P_i}\frac{(\lambda t P_i)^{n_i}}{n_i!} \end{aligned}$$

where the last equality uses $\sum_{i=1}^{k} P_i = 1$.

This factorization shows that the $N_i(t)$, $i = 1, \ldots, k$, are independent, and that each $N_i(t)$ has probability function

$$P\{N_i(t) = n_i\} = e^{-\lambda t P_i}\frac{(\lambda t P_i)^{n_i}}{n_i!}$$

that is, $N_i(t)$ is Poisson distributed with mean $\lambda t P_i$. Therefore

$$E[N_i(t)] = \lambda t P_i = \lambda \int_0^t P_i(s)\,ds$$

and the proof is complete.

Proposition 5.2 Given that $S_n = t$, the set $S_1, \ldots, S_{n-1}$ has the distribution of a set of $n-1$ independent uniform $(0, t)$ random variables.

6. Generalizations of the Poisson Process

6.1 Nonhomogeneous Poisson Process

Definition 6.1 The counting process $\{N(t), t \ge 0\}$ is said to be a nonhomogeneous Poisson process with intensity function $\lambda(t)$, $t \ge 0$, if

1. $N(0) = 0$
2. $\{N(t), t \ge 0\}$ has independent increments
3. $P\{N(t+h) - N(t) \ge 2\} = o(h)$
4. $P\{N(t+h) - N(t) = 1\} = \lambda(t)h + o(h)$

Proposition 6.1 Let $\{N(t), t \ge 0\}$ and $\{M(t), t \ge 0\}$ be independent nonhomogeneous Poisson processes with respective intensity functions $\lambda(t)$ and $\mu(t)$, and let $N^*(t) = N(t) + M(t)$. Then the following are true.

1. $\{N^*(t), t \ge 0\}$ is a nonhomogeneous Poisson process with intensity function $\lambda(t) + \mu(t)$.
2. Given that an event of the $\{N^*(t)\}$ process occurs at time $t$, then, independently of what occurred prior to $t$, the event at $t$ came from the $\{N(t)\}$ process with probability $\frac{\lambda(t)}{\lambda(t) + \mu(t)}$.

6.2 Compound Poisson Process

Definition 6.2 A stochastic process $\{X(t), t \ge 0\}$ is said to be a compound Poisson process if it can be expressed as

$$X(t) = \sum_{i=1}^{N(t)} Y_i, \qquad t \ge 0$$

where $\{N(t), t \ge 0\}$ is a Poisson process, and $\{Y_i, i \ge 1\}$ is a family of independent and identically distributed random variables that is also independent of $\{N(t), t \ge 0\}$.

We have

$$\begin{aligned} E[X(t)] &= E\left[\sum_{i=1}^{N(t)} Y_i\right] \\ &= \sum_{n=1}^{\infty} E\left[\sum_{i=1}^{n} Y_i \,\Big|\, N(t) = n\right] P\{N(t) = n\} \\ &= \sum_{n=1}^{\infty} E\left[\sum_{i=1}^{n} Y_i\right] P\{N(t) = n\} \\ &= \sum_{n=1}^{\infty} n E[Y]\, P\{N(t) = n\} \\ &= E[Y] \sum_{n=1}^{\infty} n\, P\{N(t) = n\} \\ &= \lambda t\, E[Y] \end{aligned}$$

Similarly,

$$\operatorname{Var}(X(t)) = \lambda t\, E[Y^2]$$
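A seeded simulation checks both moment formulas. The choices $Y_i \sim \mathrm{Uniform}(0, 1)$ (so $E[Y] = 1/2$, $E[Y^2] = 1/3$) and the values of $\lambda$, $t$ are arbitrary illustrative assumptions.

```python
import random

random.seed(7)
lam, t = 2.0, 3.0
xs = []
for _ in range(100_000):
    # Count Poisson events in [0, t] via exponential interarrival times
    n, s = 0, random.expovariate(lam)
    while s <= t:
        n += 1
        s += random.expovariate(lam)
    xs.append(sum(random.random() for _ in range(n)))  # X(t) = sum of N(t) uniforms

mean = sum(xs) / len(xs)                           # about lam * t * E[Y]  = 3.0
var = sum((x - mean) ** 2 for x in xs) / len(xs)   # about lam * t * E[Y^2] = 2.0
print(mean, var)
```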

6.3 Conditional or Mixed Poisson Processes

Let $\{N(t), t \ge 0\}$ be a counting process whose probabilities are defined as follows. There is a positive random variable $L$ such that, conditional on $L = \lambda$, the counting process is a Poisson process with rate $\lambda$. Such a counting process is called a conditional or a mixed Poisson process.

Suppose that $L$ is continuous with density function $g$. Then

$$\begin{aligned} P\{N(t+s) - N(s) = n\} &= \int_0^{\infty} P\{N(t+s) - N(s) = n \mid L = \lambda\}\, g(\lambda)\, d\lambda \\ &= \int_0^{\infty} e^{-\lambda t}\frac{(\lambda t)^n}{n!}\, g(\lambda)\, d\lambda \end{aligned}$$

From this we see that a conditional Poisson process has stationary increments. However, knowing how many events occur in an interval gives information about the likely value of $L$, which in turn affects the distribution of the number of events in any other interval; it follows that a conditional Poisson process does not generally have independent increments. Consequently, a conditional Poisson process is not generally a Poisson process.
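As a concrete illustration (the exponential mixing density is my own choice, not from the text): if $L$ is exponential with rate $\theta$, the mixture integral has the closed form $P\{N(t) = n\} = \theta t^n / (t + \theta)^{n+1}$, which a crude numeric integration confirms.

```python
import math

t, theta, n = 2.0, 1.5, 3  # arbitrary illustrative values

def integrand(lmbda):
    """Poisson(lmbda*t) pmf at n, weighted by an Exp(theta) density for L."""
    return (math.exp(-lmbda * t) * (lmbda * t) ** n / math.factorial(n)
            * theta * math.exp(-theta * lmbda))

dl, l_max = 1e-4, 30.0  # Riemann grid; integrand is negligible beyond l_max
numeric = sum(integrand(i * dl) * dl for i in range(1, int(l_max / dl)))
closed = theta * t ** n / (t + theta) ** (n + 1)
print(numeric, closed)
```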

Posted by kaleidopink