Stochastic Process 2 : HSTS417
ASSIGNMENT 3
SURNAME NAME(S) REGISTRATION NUMBER PROGRAMME
MABURUSE WHITNEY R196179V STSHSC
MHELE MIGUEL R195049P HMTHEC
MUSAGWIZA TRESHAM R196191U STSHSC
KUMBULA PANASHE R195201H HMTHEC
GWENGWE PROSPER TAKURA R195090U HMTHEC
NCUBE NKOSINHLE M R195065L HMTHEC
NIKISI MUTSAWASHE R181705K HASC
MAINDIDZE DOMINION R195150E HMTHEC
MASAMVU CLIBERT PANASHE R195019N HASC
GAMBIZA CHARLES R195018V HMTHEC
NDLOVU PARDINGTON R181679C HASC
KEMBO MUNASHE R1811764 HASC
1. Definition of terms
• (a). Poisson Process:
The counting process {N(t), t ≥ 0} is a Poisson process if:
1. N (0) = 0,
2. It has independent and stationary increments,
3. The number of events in any interval of length t is Poisson distributed with
mean λt:
P(N(t + s) − N(s) = n) = e^{−λt}(λt)^n / n!,  n = 0, 1, 2, . . .
• (b). Homogeneous Poisson Process:
A Poisson process is homogeneous if the number of points counted in an interval
(t, t + s) is given by a Poisson distribution with parameter λs.
• (c). Non-Homogeneous Poisson Process:
A Poisson process is non-homogeneous if:
P(N(t + s) − N(s) = n) = e^{−R(s,t)}(R(s,t))^n / n!,  n = 0, 1, 2, . . .
where R(s, t) = ∫_s^{s+t} λ(y) dy.
• (d). Compound Poisson Process:
A stochastic process {X(t), t ≥ 0} is a compound Poisson process if:
X(t) = Σ_{i=1}^{N(t)} Y_i
where N(t) is a Poisson process with rate λ, and the Y_i are i.i.d. random variables
independent of N(t).
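The definitions above can be checked empirically. Below is a minimal Python sketch (not part of the assignment) that simulates a compound Poisson process by accumulating exponential inter-arrival gaps; the rate, horizon, and jump distribution are illustrative choices, not taken from the document.

```python
import random

def compound_poisson(lam, t, jump_sampler, rng):
    """Simulate X(t) = Y_1 + ... + Y_{N(t)}, where N(t) counts the arrivals
    of a rate-lam Poisson process on [0, t] and jump_sampler() draws one Y_i."""
    total, clock = 0.0, 0.0
    while True:
        clock += rng.expovariate(lam)  # exponential inter-arrival gap
        if clock > t:
            return total
        total += jump_sampler()

# With unit jumps Y_i = 1, X(t) reduces to the plain Poisson count N(t),
# so the sample mean should be close to lam * t.
rng = random.Random(1)
counts = [compound_poisson(2.0, 6.0, lambda: 1, rng) for _ in range(2000)]
print(sum(counts) / len(counts))  # should be close to 2 * 6 = 12
```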
2. Queuing Problem
Problem: Customers are served by three servers, with service times exponentially distributed with rate µi, i = 1, 2, 3. When a server is free, the longest-waiting customer begins service.
(a) Expected Departure Time When All Servers Are Busy and No One Is Waiting
When you arrive, all three servers are busy with customers currently being served. You
must wait until the first server becomes free, then receive service.
Since service times are exponential with rates µ1 , µ2 , and µ3 , the remaining service
times at each server are also exponentially distributed with the same rates due to the
memoryless property.
The time until the first server becomes free is the minimum of three independent
exponential random variables:
Tmin = min(X1, X2, X3)  (1)
where Xi ∼ Exp(µi ).
The minimum of independent exponential random variables is also exponential with
rate equal to the sum of the individual rates:
Tmin ∼ Exp(µ1 + µ2 + µ3 ) (2)
Therefore, the expected waiting time until a server becomes free is:
E[Twait] = 1/(µ1 + µ2 + µ3)  (3)
Once a server becomes free, you begin service at that server. In a race of independent exponentials, server i is the first to become free with probability
P(server i frees first) = µi / (µ1 + µ2 + µ3)  (4)
and, given that server i serves you, your expected service time is 1/µi. Conditioning on which server frees first:
E[Tservice] = Σ_{i=1}^{3} [µi / (µ1 + µ2 + µ3)] · (1/µi) = 3/(µ1 + µ2 + µ3)  (5)
Thus, the total expected departure time is:
E[Ttotal] = E[Twait] + E[Tservice]  (6)
= 1/(µ1 + µ2 + µ3) + 3/(µ1 + µ2 + µ3) = 4/(µ1 + µ2 + µ3)  (7)
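This result is easy to sanity-check by simulation. The sketch below (illustrative only; the rates µ = (1, 2, 3) are arbitrary choices, not from the problem) draws the three memoryless residual service times, waits for the first to finish, and then serves you at that server; the sample mean should approach 4/(µ1 + µ2 + µ3).

```python
import random

def time_in_system_no_queue(mu, rng):
    """All three servers busy, nobody waiting: wait for the first of three
    exponential clocks to ring, then be served by that same server."""
    remaining = [rng.expovariate(m) for m in mu]   # memoryless residual services
    i = min(range(3), key=lambda j: remaining[j])  # server that frees up first
    return remaining[i] + rng.expovariate(mu[i])   # your wait + your service

rng = random.Random(7)
mu = [1.0, 2.0, 3.0]
n = 100_000
avg = sum(time_in_system_no_queue(mu, rng) for _ in range(n)) / n
print(avg, 4 / sum(mu))  # simulated mean vs. 4 / (mu1 + mu2 + mu3)
```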
(b) Expected Departure Time When All Servers Are Busy and One Person Is Waiting
When you arrive, all servers are busy and one person is already waiting ahead of you.
This means you are second in the queue.
You must wait for:
1. the first server to become free, at which point the customer ahead of you begins service;
2. the next server to become free, at which point you begin service.
By the memoryless property, whenever all three servers are busy the time until the next service completion is exponential with rate µ1 + µ2 + µ3, so each of these two waits has expected length 1/(µ1 + µ2 + µ3). Note that once the customer ahead of you starts service, all three servers are busy again, so your remaining wait is again a minimum of three exponentials; the full service time of the customer ahead does not enter your waiting time.
Your own expected service time, exactly as in part (a), is 3/(µ1 + µ2 + µ3).
The expected total time for you is therefore:
E[Ttotal] = E[wait for first free server] + E[wait for next free server] + E[your service]  (8)
= 1/(µ1 + µ2 + µ3) + 1/(µ1 + µ2 + µ3) + 3/(µ1 + µ2 + µ3)  (9)
= 5/(µ1 + µ2 + µ3)  (10)
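The same simulation idea extends to the case with one customer ahead of you (again a sketch with illustrative rates): the first server to free up takes the waiting customer, and you take the next one.

```python
import random

def time_in_system_one_ahead(mu, rng):
    """All three servers busy and one customer already waiting ahead of you."""
    free = [rng.expovariate(m) for m in mu]   # times at which each server frees
    i = min(range(3), key=lambda j: free[j])
    free[i] += rng.expovariate(mu[i])         # customer ahead occupies server i
    k = min(range(3), key=lambda j: free[j])  # next server to free is yours
    return free[k] + rng.expovariate(mu[k])   # your start time + your service

rng = random.Random(11)
mu = [1.0, 2.0, 3.0]
n = 100_000
avg = sum(time_in_system_one_ahead(mu, rng) for _ in range(n)) / n
print(avg, 5 / sum(mu))  # simulated mean vs. 5 / (mu1 + mu2 + mu3)
```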
3. Queueing Theory
(a) Probability Server 3 is Busy When Moving to Server 2
When you arrive, server 3 is busy. The time to complete service at server 1 (expo-
nential rate µ1 ) determines whether server 3 remains busy. Due to the memoryless
property of exponential distributions, the probability that server 3’s remaining service
time exceeds your service time at server 1 is:
P(server 3 busy) = µ1 / (µ1 + µ3)  (11)
This results from comparing two exponential variables.
(b) Probability Server 3 is Busy When Moving to Server 3
Here, you must pass through servers 1 and 2 before reaching server 3. The total time
to complete both services is the sum of two independent exponentials (µ1 and µ2 ). The
probability server 3 remains busy is the product of the probabilities it outlasts each server
individually:
P(server 3 busy) = [µ1 / (µ1 + µ3)] × [µ2 / (µ2 + µ3)]  (12)
This leverages independence from the memoryless property.
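Both probabilities follow from exponential races, which the following Monte Carlo sketch checks (the rates are illustrative choices): the event counted is that server 3's residual service outlasts the sum of your services at servers 1 and 2.

```python
import random

rng = random.Random(3)
mu1, mu2, mu3 = 1.0, 2.0, 3.0
n = 100_000
hits = 0
for _ in range(n):
    residual3 = rng.expovariate(mu3)  # memoryless remaining service at server 3
    if residual3 > rng.expovariate(mu1) + rng.expovariate(mu2):
        hits += 1  # server 3 is still busy when you arrive there
theory = (mu1 / (mu1 + mu3)) * (mu2 / (mu2 + mu3))
print(hits / n, theory)  # empirical frequency vs. the product formula
```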
(c) Expected Total Time in the System
The expected time includes service times at all servers. Despite potential waiting at
server 3, the memoryless property ensures the expected residual service time at any
busy server equals its mean service time. Thus:
E[T] = 1/µ1 + 1/µ2 + 1/µ3  (13)
This aggregates the mean service times directly.
(d) Expected Time When Entering with a Customer at Server 2
If a customer is at server 2 when you enter, you must wait for them to finish (remaining
time: 1/µ2 ), then undergo your own service at servers 2 and 3. The total expected time
is:
E[T] = 1/µ1 + 1/µ2 + 1/µ2 + 1/µ3 = 1/µ1 + 2/µ2 + 1/µ3  (14)
This analysis leverages the memoryless property and superposition of exponential
rates, aligning with queuing theory principles.
Question 4
Suppose each event of a Poisson process with rate λ is independently classified as type 1 with probability p and type 2 with probability 1 − p, and let N1(t) and N2(t), t ≥ 0, count the events of each type. Show that the two counting processes are Poisson with rates λp and λ(1 − p) respectively, and that they are independent.
Solution
For each t ≥ 0, N1(t) ∼ Poisson(λpt) and N2(t) ∼ Poisson(λ(1 − p)t).
Two random variables are independent if their joint probability mass function is equal
to the product of their marginal probability mass functions.
Let N(t) = N1(t) + N2(t) ∼ Poisson(λt). Conditioning on the total count,
P(N1(t) = n, N2(t) = m) = P(N1(t) = n, N2(t) = m | N(t) = n + m) · P(N(t) = n + m).
Given N(t) = n + m, each of the n + m events is independently of type 1 with probability p, so N1(t) | N(t) = n + m ∼ Binomial(n + m, p):
P(N1(t) = n, N2(t) = m | N(t) = n + m) = C(n + m, n) p^n (1 − p)^m
P(N(t) = n + m) = e^{−λt}(λt)^{n+m} / (n + m)!
Thus:
P(N1(t) = n, N2(t) = m) = C(n + m, n) p^n (1 − p)^m · e^{−λt}(λt)^{n+m} / (n + m)!
= [(n + m)! / (n! m!)] p^n (1 − p)^m · e^{−λt}(λt)^{n+m} / (n + m)!
= [e^{−λpt}(λpt)^n / n!] · [e^{−λ(1−p)t}(λ(1−p)t)^m / m!]
= P(N1(t) = n) · P(N2(t) = m),
using e^{−λt} = e^{−λpt} e^{−λ(1−p)t} and (λt)^{n+m} p^n (1 − p)^m = (λpt)^n (λ(1−p)t)^m.
Therefore the joint pmf factorizes into the product of the marginals for every t, so N1(t) and N2(t) are independent, and hence the two Poisson processes are independent.
Marginal Probability Mass Functions
P(N1(t) = n) = e^{−λpt}(λpt)^n / n!
P(N2(t) = m) = e^{−λ(1−p)t}(λ(1−p)t)^m / m!
Since P(N1(t) = n, N2(t) = m) = P(N1(t) = n) · P(N2(t) = m), the two Poisson processes are independent.
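A quick empirical check of the splitting result (parameters are illustrative): thin a single rate-λ stream by independent coin flips, then verify the marginal means λpt and λ(1 − p)t and a near-zero sample covariance, a necessary consequence of independence.

```python
import random

def split_poisson(lam, p, t, rng):
    """Thin a rate-lam Poisson process on [0, t]: each event is type 1
    with probability p, type 2 otherwise. Returns (N1(t), N2(t))."""
    n1 = n2 = 0
    clock = rng.expovariate(lam)
    while clock <= t:
        if rng.random() < p:
            n1 += 1
        else:
            n2 += 1
        clock += rng.expovariate(lam)
    return n1, n2

rng = random.Random(5)
lam, p, t, n = 3.0, 0.4, 2.0, 40_000
pairs = [split_poisson(lam, p, t, rng) for _ in range(n)]
m1 = sum(a for a, _ in pairs) / n
m2 = sum(b for _, b in pairs) / n
cov = sum(a * b for a, b in pairs) / n - m1 * m2
print(m1, m2, cov)  # near lam*p*t = 2.4, lam*(1-p)*t = 3.6, and 0
```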
Question 5
Families migrate at a Poisson rate λ = 2 per week. Family sizes are i.i.d. with probabilities:
P(X = 1) = 1/4, P(X = 2) = 1/4, P(X = 3) = 1/3, P(X = 4) = 1/12, P(X = 5) = 1/12.
• (a). Expected Number of Individuals Migrating in 6 Weeks:
E[X(t)] = λt · E(X)
where
E(X) = Σ_{x=1}^{5} x · P(X = x) = 2.5,  λt = 2 · 6 = 12
E[X(t)] = 12 · 2.5 = 30
• (b). Variance of Number of Individuals Migrating in 6 Weeks:
Var(X(t)) = λt · E(X²)
where
E(X²) = Σ_{x=1}^{5} x² · P(X = x) = 23/3,  λt = 12
Var(X(t)) = 12 · (23/3) = 92
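The two computations can be verified with exact rational arithmetic; this small sketch simply re-evaluates the sums in the solution.

```python
from fractions import Fraction as F

pmf = {1: F(1, 4), 2: F(1, 4), 3: F(1, 3), 4: F(1, 12), 5: F(1, 12)}
assert sum(pmf.values()) == 1  # the family-size probabilities sum to one

lam_t = 2 * 6                                  # lambda * t = 2 per week * 6 weeks
EX = sum(x * p for x, p in pmf.items())        # E(X)   = 5/2
EX2 = sum(x * x * p for x, p in pmf.items())   # E(X^2) = 23/3
print(lam_t * EX, lam_t * EX2)  # mean 30, variance 92
```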