Part2 PDF
Definition 1 A process obeying the following three postulates is called a
birth-and-death process:
(3) At any time t, P\{E_j \to E_{j\pm k} \text{ during } (t, t+h) \mid E_j \text{ at } t\} = o(h) as h \to 0 for k \ge 2 (j = 0, 1, \ldots).
Figure 2.1: The Birth and Death Process. [Transition diagram: states \ldots, E_{i-1}, E_i, E_{i+1}, \ldots, with birth rates \lambda_{i-2}, \lambda_{i-1}, \lambda_i, \lambda_{i+1} to the right and death rates \mu_{i-1}, \mu_i, \mu_{i+1}, \mu_{i+2} to the left.]
Notation: Let P_j(t) = P\{N(t) = j\} and let \lambda_{-1} = \mu_0 = P_{-1}(t) = 0.
• Then it follows from the above postulates that (where h \to 0; j = 0, 1, \ldots)
• Letting h \to 0, we get the differential-difference equations
\frac{d}{dt}P_j(t) = \lambda_{j-1}P_{j-1}(t) + \mu_{j+1}P_{j+1}(t) - (\lambda_j + \mu_j)P_j(t).   (3.1)
Since at time t = 0 the system is in state E_i, the initial conditions are
P_j(0) = \delta_{ij}, \quad \text{where } \delta_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j. \end{cases}
Definition 2 The coefficients {λj } and {µj } are called the birth
and death rates respectively.
• Here, for j = 0,
P_0'(t) = -\lambda P_0(t),
hence
P_0(t) = a_0 e^{-\lambda t}.
From the initial conditions, we get a0 = 1.
• Inductively, we can prove that if
P_{j-1}(t) = \frac{(\lambda t)^{j-1}}{(j-1)!}e^{-\lambda t},
then the equation
P_j'(t) = \lambda\frac{(\lambda t)^{j-1}}{(j-1)!}e^{-\lambda t} - \lambda P_j(t)
gives the solution
P_j(t) = \frac{(\lambda t)^j}{j!}e^{-\lambda t}.   (3.2)
Remark 1 Probabilities (3.2) satisfy the normalization condition
\sum_{j=0}^{\infty} P_j(t) = 1 \qquad (t \ge 0).
Remark 2 For each t, N(t) has the Poisson distribution with mean \lambda t, given by \{P_j(t)\}. We say
that N(t) describes a Poisson process.
Remark 3 Since the assumption λj = λ is often a realistic one, the simple formula
(3.2) plays a central role in queueing theory.
3.1.1 Generating Function Approach
Here we demonstrate the generating function approach for solving the pure birth problem.
• The idea is that if we can find g(z,t) = \sum_{n=0}^{\infty} P_n(t)z^n and obtain its coefficients when it is expressed as a power series in z, then one can solve for P_n(t).
From the differential-difference equations, we have
\sum_{n=0}^{\infty}\frac{dP_n(t)}{dt}z^n = z\lambda\sum_{n=0}^{\infty}P_n(t)z^n - \lambda\sum_{n=0}^{\infty}P_n(t)z^n.
Assuming one can interchange the summation and the differentiation, we have
\frac{dg(z,t)}{dt} = \lambda(z-1)g(z,t)
when z is regarded as a constant.
• Then we have
g(z,t) = Ke^{\lambda(z-1)t}.
Since
g(z,0) = \sum_{n=0}^{\infty}P_n(0)z^n = 1,
we have K = 1. Hence we have
g(z,t) = e^{-\lambda t}\left(1 + \lambda tz + \frac{(\lambda tz)^2}{2!} + \cdots\right) = e^{-\lambda t} + e^{-\lambda t}\lambda t\,z + \frac{e^{-\lambda t}(\lambda t)^2}{2!}z^2 + \cdots.
Then the result follows.
3.2 Pure Death Process
• The equation with j = n-1 is
\frac{d}{dt}P_{n-1}(t) = n\mu P_n(t) - (n-1)\mu P_{n-1}(t) = n\mu e^{-n\mu t} - (n-1)\mu P_{n-1}(t).
• Solving this differential equation, we get
P_{n-1}(t) = n(e^{-\mu t})^{n-1}(1 - e^{-\mu t}).
• Recursively, we get
P_j(t) = \binom{n}{j}(e^{-\mu t})^j(1 - e^{-\mu t})^{n-j} \quad \text{for } j = 0, 1, \ldots, n.   (3.4)
Remark 4 For each t, the probabilities (3.4) comprise a binomial
distribution.
Remark 5 The number of equations in (3.3) is finite. For a pure birth process, the number of equations is infinite.
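The binomial solution (3.4) can be checked numerically. The sketch below uses illustrative values n = 10, µ = 0.5, t = 2 (not from the text) and verifies normalization and the mean n e^{-µt}:

```python
import math

def pure_death_prob(j: int, n: int, mu: float, t: float) -> float:
    """P_j(t) = C(n,j) (e^{-mu t})^j (1 - e^{-mu t})^{n-j}  -- equation (3.4)."""
    s = math.exp(-mu * t)     # survival probability of one individual up to time t
    return math.comb(n, j) * s ** j * (1 - s) ** (n - j)

n, mu, t = 10, 0.5, 2.0       # illustrative parameters
probs = [pure_death_prob(j, n, mu, t) for j in range(n + 1)]
total = sum(probs)                              # finite sum, equals 1
mean = sum(j * p for j, p in enumerate(probs))  # binomial mean: n * e^{-mu*t}
```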
3.3 More on Birth-and-Death Process
• We consider a queueing system with one server and no waiting position, with
P {one customer arriving during (t, t + h)} = λh + o(h)
and
P {service ends in (t, t + h)| server busy at t} = µh + o(h) as h → 0.
• This corresponds to a two-state birth-and-death process with j = 0, 1. The arrival
rates are \lambda_0 = \lambda and \lambda_j = 0 for j \neq 0 (an arrival that occurs when the server is
busy has no effect on the system since the customer leaves immediately); and the
departure rates are \mu_j = 0 when j \neq 1 and \mu_1 = \mu (no customers can complete
service when no customer is in the system).
Figure 3.1. The Two-state Birth-and-Death Process. [Transition diagram: Idle and Busy states, with rate \lambda from Idle to Busy and rate \mu from Busy to Idle.]
The equations for the birth-and-death process are given by
\frac{d}{dt}P_0(t) = -\lambda P_0(t) + \mu P_1(t) \quad \text{and} \quad \frac{d}{dt}P_1(t) = \lambda P_0(t) - \mu P_1(t).   (3.5)
One convenient way of solving this set of simultaneous linear differential equations
(not a standard method!) is as follows:
• Initial conditions are P0(0) + P1(0) = 1; thus P0(t) + P1(t) = 1. Hence we get
\frac{d}{dt}P_0(t) + (\lambda+\mu)P_0(t) = \mu.
The solution (exercise) is given by
P_0(t) = \frac{\mu}{\lambda+\mu} + \left(P_0(0) - \frac{\mu}{\lambda+\mu}\right)e^{-(\lambda+\mu)t}.
Since P_1(t) = 1 - P_0(t),
P_1(t) = \frac{\lambda}{\lambda+\mu} + \left(P_1(0) - \frac{\lambda}{\lambda+\mu}\right)e^{-(\lambda+\mu)t}.   (3.6)
• For the three examples of birth-and-death processes that we have considered, the
system of differential-difference equations is much simplified and can therefore be
solved very easily.
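The transient solution of the two-state system can be verified against the differential equations (3.5) themselves. The sketch below uses illustrative rates λ = 1.5, µ = 2.5 and starts in the Idle state (P_0(0) = 1); it checks the ODE residual with a central finite difference:

```python
import math

def transient(t: float, lam: float, mu: float, p0_init: float):
    """P0(t) from the text's solution of (3.5); P1(t) = 1 - P0(t)."""
    p0 = mu / (lam + mu) + (p0_init - mu / (lam + mu)) * math.exp(-(lam + mu) * t)
    return p0, 1.0 - p0

lam, mu, p0_init = 1.5, 2.5, 1.0   # illustrative rates; start with no customer
t, h = 0.7, 1e-6
p0m, _ = transient(t - h, lam, mu, p0_init)
p0p, _ = transient(t + h, lam, mu, p0_init)
deriv = (p0p - p0m) / (2 * h)      # numerical dP0/dt
p0, p1 = transient(t, lam, mu, p0_init)
residual = deriv - (-lam * p0 + mu * p1)   # should be ~0 by (3.5)
```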
3.4 Statistical Equilibrium (Steady-State Probability Distribution)
Consider the state probabilities of the above example as t \to \infty; from (3.6) we have
P_0 = \lim_{t\to\infty}P_0(t) = \frac{\mu}{\lambda+\mu}, \qquad P_1 = \lim_{t\to\infty}P_1(t) = \frac{\lambda}{\lambda+\mu}.   (3.7)
We note that P0 + P1 = 1 and they are called the steady-state probabilities
of the system.
Remark 7 Both P_0 and P_1 are independent of the initial values P_0(0) and P_1(0).
If at time t = 0,
P_0(0) = \frac{\mu}{\lambda+\mu} = P_0 \quad \text{and} \quad P_1(0) = \frac{\lambda}{\lambda+\mu} = P_1,
then Eq. (3.6) clearly shows that these initial values will persist forever.
• This leads us to the important notion of statistical equilibrium. We
say that a system is in statistical equilibrium (or the state dis-
tribution is stationary) if its state probabilities are constant in time.
• Note that the system still fluctuates from state to state, but there is
no net trend in such fluctuations.
Proposition 2 (a) Let P_j(t) be the state probabilities of a birth-and-death
process. Then the limits
\lim_{t\to\infty}P_j(t) = P_j
exist and are independent of the initial conditions; they satisfy the system of
linear difference equations obtained from the difference-differential equations
in the previous chapter by replacing the derivative on the left by zero.
(b) If all \mu_j > 0 and the series
S = 1 + \frac{\lambda_0}{\mu_1} + \frac{\lambda_0\lambda_1}{\mu_1\mu_2} + \cdots + \frac{\lambda_0\lambda_1\cdots\lambda_{j-1}}{\mu_1\mu_2\cdots\mu_j} + \cdots   (3.8)
converges, then
P_0 = S^{-1} \quad \text{and} \quad P_j = \frac{\lambda_0\lambda_1\cdots\lambda_{j-1}}{\mu_1\mu_2\cdots\mu_j}S^{-1} \quad (j = 1, 2, \ldots).
If the series in Eq. (3.8) diverges, then
P_j = 0 \quad (j = 0, 1, \ldots).
Proof: We shall NOT attempt to prove Part (a) of the above proposition, but
rather we assume the truth of (a) and use it to prove Part (b).
By using part (a) of the proposition, we obtain the following linear difference equa-
tions.
This implies
\lambda_j P_j = \mu_{j+1}P_{j+1}.
By recurrence, we get (if \mu_1, \ldots, \mu_j > 0)
P_j = \frac{\lambda_0\lambda_1\cdots\lambda_{j-1}}{\mu_1\cdots\mu_j}P_0 \quad (j = 1, 2, \ldots).   (3.11)
Finally, by using the normalization condition \sum P_j = 1 we have the result in part (b).
Remark 8 Part (a) of the above proposition suggests the following way to find the statistical
equilibrium distribution
\lim_{t\to\infty}P_j(t) = P_j:
set the derivatives on the left side of the difference-differential equations to zero,
replace P_j(t) by P_j, and then solve the linear difference equations for P_j. In
most cases, this method is much easier and shorter.
Remark 9 If \mu_k = 0 for some k (with \lambda_j > 0 for all j), then, as equation (3.11)
shows,
P_j = 0 \quad \text{for } j = 0, 1, \ldots, k-1.
In particular, for a pure birth process, P_j = 0 for all j.
Example 1 Suppose that for all i we have
\lambda_i = \lambda \quad \text{and} \quad \mu_j = j\mu.
Then
S = 1 + \frac{\lambda}{\mu} + \frac{1}{2!}\left(\frac{\lambda}{\mu}\right)^2 + \frac{1}{3!}\left(\frac{\lambda}{\mu}\right)^3 + \cdots = e^{\lambda/\mu}.
Therefore we have the Poisson distribution
P_j = \frac{1}{j!}\left(\frac{\lambda}{\mu}\right)^j e^{-\lambda/\mu}.
Example 2 Suppose that for all i we have
\lambda_i = \lambda \quad \text{and} \quad \mu_j = \mu
with \lambda < \mu. Then
S = 1 + \frac{\lambda}{\mu} + \left(\frac{\lambda}{\mu}\right)^2 + \left(\frac{\lambda}{\mu}\right)^3 + \cdots = \frac{1}{1 - \lambda/\mu}.
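Proposition 2(b) and both examples can be checked with a small numerical sketch that truncates the series S; the rates λ = 2 and µ = 3 are illustrative:

```python
import math

def birth_death_steady_state(lam, mu, jmax=200):
    """Truncated Proposition 2(b): lam(j) and mu(j) are the birth and
    death rate functions; returns the normalized probabilities P_j."""
    term, terms = 1.0, [1.0]
    for j in range(1, jmax):
        term *= lam(j - 1) / mu(j)
        terms.append(term)
    S = sum(terms)
    return [x / S for x in terms]

lam_, mu_ = 2.0, 3.0   # illustrative rates with lam_ < mu_
# Example 1: lam_j = lam, mu_j = j*mu  ->  Poisson, P_0 = e^{-lam/mu}
p_ex1 = birth_death_steady_state(lambda j: lam_, lambda j: j * mu_)
p0_ex1_expected = math.exp(-lam_ / mu_)
# Example 2: lam_j = lam, mu_j = mu   ->  geometric, P_0 = 1 - lam/mu
p_ex2 = birth_death_steady_state(lambda j: lam_, lambda j: mu_)
p0_ex2_expected = 1.0 - lam_ / mu_
```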
3.5 A Summary of Learning Outcomes
• Able to derive the general solutions for the pure birth process, the pure death
process and the two-state birth-and-death process.
• Able to give the condition for the existence of the steady-state probability of a
birth-and-death process.
3.6 Exercises
1. For the pure death process with death rate \mu_j = j\mu, prove that
p_j(t) = \binom{n}{j}(e^{-\mu t})^j(1 - e^{-\mu t})^{n-j} \quad (j = 0, 1, \ldots, n)
where n is the state of the system at t = 0 and pj (t) is the probability that the
system is in State j at time t.
2. In a birth and death process, if
λi = λ/(i + 1) and µi = µ
show that the equilibrium distribution is Poisson.
3. Consider a birth-death system with the following birth and death coefficients:
\lambda_k = (k+2)\lambda, \quad k = 0, 1, 2, \ldots
\mu_k = k\mu, \quad k = 0, 1, 2, \ldots
All other coefficients are zero.
(a) Solve for pk . Make sure you express your answer explicitly in terms of λ, k
and µ only.
(b) Find the average number of customers in the system.
3.7 Suggested Solutions
1. In the assumed pure death process, the lifetime of each individual
follows the exponential distribution with density \mu e^{-\mu x}. The probability that an individual survives
beyond time t is given by
\int_t^{\infty}\mu e^{-\mu x}\,dx = e^{-\mu t}.
Therefore the probability that one finds j individuals still surviving at time t is given by
\binom{n}{j}(e^{-\mu t})^j(1 - e^{-\mu t})^{n-j} \quad (j = 0, 1, \ldots, n).
2. Let P_j be the steady-state probability. Then
P_j = \frac{\lambda_0\lambda_1\cdots\lambda_{j-1}}{\mu_1\cdots\mu_j}P_0 = \frac{\lambda^j P_0}{j!\mu^j} \quad (j = 1, 2, \ldots)
and P_0 = S^{-1} = (e^{\lambda/\mu})^{-1}. Hence we have a Poisson distribution:
P_j = \frac{(\lambda/\mu)^j e^{-\lambda/\mu}}{j!}.
3. (a) Because \lambda_k = (k+2)\lambda and \mu_k = k\mu, with \rho = \lambda/\mu we get
S = 1 + 2\frac{\lambda}{\mu} + \cdots + (j+1)\left(\frac{\lambda}{\mu}\right)^j + \cdots = \sum_{k=1}^{\infty}k\rho^{k-1} = \frac{1}{(1-\rho)^2},
so
P_0 = 1/S = (1-\rho)^2
and
P_i = (i+1)\rho^i(1-\rho)^2.
(b)
E = \sum_{i=0}^{\infty}iP_i = 0\cdot P_0 + \sum_{k=1}^{\infty}k(k+1)\rho^k(1-\rho)^2 = \frac{2\rho}{1-\rho}.
Note:
\sum_{k=1}^{\infty}k(k+1)\rho^{k-1} = \frac{d}{d\rho}\sum_{k=1}^{\infty}(k+1)\rho^k = \frac{d}{d\rho}\frac{d}{d\rho}\left[\sum_{k=1}^{\infty}\rho^{k+1}\right] = \frac{2}{(1-\rho)^3}.
4 Introduction to Queueing Systems
• In general, the arrival pattern of the customers and the service time allocated
to each customer can only be specified probabilistically. Such service facilities
are difficult to schedule “optimally” because of the randomness
in the arrival and service patterns.
• A mathematical theory has thus evolved that provides means for analyzing such
situations. This is known as queueing theory (waiting line theory, congestion
theory, the theory of stochastic service systems), which analyzes the operating char-
acteristics of a queueing situation with the use of probability theory.
4.1 Basic Elements of Queueing Models
• In a machine shop with four machines (the machines are the cus-
tomers and the repairman is the server), the calling source before
any machine breaks down consists of four potential customers (i.e.
anyone of the four machines may break down and therefore calls
for the service of the repairman). Once a machine breaks down, it
becomes a customer receiving the service of the repairman (until
the time it is repaired), and only three other machines are capable
of generating new calls for service.
(ii) Service Process: The time allocated to serve a customer (ser-
vice time) in a system (e.g. the time that a patient is served by
a doctor in an outpatient clinic) varies and is assumed to follow
some probability distribution.
• Some facilities may include more than one server, thus allowing as
many customers as the number of servers to be served simulta-
neously (e.g. supermarket cashiers). In this case, all servers offer
the same type of service and the facility is said to have parallel
servers.
(iii) Queue Discipline: The manner in which a customer is chosen
from the waiting line to start service is called the queue disci-
pline.
Suppose that customers arrive only at the end of each hour and
the probability that a customer arrives is 0.5.
(ii) (Service Process) Suppose that there is a job to be processed
by a machine. The job requires one hour of machine time, so for a
reliable machine, it takes one hour to finish the job.
Let x be the number of hours needed to finish the job. Then the probability
that the job is finished at the end of the kth hour is given by
the geometric distribution
P(x = k) = p^{k-1}(1-p), \quad k = 1, 2, \ldots.
(iii) (Queueing Disciplines) Suppose there are three customers A, B and C
waiting at a counter for service, and their service times are, in order,
10 minutes, 20 minutes and 30 minutes.
Clearly it takes 10 + 20 + 30 = 60 minutes to finish all the services. However,
the average waiting time before service for the three customers can be quite
different under different service disciplines.
Case 1 (FCFS): The waiting time for the first customer is zero, the waiting
time for the second customer is 10 minutes and the waiting time for the third
customer is 10 + 20 = 30 minutes. Therefore the average waiting time before
service is
(0 + 10 + 30)/3 = 40/3 minutes.
Case 2 (LCFS): The waiting time for the first customer is zero, the waiting
time for the second customer is 30 minutes and the waiting time for the third
customer is 30 + 20 = 50 minutes. Therefore the average waiting time before
service is
(0 + 30 + 50)/3 = 80/3
minutes, which is twice that in Case 1!
4.1.2 Definitions in Queueing Theory
• Let us now define formally some entities that are frequently used to measure the
effectiveness of a queueing system (with s parallel servers).
(i) pj = the probability that there are j customers in the system (waiting or in
service) at an arbitrary epoch (given that the system is in statistical equi-
librium or steady-state). Equivalently pj is defined as the proportion of
time that there are j customers in the system (in steady state).
(ii) a = offered load = mean number of requests per service time. (In a system
where blocked customers are cleared, requests that are lost are also counted.)
(iii) ρ = traffic intensity = offered load per server = a/s (s < ∞).
(iv) a′ = carried load = mean number of busy servers.
(v) ρ′ = server occupancy (or utilization factor) = carried load per server =
a′/s.
(vi) W_s = mean waiting time in the system, i.e. the mean length of time
from the moment a customer arrives until the customer leaves the system (also
called the sojourn time).
(vii) W_q = mean waiting time in the queue, i.e. the mean length of time
from the moment a customer arrives until the customer's service starts.
Remark 10 If the mean arrival rate is λ and the mean service time is τ then the
offered load a = λτ .
Remark 11 For an s-server system, the carried load is
a' = \sum_{j=0}^{s-1}jp_j + s\sum_{j=s}^{\infty}p_j.
Hence
a' \le s\sum_{j=0}^{\infty}p_j = s \quad \text{and} \quad \rho' = \frac{a'}{s} \le 1.
4.1.3 Kendall’s Notation
4.2 Queueing Systems of One Server
In this section we will consider queueing systems having one server only.
• The inter-arrival time of customers follows the exponential distribution with pa-
rameter λ.
• The service time also follows the exponential distribution with parameter µ.
• There is no waiting space in the system. An arriving customer will leave the
system when he finds the server busy (an M/M/1/0 queue). This queueing system
resembles a one-line telephone system without call waiting.
4.2.2 Steady-state Probability Distribution
We are interested in the long-run behavior of the system, i.e., when t → ∞. Why?
Remark 14 Let P_0(t) and P_1(t) be the probabilities that there are 0 and 1 customers
in the system respectively. If at t = 0 there is a customer in the system, then
P_0(t) = \frac{\mu}{\lambda+\mu}\left(1 - e^{-(\lambda+\mu)t}\right)
and
P_1(t) = \frac{1}{\lambda+\mu}\left(\mu e^{-(\lambda+\mu)t} + \lambda\right).
Here P_0(t) and P_1(t) are called the transient probabilities. We have
p_0 = \lim_{t\to\infty}P_0(t) = \frac{\mu}{\lambda+\mu}
and
p_1 = \lim_{t\to\infty}P_1(t) = \frac{\lambda}{\lambda+\mu}.
Here p_0 and p_1 are called the steady-state probabilities.
• Moreover, we have
P_0(t) - \frac{\mu}{\lambda+\mu} = -\frac{\mu e^{-(\lambda+\mu)t}}{\lambda+\mu} \to 0 \quad \text{as } t \to \infty
and
P_1(t) - \frac{\lambda}{\lambda+\mu} = \frac{\mu e^{-(\lambda+\mu)t}}{\lambda+\mu} \to 0 \quad \text{as } t \to \infty
very fast.
• This means that the system will go into the steady state very fast.
4.2.3 The Meaning of the Steady-state Probability
• In the long run, the probability that there is no customer in the system is p0 and
there is one customer in the system is p1.
For the server: In other words, in the long run, the proportion of time that the
server is idle is given by p0 and the proportion of time that the server is busy is
given by p1.
For the customers: In the long run, the probability that an arriving customer
can start his/her service immediately is given by p_0, and the probability that an arriving customer
will be rejected by the system is given by p_1.
Remark 15 The system goes to its steady state very quickly. In general it is
much easier to obtain the steady-state probabilities of a queueing system than the
transient probabilities. Moreover we are interested in the long-run performance of
the system. Therefore we will focus on the steady-state probabilities of a queueing
system.
4.2.4 The Markov Chain and the Generator Matrix
• In State 0, change of state occurs when there is an arrival of customers and the
waiting time is exponentially distributed with parameter λ.
• In State 1, change of state occurs when the customer finishes his/her service and
the waiting time is exponentially distributed with parameter µ.
• Recall that from the no-memory (Markov) property, the waiting time
distribution for change of state is the same independent of the past history (e.g.
how long the customer has been in the system).
Figure 4.1. The Markov Chain of the Two-state System. [States 0 and 1, with transition rate \lambda from 0 to 1 and rate \mu from 1 to 0.]
• From the Markov chain, one can construct the generator matrix as follows:
A_1 = \begin{pmatrix} -\lambda & \mu \\ \lambda & -\mu \end{pmatrix}.
What is the meaning of the generator matrix? The steady-state probabilities will
be the solution of the following linear system:
A_1\mathbf{p} = \begin{pmatrix} -\lambda & \mu \\ \lambda & -\mu \end{pmatrix}\begin{pmatrix} p_0 \\ p_1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}   (4.1)
subject to p_0 + p_1 = 1.
Remark 16 Given a Markov chain one can construct the corresponding generator matrix, and vice versa. To interpret the system of linear equations,
we note that in steady state, the expected incoming rate and the expected outgoing rate at any state must be equal. Therefore, we have the following:
• At State 0: expected outgoing rate = \lambda p_0 = \mu p_1 = expected incoming rate;
• At State 1: expected outgoing rate = \mu p_1 = \lambda p_0 = expected incoming rate.
4.2.5 One-Server Queueing System with Waiting Space
[Markov chain of the one-server queue with waiting space: states 0, 1, 2, 3, 4, with transition rate \lambda from state i to i+1 and rate \mu from state i+1 to i.]
Let the steady-state probability distribution be
p = (p0, p1, p2, p3, p4)T .
In steady state we have A2p = 0.
At State 0: expected outgoing rate = \lambda p_0 = \mu p_1 = expected incoming rate;
At State 1: expected outgoing rate = (\lambda+\mu)p_1 = \lambda p_0 + \mu p_2 = expected incoming rate;
At State 2: expected outgoing rate = (\lambda+\mu)p_2 = \lambda p_1 + \mu p_3 = expected incoming rate;
At State 3: expected outgoing rate = (\lambda+\mu)p_3 = \lambda p_2 + \mu p_4 = expected incoming rate;
At State 4: expected outgoing rate = \mu p_4 = \lambda p_3 = expected incoming rate.
We are going to solve p1, p2, p3, p4 in terms of p0.
• To determine p_0 we make use of the fact that
p_0 + p_1 + p_2 + p_3 + p_4 = 1.
• Therefore
p_0 + \frac{\lambda}{\mu}p_0 + \frac{\lambda^2}{\mu^2}p_0 + \frac{\lambda^3}{\mu^3}p_0 + \frac{\lambda^4}{\mu^4}p_0 = 1.
• Letting \rho = \lambda/\mu, we have
p_0 = (1 + \rho + \rho^2 + \rho^3 + \rho^4)^{-1}
and
p_i = p_0\rho^i, \quad i = 1, 2, 3, 4.
• What is the solution of a general one-server queueing system (M/M/1/n)? We shall discuss it in the next section.
4.2.6 General One-server Queueing System
Consider a one-server queueing system with waiting space. The inter-arrival times of cus-
tomers and the service times follow the exponential distribution with parameters
\lambda and \mu respectively.
• There is a waiting space of size n-2 in the system. An arriving customer will leave
the system only when he finds no waiting space left. This is an M/M/1/(n-2) queue.
• We say that the system is in state i if there are i customers in the system. The
minimum number of customers in the system is 0 and the maximum number of
customers is n-1 (one at the server and n-2 waiting in the queue). Therefore
there are n possible states in the system. The Markov chain of the system is shown
in Figure 4.3.
[Figure 4.3. Markov chain of the general one-server queue: states 0, 1, \ldots, s, \ldots, n-1, with transition rate \lambda from state i to i+1 and rate \mu from state i+1 to i.]
• If we order the states from 0 up to n-1, then the generator matrix for the Markov
chain is given by the following n \times n tridiagonal matrix:
A_2 = \begin{pmatrix}
-\lambda & \mu & & & \\
\lambda & -\lambda-\mu & \mu & & \\
 & \lambda & -\lambda-\mu & \mu & \\
 & & \ddots & \ddots & \ddots \\
 & & & \lambda & -\lambda-\mu & \mu \\
 & & & & \lambda & -\mu
\end{pmatrix}.   (4.3)
• We are going to solve for the probability distribution \mathbf{p}. Let
\mathbf{p} = (p_0, p_1, \ldots, p_{n-2}, p_{n-1})^T
be the steady-state probability vector. Here p_i is the steady-state probability that
there are i customers in the system, and we also have
A_2\mathbf{p} = 0 \quad \text{and} \quad \sum_{i=0}^{n-1}p_i = 1.
• To solve for p_i we begin with the first equation:
-\lambda p_0 + \mu p_1 = 0 \;\Rightarrow\; p_1 = \frac{\lambda}{\mu}p_0.
We then proceed to the second equation:
\lambda p_0 - (\lambda+\mu)p_1 + \mu p_2 = 0 \;\Rightarrow\; p_2 = -\frac{\lambda}{\mu}p_0 + \left(\frac{\lambda}{\mu}+1\right)p_1 \;\Rightarrow\; p_2 = \frac{\lambda^2}{\mu^2}p_0.
Inductively, we get
p_3 = \frac{\lambda^3}{\mu^3}p_0, \quad p_4 = \frac{\lambda^4}{\mu^4}p_0, \quad \ldots, \quad p_{n-1} = \frac{\lambda^{n-1}}{\mu^{n-1}}p_0.
• Letting \rho = \lambda/\mu (the traffic intensity), we have
p_i = \rho^i p_0, \quad i = 0, 1, \ldots, n-1.
• To solve for p_0 we make use of the condition
\sum_{i=0}^{n-1}p_i = 1.
Therefore we get
\sum_{i=0}^{n-1}p_i = \sum_{i=0}^{n-1}\rho^i p_0 = 1.
One may obtain
p_0 = \frac{1-\rho}{1-\rho^n}.
• Hence the steady-state probability distribution vector \mathbf{p} is given by
\mathbf{p} = \frac{1-\rho}{1-\rho^n}\,(1, \rho, \rho^2, \ldots, \rho^{n-1})^T.
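The closed form p_i = ρ^i (1-ρ)/(1-ρ^n) can be sanity-checked numerically; the sketch below uses illustrative values ρ = 0.8 and n = 5 states, and also verifies the detailed balance relation λp_i = µp_{i+1} (i.e. ρ p_i = p_{i+1}):

```python
def mm1n_distribution(rho: float, n: int):
    """Steady-state vector of the one-server queue with n states 0..n-1:
    p_i = rho^i * (1 - rho) / (1 - rho^n)."""
    p0 = (1 - rho) / (1 - rho ** n)
    return [p0 * rho ** i for i in range(n)]

rho, n = 0.8, 5          # illustrative: lam/mu = 0.8, states 0..4
p = mm1n_distribution(rho, n)
total = sum(p)           # should be 1
balance = rho * p[2] - p[3]   # detailed balance at an interior state: ~0
```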
4.2.7 Performance of a Queueing System
(a) the probability that a customer finds no more waiting space left when he arrives:
p_{n-1} = \frac{1-\rho}{1-\rho^n}\,\rho^{n-1};
(b) the probability that a customer finds the server not busy (he can be served
immediately) when he arrives:
p_0 = \frac{1-\rho}{1-\rho^n}.
(d) the expected number of customers in the system is given by
L_s = \sum_{i=0}^{n-1}ip_i = \sum_{i=1}^{n-1}ip_0\rho^i = \frac{\rho - n\rho^n + (n-1)\rho^{n+1}}{(1-\rho)(1-\rho^n)};   (4.5)
(e) the expected number of customers in the queue
L_q = \sum_{i=1}^{n-1}(i-1)p_i = \sum_{i=1}^{n-1}(i-1)p_0\rho^i = \sum_{i=1}^{n-1}ip_0\rho^i - \sum_{i=1}^{n-1}p_0\rho^i = \frac{\rho^2 - (n-1)\rho^n + (n-2)\rho^{n+1}}{(1-\rho)(1-\rho^n)}.   (4.6)
Remark 18 To obtain the results in (d) and (e) we need the following:
\sum_{i=1}^{n-1}i\rho^i = \frac{1}{1-\rho}\sum_{i=1}^{n-1}(1-\rho)i\rho^i = \frac{1}{1-\rho}\left(\sum_{i=1}^{n-1}i\rho^i - \sum_{i=1}^{n-1}i\rho^{i+1}\right) = \frac{1}{1-\rho}\left(\rho + \rho^2 + \cdots + \rho^{n-1} - (n-1)\rho^n\right) = \frac{\rho + (n-1)\rho^{n+1} - n\rho^n}{(1-\rho)^2}.   (4.7)
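The closed forms (4.5) and (4.6) can be checked against direct summation of the steady-state probabilities; the parameters ρ = 0.9, n = 7 below are illustrative:

```python
def mm1n_Ls_Lq(rho: float, n: int):
    """Closed forms (4.5) and (4.6) for the n-state one-server queue."""
    denom = (1 - rho) * (1 - rho ** n)
    Ls = (rho - n * rho ** n + (n - 1) * rho ** (n + 1)) / denom
    Lq = (rho ** 2 - (n - 1) * rho ** n + (n - 2) * rho ** (n + 1)) / denom
    return Ls, Lq

rho, n = 0.9, 7                         # illustrative traffic intensity and size
p0 = (1 - rho) / (1 - rho ** n)
p = [p0 * rho ** i for i in range(n)]
Ls_direct = sum(i * pi for i, pi in enumerate(p))
Lq_direct = sum((i - 1) * pi for i, pi in enumerate(p) if i >= 1)
Ls, Lq = mm1n_Ls_Lq(rho, n)
```

Note also the identity L_s - L_q = 1 - p_0, the probability that the server is busy.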
4.3 Queueing Systems with Multiple Servers
Now let us consider a more general queueing system with s parallel and iden-
tical exponential servers. The customer arrival rate is \lambda and the service rate
of each server is \mu. There are n-s-1 waiting spaces in the system.
• The queueing discipline is again FCFS. When a customer arrives and finds all the
servers busy, the customer can still wait in the queue if there is waiting space avail-
able. Otherwise, the customer has to leave the system; this is an M/M/s/(n-s-1)
queue.
• Before we study the steady-state probabilities of this system, let us discuss the
following example (re-visited).
• Suppose there are k identical, independent busy exponential servers, and let t be the
waiting time for one of the servers to become free (change of state), i.e. for one of the
customers to finish his service.
• We let t_1, t_2, \ldots, t_k be the service times of the k customers in the system. Then
t_i follows the exponential distribution \lambda e^{-\lambda t} and
t = \min\{t_1, t_2, \ldots, t_k\}.
We will derive the probability density function of t.
• We note that
\mathrm{Prob}(t \ge x) = \mathrm{Prob}(t_1 \ge x)\cdot\mathrm{Prob}(t_2 \ge x)\cdots\mathrm{Prob}(t_k \ge x) = \left(\int_x^{\infty}\lambda e^{-\lambda t}\,dt\right)^k = (e^{-\lambda x})^k = e^{-k\lambda x}.   (4.8)
• Thus
\int_x^{\infty}f(t)\,dt = e^{-k\lambda x} \quad \text{and} \quad f(t) = k\lambda e^{-k\lambda t}.
Therefore the waiting time t is still exponentially distributed, with parameter
k\lambda.
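The conclusion that the minimum of k independent Exponential(λ) variables is Exponential(kλ) can be illustrated by simulation; k = 5, λ = 2 and the sample size are illustrative choices:

```python
import random

random.seed(0)
k, lam, trials = 5, 2.0, 200_000
# Simulate t = min(t_1, ..., t_k) with t_i ~ Exponential(lam);
# by (4.8), t should be Exponential(k*lam) with mean 1/(k*lam).
mins = [min(random.expovariate(lam) for _ in range(k)) for _ in range(trials)]
sample_mean = sum(mins) / trials
expected_mean = 1.0 / (k * lam)     # = 0.1 for these parameters
```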
[Diagram of the s-server system: customers being served at the s servers, each with service rate \mu; customers waiting in queue positions 1, 2, \ldots, n-s-1; arrivals at rate \lambda.]
The Markov chain for the queueing system is given in the following figure.
[Markov chain: states 0, 1, \ldots, s, \ldots, n-1; birth rate \lambda in every state; death rate j\mu in state j for j \le s and s\mu for j > s.]
4.3.1 A Two-server Queueing System
• From the first equation -\lambda p_0 + \mu p_1 = 0, we have
p_1 = \frac{\lambda}{\mu}p_0.
• From the second equation \lambda p_0 - (\lambda+\mu)p_1 + 2\mu p_2 = 0, we have
p_2 = \frac{\lambda^2}{2!\mu^2}p_0.
• From the third equation \lambda p_1 - (\lambda+2\mu)p_2 + 2\mu p_3 = 0, we have
p_3 = \frac{\lambda^3}{2\cdot 2!\mu^3}p_0.
• Finally from the fourth equation
\lambda p_2 - (\lambda+2\mu)p_3 + 2\mu p_4 = 0,
we have
p_4 = \frac{\lambda^4}{2^2\cdot 2!\mu^4}p_0.
To determine p_0 we make use of the fact that
p_0 + p_1 + p_2 + p_3 + p_4 = 1.
Therefore
p_0 + \frac{\lambda}{\mu}p_0 + \frac{\lambda^2}{2!\mu^2}p_0 + \frac{\lambda^3}{2\cdot 2!\mu^3}p_0 + \frac{\lambda^4}{2^2\cdot 2!\mu^4}p_0 = 1.
Letting
\tau = \lambda/(2\mu),
we have
p_0 = \left(1 + \frac{\lambda}{\mu} + \frac{\lambda^2}{2!\mu^2}\cdot\frac{1-\tau^3}{1-\tau}\right)^{-1}
and
p_1 = \frac{\lambda}{\mu}p_0
and
p_i = \frac{\lambda^2}{2!\mu^2}p_0\,\tau^{i-2}, \quad i = 2, 3, 4.
• The result above can be further extended to the M/M/2/k queue as follows:
p_0 = \left(1 + \frac{\lambda}{\mu} + \frac{\lambda^2}{2!\mu^2}\cdot\frac{1-\tau^{k+1}}{1-\tau}\right)^{-1}, \quad p_1 = \frac{\lambda}{\mu}p_0, \quad \text{and} \quad p_i = \frac{\lambda^2}{2!\mu^2}p_0\,\tau^{i-2}, \quad i = 2, \ldots, k+2.
This queueing system has a finite number of waiting spaces.
• The result above can also be further extended to the M/M/2/\infty queue, when \tau = \lambda/(2\mu) < 1, as follows:
p_0 = \left(1 + \frac{\lambda}{\mu} + \frac{\lambda^2}{2!\mu^2}\cdot\frac{1}{1-\tau}\right)^{-1}, \quad p_1 = \frac{\lambda}{\mu}p_0, \quad \text{and} \quad p_i = \frac{\lambda^2}{2!\mu^2}p_0\,\tau^{i-2}, \quad i = 2, 3, \ldots,
or equivalently
p_0 = \frac{1-\tau}{1+\tau} \quad \text{and} \quad p_i = 2p_0\tau^i, \quad i = 1, 2, \ldots.
This queueing system has an infinite number of waiting spaces.
63
4.3.2 Expected Number of Customers in the M/M/2/∞ Queue
Now we let
S = \sum_{k=1}^{\infty}k\tau^k = \tau + 2\tau^2 + \cdots
and we have
\tau S = \tau^2 + 2\tau^3 + \cdots.
Therefore by subtraction we get
(1-\tau)S = \tau + \tau^2 + \tau^3 + \cdots = \frac{\tau}{1-\tau}
and
S = \frac{\tau}{(1-\tau)^2}.
Since p_k = 2p_0\tau^k for k \ge 1, we have
L_s = \sum_{k=1}^{\infty}kp_k = 2p_0 S = 2\cdot\frac{1-\tau}{1+\tau}\cdot\frac{\tau}{(1-\tau)^2} = \frac{2\tau}{1-\tau^2}.   (4.11)
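Formula (4.11) can be checked by summing the M/M/2/∞ distribution directly; τ = 0.6 below is an illustrative value:

```python
def mm2_distribution(tau: float, imax: int = 2000):
    """M/M/2/infinity steady state with tau = lam/(2*mu) < 1:
    p_0 = (1-tau)/(1+tau), p_i = 2*p_0*tau^i for i >= 1 (truncated at imax)."""
    p0 = (1 - tau) / (1 + tau)
    return [p0] + [2 * p0 * tau ** i for i in range(1, imax)]

tau = 0.6                                   # illustrative traffic intensity per server
p = mm2_distribution(tau)
Ls_direct = sum(i * pi for i, pi in enumerate(p))
Ls_formula = 2 * tau / (1 - tau ** 2)       # equation (4.11)
```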
4.3.3 Multiple-Server Queues
In the following, we assume that the Poisson input has rate λ and the
exponential service times have mean µ−1 .
4.3.4 Blocked Customers Cleared (Erlang loss system)
Let p_i be the steady-state probability that there are i customers in
the queueing system. Then by solving
A_5\mathbf{p} = 0 \quad \text{with} \quad \sum_{i=0}^{s}p_i = 1,
one can get
p_j = \frac{(\lambda/\mu)^j/j!}{\sum_{k=0}^{s}(\lambda/\mu)^k/k!} = \frac{a^j/j!}{\sum_{k=0}^{s}a^k/k!} \quad (j = 0, 1, \ldots, s)   (4.12)
and p_j = 0 for j > s, where a = \lambda/\mu is the offered load.
Since service times are exponential, the service completion rates (the death rates)
are
\mu_j = j\mu \quad (j = 0, 1, 2, \ldots, s).
Remark 20 The mean number of busy servers, which is also equal to
the mean number of customers completing service per mean service
time, is given by the carried load
a' = \sum_{j=1}^{s}jp_j.
An interesting relation can be derived between the Erlang loss formula
and the carried load:
a' = \sum_{j=1}^{s}j(a^j/j!)\Big/\sum_{k=0}^{s}(a^k/k!) = a\sum_{j=0}^{s-1}(a^j/j!)\Big/\sum_{k=0}^{s}(a^k/k!) = a\left(1 - B(s,a)\right).
This shows that the carried load is the portion of the offered load that
is not lost from the system.
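The Erlang loss formula B(s,a) in (4.12) (the case j = s) is usually computed with a numerically stable recursion rather than with factorials; the sketch below cross-checks the two and evaluates the carried load of Remark 20 for illustrative values s = 4 servers and offered load a = 3 erlangs:

```python
import math

def erlang_b(s: int, a: float) -> float:
    """Erlang loss formula via the standard stable recursion
    B(0,a) = 1, B(k,a) = a*B(k-1,a) / (k + a*B(k-1,a))."""
    b = 1.0
    for k in range(1, s + 1):
        b = a * b / (k + a * b)
    return b

s, a = 4, 3.0                       # illustrative parameters
B = erlang_b(s, a)
# Direct evaluation of (4.12) with j = s, for comparison:
direct = (a ** s / math.factorial(s)) / sum(a ** k / math.factorial(k) for k in range(s + 1))
carried = a * (1 - B)               # carried load a' = a(1 - B(s,a)), Remark 20
```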
4.3.5 Blocked Customers Delayed (Erlang delay system)
The queueing system has s servers, there is no limit on the waiting space,
and we assume blocked customers are delayed.
• If a < s, the infinite geometric sum on the right converges, and
p_0 = \left(\sum_{k=0}^{s-1}\frac{a^k}{k!} + \frac{a^s}{(s-1)!(s-a)}\right)^{-1}.
If a \ge s, the infinite geometric sum diverges to infinity. Then p_0 = 0
and hence p_j = 0 for all finite j.
Remark 21 The probability that all servers are occupied (as observed
by an outside observer) is given by the Erlang delay formula
C(s,a) = \sum_{j=s}^{\infty}p_j = \frac{a^s}{(s-1)!(s-a)}\,p_0 = \frac{a^s/[(s-1)!(s-a)]}{\sum_{k=0}^{s-1}(a^k/k!) + a^s/[(s-1)!(s-a)]}.
Since the arriving customer's distribution is equal to the outside ob-
server's distribution, the probability that an arriving customer finds
all servers busy (equivalently, the probability that the waiting time in
the queue w > 0) is also given by C(s, a).
Remark 22 The carried load is equal to the offered load since no
request for service has been cleared from the system without being
served. In fact, this equality holds for BCD queues with arbitrary
arrival and service time distributions.
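The Erlang delay formula of Remark 21 can be evaluated directly; the sketch below uses illustrative values s = 3, a = 2 and cross-checks against the standard identity C(s,a) = B(s,a) / (1 - (a/s)(1 - B(s,a))), which relates it to the loss formula:

```python
import math

def erlang_c(s: int, a: float) -> float:
    """Erlang delay formula C(s,a) for a < s, straight from Remark 21."""
    tail = a ** s / (math.factorial(s - 1) * (s - a))
    return tail / (sum(a ** k / math.factorial(k) for k in range(s)) + tail)

s, a = 3, 2.0                      # illustrative parameters (a < s)
C = erlang_c(s, a)
# Cross-check via Erlang B computed with the stable recursion:
b = 1.0
for k in range(1, s + 1):
    b = a * b / (k + a * b)
cross = b / (1 - (a / s) * (1 - b))
```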
Remark 23 Suppose that an arriving customer finds that all the servers are busy.
What is the probability that he finds j customers waiting in the ‘queue’ ?
4.4 Little’s Queueing Formula
If λ is the mean arrival rate, W is the mean time spent in the system
(mean sojourn time) and L is the mean number of customers present,
J.D.C. Little proved in 1961 that
L = λW.
This result is one of the most general and useful results in queueing
theory for a blocked customer delay queue.
The formal proof of this theorem is too long for this course. Let us
just formally state the theorem and then give a heuristic proof.
Proposition 3 (Little’s Theorem) Let L(x) be the number of
customers present at time x, and define the mean number L of
customers present throughout the time interval [0, ∞) as
∫ t
1
L = lim L(x)dx;
t→∞ t 0
let N (t) be the number of customers who arrive in [0, t], and
define the arrival rate λ as
N (t)
λ = lim ;
t→∞ t
and let Wi be the sojourn time of the ith customer, and define
the mean sojourn time W as
1 ∑n
W = lim Wi.
n→∞ n
i=1
If λ and W exist and are finite, then so does L, and they are
related by λW = L.
75
Proof: Let us follow the heuristic argument suggested by P. J. Burke.
Assume that the mean values L and W exist, and consider a long time interval
(0, t) throughout which statistical equilibrium (steady state) prevails.
The mean number of customers who enter the system during this interval is λt.
Imagine that a sojourn time is associated with each arriving customer; i.e., each
arrival brings a sojourn time with him. Thus the average sojourn time brought into
the system during (0, t) is λtW .
On the other hand, each customer present in the system uses up his sojourn time
linearly with time. If L is the average number of customers present throughout
(0, t), then Lt is the average amount of time used up in (0, t).
Now as t \to \infty the accumulation of sojourn time must equal the amount of sojourn
time used up; that is,
\lim_{t\to\infty}\frac{\lambda tW}{Lt} = 1.
With the help of Little’s formula, we get the following useful results:
(a) λ, the average number of arrivals entering the system,
(b) Ls, the average number of customers in the queueing system,
(c) Lq , the average number of customers waiting in the queue,
(d) Lc, the average number of customers in the server,
(e) Ws, the average time a customer spends in the queueing system,
(f ) Wq , the average time a customer spends in waiting in the queue,
(g) Wc, the average time a customer spends in the server.
then the Little’s formula states that if the steady-state probability distribution
exists, we have
77
4.4.1 Little’s queueing Formula for the M/M/1/∞ Queue
In the following, we are going to prove Little’s queueing formula for the case of
M/M/1/∞ queue. We recall that
ρ ρ2 λ
Ls = , Lq = , Lc = ρ, Ls = Lq + Lc, ρ = .
1−ρ 1−ρ µ
We first note that the expected waiting time Wc at the server is 1/µ.
Therefore we have
1 λ Lc
Wc =
= = .
µ λµ λ
Secondly we note that when a customer arrived, there can be i customers already
in the system. The expected waiting time before joining the server when there are
already i customers in the system is of course i/µ. Because there is only server and
the mean service time of each customer in front of him is 1/µ.
Therefore the expected waiting time Wq before one joins the server will be
∞
∑ ∞
i 1∑ Ls ρ
pi ( ) = ipi = = .
i=1
µ µ i=1 µ (1 − ρ)µ
Since i can be 0, 1, 2, \ldots, we have
W_q = \frac{\rho}{(1-\rho)\mu} = \frac{\rho^2}{(1-\rho)\mu\rho} = \frac{L_q}{\lambda}.
The expected waiting time at the server W_c will of course be 1/\mu. Thus we have
W_s = W_q + W_c = \frac{L_q}{\lambda} + \frac{1}{\mu} = \frac{1}{\mu}\left(\frac{\rho}{1-\rho} + 1\right) = \frac{1}{\mu(1-\rho)} = \frac{\rho}{\lambda(1-\rho)} = \frac{L_s}{\lambda}.
Here
\rho = \lambda/\mu \quad \text{and} \quad L_s = \rho/(1-\rho).
4.4.2 Applications of the Little’s Queueing Formula
Arrival rate λ
Service rate µ
Traffic intensity ρ = λ/µ
Probability that no customer in the queue p0 = 1 − ρ
Probability that i customers in the queue pi = p0ρi
Probability that an arrival has to wait for service 1 − p0 = ρ
Expected number of customers in the system Ls = ρ/(1 − ρ)
Expected number of customers in the queue Lq = ρ2/(1 − ρ)
Expected number of customers in the server Lc = ρ
Expected waiting time in the system Ls/λ = 1/(1 − ρ)µ
Expected waiting time in the queue Lq /λ = ρ/(1 − ρ)µ
Expected waiting time in the server Lc/λ = 1/µ
Table 4.1. A summary of the M/M/1/∞ queue.
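The whole table can be reproduced from the steady-state distribution and Little's formula alone; the sketch below uses λ = 30 and µ = 60 per hour (the rates of the worked example that follows):

```python
lam, mu = 30.0, 60.0               # arrivals and services per hour
rho = lam / mu
p = [(1 - rho) * rho ** i for i in range(500)]   # p_i = (1-rho) rho^i, truncated
Ls = sum(i * pi for i, pi in enumerate(p))
Lq = sum((i - 1) * pi for i, pi in enumerate(p) if i >= 1)
Lc = 1 - p[0]                      # probability the server is busy
Ws, Wq, Wc = Ls / lam, Lq / lam, Lc / lam        # Little's formula
```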
Example 3 Consider the M/M/2/\infty queue with arrival rate \lambda and
service rate \mu. What is the expected waiting time for a customer in
the system?
We recall that the expected number of customers L_s in the system is
given by
L_s = \frac{2\rho}{1-\rho^2}.
Here \rho = \lambda/(2\mu). By applying Little's queueing formula we have
W_s = \frac{L_s}{\lambda} = \frac{1}{\mu(1-\rho^2)}.
Arrival rate: \lambda = 30 (per hour)
Service rate: \mu = 60 (per hour)
Traffic intensity: \rho = 0.5
Probability that no customer is in the system: p_0 = 0.5
Probability that i customers are in the system: p_i = 0.5^{i+1}
Probability that an arrival has to wait for service: 0.5
Expected number of customers in the system: L_s = 1
Expected number of customers in the queue: L_q = 0.5
Expected number of customers in the server: L_c = 0.5
Expected waiting time in the system: W_s = L_s/\lambda = 1/30 hour
Expected waiting time in the queue: W_q = L_q/\lambda = 1/60 hour
Expected waiting time in the server: W_c = L_c/\lambda = 1/60 hour
Table 4.2. A summary of the system performance.
4.5 Applications of Queues
4.5.1 Allocation of the Arrivals in a System of M/M/1 Queues
[Diagram: arrivals at total rate M are allocated among n M/M/1 queues; queue i receives arrival rate \lambda_i and has service rate \mu_i.]
• An allocation process is implemented such that it diverts an arrived
customers to queue i with probability
λi λi
= .
λ1 + . . . + λn M
• Then the input process of queue i is a Poisson process with rate λi.
• The objective here is to find the parameters λi such that some sys-
tem performance is optimized.
4.5.2 Minimizing Number of Customers in the System
and solving

    ∂L/∂λi = 0  and  ∂L/∂m = 0,

we have the optimal solution

    λi = µi (1 − 1/√(mµi)) < µi,

where

    m = (Σ_{i=1}^n √µi)^2 / (Σ_{i=1}^n µi − M)^2.
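The closed-form allocation above is easy to evaluate numerically; the following sketch (variable and function names are ours, not from the notes) computes m and the λi and lets one confirm that the rates sum back to M:

```python
import math

# Sketch of the Section 4.5.2 allocation: lambda_i = mu_i * (1 - 1/sqrt(m*mu_i))
# with the Lagrange multiplier m given in closed form.

def allocate_system(mus, M):
    assert M < sum(mus), "need total capacity to exceed total demand"
    m = (sum(math.sqrt(mu) for mu in mus) / (sum(mus) - M)) ** 2
    return [mu * (1 - 1 / math.sqrt(m * mu)) for mu in mus]

lams = allocate_system([4.0, 9.0], 10.0)
print(lams, sum(lams))   # the rates sum back to M = 10
```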
4.5.3 Minimizing Number of Customers Waiting in the System
subject to

    Σ_{i=1}^n λi = M

and

    0 ≤ λi < µi for i = 1, 2, . . . , n.
By considering the Lagrangian function

    L(λ1, . . . , λn, m) = Σ_{i=1}^n (λi/µi)^2/(1 − λi/µi) − m (Σ_{i=1}^n λi − M)

and solving

    ∂L/∂λi = 0  and  ∂L/∂m = 0,

we have the optimal solution

    λi = µi (1 − 1/√(1 + mµi)) < µi,

where m is the solution of

    Σ_{i=1}^n µi (1 − 1/√(1 + mµi)) = M.
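Here m has no closed form, but the left-hand side is increasing in m, so a simple bisection suffices. A hedged sketch (helper names are ours):

```python
import math

# Solve sum_i mu_i * (1 - 1/sqrt(1 + m*mu_i)) = M for m > 0 by bisection,
# then recover the optimal arrival rates lambda_i.

def total_rate(m, mus):
    return sum(mu * (1 - 1 / math.sqrt(1 + m * mu)) for mu in mus)

def solve_allocation(mus, M):
    assert 0 < M < sum(mus)
    lo, hi = 0.0, 1.0
    while total_rate(hi, mus) < M:   # grow the bracket until it covers M
        hi *= 2
    for _ in range(200):             # bisection on a monotone function
        mid = (lo + hi) / 2
        if total_rate(mid, mus) < M:
            lo = mid
        else:
            hi = mid
    m = (lo + hi) / 2
    return [mu * (1 - 1 / math.sqrt(1 + m * mu)) for mu in mus]

lams = solve_allocation([4.0, 9.0], 10.0)
print(lams, sum(lams))   # sums to M; each lambda_i < mu_i
```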
4.5.4 Which Operator to Employ?
• Suppose the mean number of workers seeking tools per hour is
5 and each worker is paid 8 dollars per hour.
• For Operator A, the service rate is µ = 60/10 = 6 per hour.
Thus we have
ρ = λ/µ = 5/6.
The expected number of workers waiting for tools at the tool centre
will be

    ρ/(1 − ρ) = (5/6)/(1 − 5/6) = 5.
The expected delay cost of the workers is
5 × 8 = 40
dollars per hour and the operator cost is 5 dollars per hour. Therefore
the total expected cost is
40 + 5 = 45.
• For Operator B, the service rate is µ = 60/11 per hour. Thus
we have
ρ = λ/µ = 11/12.
The expected number of workers waiting for tools at the tool centre
will be

    ρ/(1 − ρ) = (11/12)/(1 − 11/12) = 11.
The expected delay cost of the workers is
11 × 8 = 88
dollars per hour and the operator cost is 3 dollars per hour. Therefore
the total expected cost is
88 + 3 = 91.
Conclusion: Operator A should be employed.
4.5.5 Two M/M/1 Queues or One M/M/2 Queue?
• To determine which case is better, we calculate the expected number of customers
(workers) in both cases. Clearly in our consideration, the smaller the better (Why?).
In case (i), the expected number of customers in any one of the queues will be given
by

    (λ/(2µ)) / (1 − λ/(2µ)).
Conclusion: Case (ii) is better. We should put all the servers (operators) together.
4.5.6 One More Operator?
• Operator A later complains that he is overloaded and the workers have wasted
their time in waiting for a tool. To improve this situation, the senior manager
wonders if it is cost effective to employ one more identical operator at the tool
centre. Assume that the inter-arrival time of workers and the processing time of
the operators are exponentially distributed.
• For the present situation, one may regard the request for tools as a queueing
process (An M/M/1/∞) where the arrival rate λ = 5 per hour and the service rate
µ = 60/10 = 6 per hour. Thus we have ρ = λ/µ = 5/6.
• The expected number of workers waiting for tools at the tool centre will be

    ρ/(1 − ρ) = (5/6)/(1 − 5/6) = 5.
The expected delay cost of the workers is 5×8 = 40 dollars per hour and the opera-
tor cost is 5 dollars per hour. Therefore the total expected cost is 40+5 = 45 dollars.
• When one extra operator is added then there are 2 identical operators at the tool
center and this will be an M/M/2/∞ queue.
• The expected number of workers in the system is given by (c.f. (4.11))

    ((1 − ρ)/(1 + ρ)) Σ_{i=1}^∞ 2iρ^i = 2ρ/(1 − ρ^2)

where

    ρ = λ/(2µ) = 5/12.
• In this case the expected delay cost and the operator cost will be given respectively
by

    8 × 2ρ/(1 − ρ^2) = 8 × 120/119 ≈ 8.07  and  2 × 5 = 10 dollars.
• Thus the expected cost when there are 2 operators is given by 18.07 dollars.
• Conclusion: Hence the senior manager should employ one more operator.
• How about employing three operators? (You may consider M/M/3/∞ queue).
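The comparison above can be reproduced in a few lines (a sketch using the M/M/1 and M/M/2 formulas already derived; the cost figures λ = 5, µ = 6, wage 8 $/h and 5 $/h per operator are those of the example):

```python
# One operator (M/M/1) versus two operators (M/M/2): hourly delay cost
# (wage times expected number in the system) plus operator cost.

lam, mu, wage, op_cost = 5.0, 6.0, 8.0, 5.0

rho1 = lam / mu                                    # M/M/1 utilisation
cost1 = wage * rho1 / (1 - rho1) + 1 * op_cost     # 40 + 5 = 45 dollars/hour

rho2 = lam / (2 * mu)                              # M/M/2 utilisation per server
cost2 = wage * 2 * rho2 / (1 - rho2**2) + 2 * op_cost   # about 8.07 + 10

print(round(cost1, 2), round(cost2, 2))
```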
4.6 An Unreliable Machine System
Figure 4.10. The Markov Chain for the Unreliable Machine System.
• Let the steady-state probability vector

    p = (p0, p1, . . . , pn)

satisfy A6 p = 0, where

         ( −λ                       µn  )
         (  λ   −µ1                     )
    A6 = (       µ1   −µ2               ) .
         (             ...   ...        )
         (             µn−1  −µn        )
• From the first equation, −λp0 + µnpn = 0, we have

    pn = (λ/µn) p0.

From the second equation, λp0 − µ1p1 = 0, we have

    p1 = (λ/µ1) p0.

From the third equation, µ1p1 − µ2p2 = 0, we have

    p2 = (λ/µ2) p0.

We continue this process and therefore

    pi = (λ/µi) p0.

Since p0 + p1 + p2 + . . . + pn = 1, we have

    p0 (1 + Σ_{i=1}^n λ/µi) = 1.

Therefore

    p0 = (1 + Σ_{i=1}^n λ/µi)^{−1}.
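The recursion above translates directly into code; a hedged sketch (function name and example rates are ours):

```python
# Steady-state distribution of the unreliable machine system:
# p_i = (lambda/mu_i) p_0 for i = 1, ..., n, normalised to sum to 1.

def machine_steady_state(lam, mus):
    p0 = 1 / (1 + sum(lam / mu for mu in mus))
    return [p0] + [(lam / mu) * p0 for mu in mus]

p = machine_steady_state(2.0, [1.0, 2.0, 4.0])
print(p, sum(p))   # a probability distribution: the entries sum to 1
```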
4.7 A Reliable One-machine Manufacturing System
• The demand is served in a first come first serve manner. In order to retain the
customers, there is no backlog limit in the system. However, there is an upper limit
n(n ≥ 0) for the inventory level.
• The machine keeps on producing until this inventory level is reached and the
production is stopped once this level is attained. We seek the optimal value of
n (the hedging point or the safety stock) which minimizes the expected running cost.
• The running cost consists of a deterministic inventory cost and a backlog cost.
In fact, the optimal value of n is the best amount of inventory to be kept in the
system so as to hedge against the fluctuation of the demand.
• Let us summarize the notation as follows.
• Here we assume that µ > λ, so that the steady-state probability
distribution of the above M/M/1 queue exists and has the analytic
solution

    q(i) = (1 − p)p^{n−i},  i = n, n − 1, n − 2, · · ·

where p = λ/µ and q(i) is the steady-state probability that the inventory level is i.
• Hence the expected running cost of the system (sum of the inventory
cost and the backlog cost) can be written down as follows:

    E(n) = I Σ_{i=0}^n (n − i)(1 − p)p^i + B Σ_{i=n+1}^∞ (i − n)(1 − p)p^i,   (4.13)

where the first sum is the inventory cost and the second sum is the backlog cost.
Proposition 4 The expected running cost E(n) is minimized if
the hedging point n is chosen such that

    p^{n+1} ≤ I/(I + B) ≤ p^n.

Proof: We note that

    E(n − 1) − E(n) = B − (I + B)(1 − p) Σ_{i=0}^{n−1} p^i = −I + (I + B)p^n

and

    E(n + 1) − E(n) = −B + (I + B)(1 − p) Σ_{i=0}^n p^i = I − (I + B)p^{n+1}.

Therefore we have

    E(n − 1) ≥ E(n) ⇔ p^n ≥ I/(I + B)  and  E(n + 1) ≥ E(n) ⇔ p^{n+1} ≤ I/(I + B).

Thus the optimal value of n is the one such that p^{n+1} ≤ I/(I + B) ≤ p^n.
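Proposition 4 can be checked numerically: pick n from the inequality and compare with a direct search over E(n). A sketch (the truncation of the infinite backlog sum is ours):

```python
import math

# Compare the hedging point from Proposition 4 with a brute-force
# minimisation of E(n) from (4.13), truncating the backlog sum.

def E(n, I, B, p, tail=2000):
    inv = I * sum((n - i) * (1 - p) * p**i for i in range(n + 1))
    back = B * sum((i - n) * (1 - p) * p**i for i in range(n + 1, n + tail))
    return inv + back

I, B, p = 1.0, 9.0, 0.5
# largest n with p^n >= I/(I+B), i.e. the n of Proposition 4
n_star = math.floor(math.log(I / (I + B)) / math.log(p))
best = min(range(50), key=lambda n: E(n, I, B, p))
print(n_star, best)   # the two agree
```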
4.8 An Inventory Model with Returns and Lateral Transshipments
(i) λ−1, the mean inter-arrival time of demands,
(ii) µ−1, the mean inter-arrival time of returns,
(iii) a, the probability that a returned product is repairable,
(iv) Q, maximum inventory capacity,
(v) I, unit inventory cost,
(vi) R, cost per replenishment order.
Figure 4.16. The Single-item Inventory Model: returns arriving at rate µ are checked and repaired into stock with probability a (disposed of with probability 1 − a), demands arrive at rate λ, and the inventory level ranges over −1, 0, 1, . . . , Q with replenishment.
• W. Ching, W. Yuen and A. Loh, An Inventory Model with Returns and Lateral
Transshipments, J. Operat. Res. Soc., 54 (2003) 636-641.
Figure 4.17. The Markov Chain on the inventory levels 0, 1, . . . , Q, with transition rates λ and aµ.
• The (Q + 1) × (Q + 1) system generator matrix is given as follows:

        ( λ + aµ   −λ                       )
        (  −aµ   λ + aµ   −λ                )
    A = (         −aµ     ...   ...         )   (4.14)
        (                 ...  λ + aµ   −λ  )
        (  −λ                   −aµ      λ  )

• The steady-state probability distribution p is given by

    pi = K(1 − ρ^{i+1}),  i = 0, 1, . . . , Q   (4.15)

where

    ρ = aµ/λ  and  K = (1 − ρ) / ((1 + Q)(1 − ρ) − ρ(1 − ρ^{Q+1})).
Proposition 5 The expected inventory level is

    Σ_{i=1}^Q i pi = K Σ_{i=1}^Q (i − iρ^{i+1})
                   = K ( Q(Q + 1)/2 + Qρ^{Q+2}/(1 − ρ) − ρ^2(1 − ρ^Q)/(1 − ρ)^2 ),

the average rejection rate of returns is

    µ pQ = µK(1 − ρ^{Q+1})

and the mean replenishment rate is

    λ × p0 × λ^{−1}/(λ^{−1} + (aµ)^{−1}) = λK(1 − ρ)ρ/(1 + ρ).
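The closed forms in (4.15) and Proposition 5 are easy to verify numerically; a sketch with illustrative parameter values (names are ours):

```python
# Build p_i = K(1 - rho^(i+1)) from (4.15) and compare the closed-form
# expected inventory level of Proposition 5 with the direct sum.

def inventory_dist(lam, mu, a, Q):
    rho = a * mu / lam
    K = (1 - rho) / ((1 + Q) * (1 - rho) - rho * (1 - rho**(Q + 1)))
    return rho, K, [K * (1 - rho**(i + 1)) for i in range(Q + 1)]

lam, mu, a, Q = 2.0, 1.0, 0.8, 10          # illustrative values, rho = 0.4
rho, K, p = inventory_dist(lam, mu, a, Q)
direct = sum(i * p[i] for i in range(Q + 1))
closed = K * (Q * (Q + 1) / 2 + Q * rho**(Q + 2) / (1 - rho)
              - rho**2 * (1 - rho**Q) / (1 - rho) ** 2)
print(sum(p), direct, closed)   # probabilities sum to 1; the expectations agree
```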
Proposition 6 If ρ < 1 and Q is large then K ≈ (1 + Q)^{−1} and the
approximated average running cost (inventory and replenishment
cost) is

    C(Q) ≈ QI/2 + λ(1 − ρ)ρR/((1 + ρ)(1 + Q)).

The optimal replenishment size is

    Q* + 1 ≈ √( 2λ(1 − ρ)ρR/((1 + ρ)I) ) = √( (2aµR/I)(2λ/(λ + aµ) − 1) ).   (4.16)
4.9 Queueing Systems with Two Types of Customers
In this section, we discuss queueing systems with two types of customers. The
queueing system has no waiting space. There are two possible cases: infinite-server
case and finite-server case.
Consider the infinite-server queue with two types of customers. The arrival process
of customers of type i (i = 1, 2) is Poisson with rate λi and their service times are
independent, identically distributed, exponential random variables with mean µi^{−1}
(i = 1, 2).
• Here p(j1, j2) is the steady-state probability that there are j1 type 1 customers
and j2 type 2 customers in the system.
• By equating expected rate out to expected rate in for each state, the equilib-
rium state equations are

    (λ1 + λ2 + j1µ1 + j2µ2)p(j1, j2) = λ1p(j1 − 1, j2) + (j1 + 1)µ1p(j1 + 1, j2)
                                       + λ2p(j1, j2 − 1) + (j2 + 1)µ2p(j1, j2 + 1),

where p(−1, j2) = p(j1, −1) = 0 (j1 = 0, 1, . . . ; j2 = 0, 1, . . .).

Figure 4.12. The Markov Chain of the System at State Ej1,j2 (transitions to the four neighbouring states with rates λ1, λ2, (j1 + 1)µ1 and (j2 + 1)µ2).
• In addition, the probabilities must satisfy the normalization equation

    Σ_{j1=0}^∞ Σ_{j2=0}^∞ p(j1, j2) = 1.
• In this case, we already know the answer. Since the number of servers is infinite,
the two types of customers do not affect one another.
• Thus the marginal distribution of the number of customers of each type is that
which would be obtained by solving the corresponding one-dimensional problem,
namely the Poisson distribution:
    p1(j) = Σ_{k=0}^∞ p(j, k) = (λ1/µ1)^j e^{−(λ1/µ1)} / j!,
                                                                 (4.17)
    p2(j) = Σ_{k=0}^∞ p(k, j) = (λ2/µ2)^j e^{−(λ2/µ2)} / j!.
• Since the number of customers of each type present is independent of the number
of the other type present, we have

    p(j1, j2) = p1(j1)p2(j2) = ((λ1/µ1)^{j1}/j1!) ((λ2/µ2)^{j2}/j2!) e^{−[(λ1/µ1)+(λ2/µ2)]}.   (4.18)
• Solution of Product Form:
The fact that the solution p(j1, j2) can be decomposed into a product of two
factors has enabled us to solve the problem with ease.
• In fact one may try

    p(j1, j2) = C (λ1/µ1)^{j1}(λ2/µ2)^{j2} / (j1! j2!)   (4.19)

where the constant C is determined from the normalization equation. In this case

    Σ_{j1=0}^∞ Σ_{j2=0}^∞ p(j1, j2) = C Σ_{j1=0}^∞ Σ_{j2=0}^∞ (λ1/µ1)^{j1}(λ2/µ2)^{j2} / (j1! j2!)
                                    = C e^{λ2/µ2} Σ_{j1=0}^∞ (λ1/µ1)^{j1} / j1!
                                    = C e^{(λ1/µ1)+(λ2/µ2)}.

Hence

    C = e^{−[(λ1/µ1)+(λ2/µ2)]}.
• In practice, a good strategy for finding solutions to equations of the form (4.17) is
to assume a product solution of the form (4.19) and see if such a solution satisfies
(4.17).
• If it does, then the solution has been obtained. If it does not, then try a different
approach. In this case it works!
4.9.2 Multiple-server Queue with Blocked Customers Cleared
• Since the product-form solution (4.19) satisfies the equation (4.17) and also the
equation with only the deleted terms
(λ1 + λ2)p(j1, j2) = (j1 + 1)µ1p(j1 + 1, j2) + (j2 + 1)µ2p(j1, j2 + 1),
therefore it also satisfies (4.20). Thus the product solution (4.19) is a solution of
this problem.
• In particular, if we don’t distinguish the two types of customers, then the prob-
ability p(j) that there are j customers (including type 1 and type 2) in service is
given by
    p(j) = Σ_{j1+j2=j} p(j1, j2) = Σ_{j1=0}^j p(j1, j − j1).

• With the help of the binomial theorem, we have

    p(j) = C Σ_{j1=0}^j (λ1/µ1)^{j1}(λ2/µ2)^{j−j1} / (j1! (j − j1)!)
         = (C/j!) Σ_{j1=0}^j (j!/(j1!(j − j1)!)) (λ1/µ1)^{j1}(λ2/µ2)^{j−j1}
         = (C/j!) (λ1/µ1 + λ2/µ2)^j.
• The normalization equation

    Σ_{k=0}^s p(k) = 1

implies that

    C = { Σ_{k=0}^s (1/k!)(λ1/µ1 + λ2/µ2)^k }^{−1}.

We conclude that

    p(j) = (a^j/j!) / Σ_{k=0}^s (a^k/k!)   (j = 0, 1, . . . , s)

where

    a = (λ1/µ1) + (λ2/µ2).

• We note that for the case of the infinite-server queue (we let s → ∞) we have

    p(j) = a^j e^{−a} / j!   (j = 0, 1, . . .).
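The total number in service thus follows a truncated Poisson distribution with parameter a; a minimal sketch (function name is ours):

```python
import math

# Truncated Poisson distribution on {0, ..., s} with parameter
# a = lambda1/mu1 + lambda2/mu2, as derived above.

def truncated_poisson(a, s):
    weights = [a**k / math.factorial(k) for k in range(s + 1)]
    norm = sum(weights)
    return [w / norm for w in weights]

p = truncated_poisson(a=2.0, s=3)
print(p, sum(p))   # a valid distribution on {0, 1, 2, 3}
```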
4.10 Queues in Tandem
• Consider two sets of servers arranged in tandem, so that the output from the first
set of servers is the input of the second set of servers.
• Assume that the arrival process at the first stage of this tandem queueing system
is Poisson with rate λ, the service times in the first stage are exponentially dis-
tributed with mean µ1^{−1}, and the queueing discipline is blocked customers delayed.
• The customers completing service in the first stage will enter the second stage
(and wait if all servers in the second stage are busy), where the service times are
assumed to be exponentially distributed with mean µ2^{−1}. The number of servers in
stage i is si (i = 1, 2).
116
• Since the first stage of this system is precisely an Erlang delay system, the
marginal distribution of the number of customers in stage one is given by the
Erlang delay probabilities.
• We shall substitute the assumed product solution (4.22) and (4.23) into the
equilibrium state equations (4.21), with the hope that the system of equations
will be reduced to a one-dimensional set of equations that can be solved for the
remaining factor p2(j2). Indeed, (4.21) is reduced to
• Using

    Σ_{j=0}^∞ pi(j) = 1   (i = 1, 2)

we get

    Ci = { Σ_{k=0}^{si−1} ρi^k/k! + ρi^{si}/(si!(1 − ρi/si)) }^{−1}  where  ρi = λ/µi   (i = 1, 2).

This equation implies that a proper joint distribution exists only when

    λ/µ1 < s1 and λ/µ2 < s2.
• It also suggests that the output from the first stage is a Poisson
process. This is in fact true in general and we state Burke's theorem
without proof as follows.
4.11 Queues in Parallel
Figure 4.15. The Markov Chain of the Two Queue Overflow System: states Ei,j (i, j = 0, 1, 2), with horizontal transitions at rates λ1 and µ1, vertical transitions at rates λ2 and µ2, and combined rate λ̃ = λ1 + λ2 at the boundary states.
• The generator matrix A8 for this queueing problem is given by (rows and
columns ordered by the states listed):

           (0,0) (1,0) (2,0) (0,1) (1,1) (2,1) (0,2) (1,2) (2,2)
    (0,0)    *    µ1    0     µ2    0     0     0     0     0
    (1,0)   λ1    *     µ1    0     µ2    0     0     0     0
    (2,0)    0    λ1    *     0     0     µ2    0     0     0
    (0,1)   λ2    0     0     *     µ1    0     µ2    0     0
    (1,1)    0    λ2    0     λ1    *     µ1    0     µ2    0     (4.26)
    (2,1)    0    0     λ̃     0     λ1    *     0     0     µ2
    (0,2)    0    0     0     λ2    0     0     *     µ1    0
    (1,2)    0    0     0     0     λ2    0     λ̃     *     µ1
    (2,2)    0    0     0     0     0     λ̃     0     λ̃     *

• Here “*” is such that the column sum is zero.
4.12 A Summary of Learning Outcomes
• Able to state and prove Little's queueing formula and Burke's theorem.
• Able to apply queues in tandem and in parallel to real problems such as:
- Employment of operators.
- Unreliable machine system.
- Manufacturing system and inventory system.
4.13 Exercises
3. A company offers services that can be modeled as an s-server Erlang loss sys-
tem (M/M/s/0 queue).
Suppose the arrival rate is 2 customers per hour and the average service time
is 1 hour. The entrepreneur earns $2.50 for each customer served and the com-
pany’s operating cost is $1.00 per server per hour (whether the server is busy
or idle).
(a) Write down the expected hourly net profit C(s) in terms of s.
(b) Show that

    lim_{s→∞} C(s)/s = −1

and interpret this result.
(c) If the maximum number of servers available is 5, what is the optimal number
of servers which maximizes the expected hourly net profit? What is the ex-
pected hourly net profit earned when the optimal number of servers is provided?
4. Consider an Erlang loss system with two servers. The arrival rate is 1 and the
mean service time is µi^{−1} for server i (i = 1, 2). When the system is idle, an
arriving customer will visit the first server. Denote the states of the system as
(0, 0), (1, 0), (0, 1), (1, 1)
where (0, 0) denotes the state that both servers are idle, (1, 0) denotes the state
that the first server is busy and the second server is idle, (0, 1) denotes the state
that the first server is idle and the second server is busy and (1, 1) denotes the
state that both servers are busy.
(a) We adopt the order of states: (0, 0), (1, 0), (0, 1), (1, 1), write down the
generator matrix for this Markov process.
(b) Find the proportion of customers lost, B(µ1, µ2), in terms of µ1 and µ2.
(c) Show that it is better to put the faster server as the first server, in order to
achieve a lower proportion of customers lost.
5. We consider a closed queueing system of two one-server queues in tandem. There
are n + 1 customers in this system. When a customer finishes his/her service at
Server 1 he/she will go for the service at server 2 (join Queue 2). After he/she
finishes the service at Server 2, he/she will go for the service at server 1 (join
Queue 1) again. The service time at server i is exponentially distributed with
mean µi^{−1}.
Figure: the closed tandem loop, with Queue 1 feeding Server 1 (rate µ1) and Queue 2 feeding Server 2 (rate µ2).
(a) Let Ei,j be the state that there are i customers in the Queue 1 (including
the one at the server) and j customers in Queue 2 (including the one at the
server). Write down the Markov chain for this queueing system.
(b) Compute the steady-state probability pi,j (i customers in Queue 1 and j
customers in Queue 2).
4.14 Suggested Solutions
1. The mean number of customers joining the system per mean service time is
ρ(1 − pn+1) and the mean number of customers completing service per mean
service time is 1 − p0. Hence

    ρ(1 − pn+1) = 1 − p0  and  ρ = (1 − p0)/(1 − pn+1).
2. (a) The Markov chain is the birth-and-death chain on states 0, 1, 2, 3 with
birth rates 1/3 (from states 0, 1, 2) and death rates 1/4, 2/4, 3/4 (from states
1, 2, 3 respectively).
Let the steady-state probability distribution be

    p = (p0, p1, p2, p3)^t.

We have

    p1 = (4/3)p0,  p2 = 2((7/12)(4/3)p0 − (1/3)p0) = (8/9)p0,
    p3 = (4/3)(1/3)(8/9)p0 = (32/81)p0

and

    p0 = (1 + 4/3 + 8/9 + 32/81)^{−1} = 81/293.

Hence

    p = (81/293)(1, 4/3, 8/9, 32/81)^t.

(b) Proportion of rejected calls = p3 = (32/81) × (81/293) = 32/293.
3. (a) λ = 2 and µ = 1 and we let a = λ/µ = 2; then

    p0 = (Σ_{k=0}^s a^k/k!)^{−1} = (Σ_{k=0}^s 2^k/k!)^{−1}.
(b) We note that

    lim_{s→∞} (Σ_{k=0}^s 2^k/k!)^{−1} = e^{−2}  and  lim_{s→∞} 2^s/s! = 0.

Hence

    lim_{s→∞} C(s)/s = lim_{s→∞} (−s)/s = −1.

This means that when the number of servers s is large, the profit will be domi-
nated by the operating cost −s.
(c) We note that
C(1) = 0.667, C(2) = 1, C(3) = 0.947, C(4) = 0.524, C(5) = −0.183.
The optimal number of servers is 2 and the optimal profit is $1 per hour.
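The listed values of C(s) can be verified with a short script (a sketch: C(s) is taken to be revenue 2.5 per served customer times the carried rate λ(1 − B(s)), minus the operating cost s, with B(s) the Erlang loss probability):

```python
import math

# Erlang loss probability B(s) for offered load a, then the hourly
# profit C(s) = 2.5 * lambda * (1 - B(s)) - s with lambda = a = 2.

def erlang_loss(a, s):
    weights = [a**k / math.factorial(k) for k in range(s + 1)]
    return weights[-1] / sum(weights)

lam, a = 2.0, 2.0
for s in range(1, 6):
    C = 2.5 * lam * (1 - erlang_loss(a, s)) - s
    print(s, round(C, 3))   # matches the values C(1), ..., C(5) listed above
```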
4. (a) The generator matrix is the 4 × 4 matrix

        ( −1      µ1        µ2        0       )
    A = (  1    −1 − µ1     0         µ2      )   (4.28)
        (  0      0       −1 − µ2     µ1      )
        (  0      1         1      −µ1 − µ2   )

(b) From the third equation, we have

    p(0,1) = (µ1/(1 + µ2)) p(1,1).

From the fourth equation

    p(1,0) = (µ1 + µ2 − µ1/(1 + µ2)) p(1,1).

From the first equation

    p(0,0) = (µ1µ2(µ1 + µ2 + 2)/(1 + µ2)) p(1,1).

Finally we have

    B(µ1, µ2) = p(1,1) = ( 1 + µ1/(1 + µ2) + (µ1 + µ2 − µ1/(1 + µ2)) + µ1µ2(µ1 + µ2 + 2)/(1 + µ2) )^{−1}.
(c) It is clear that if µ1 > µ2 then B(µ1, µ2) < B(µ2, µ1).
5. (a) The Markov chain is the birth-and-death chain

    E0,n+1 ⇄ E1,n ⇄ · · · ⇄ En,1 ⇄ En+1,0

with rightward rates µ1 and leftward rates µ2.
(b) Hence we have

    pi,n+1−i = (1 − ρ)ρ^i / (1 − ρ^{n+2}),  i = 0, 1, . . . , n + 1.