ELL 785–Computer Communication Networks
Lecture 3
Introduction to Queueing theory
3-1
Contents
Review on Poisson process
Discrete-time Markov processes
Continuous-time Markov processes
Queueing systems
3-2
Review on Poisson process I
Properties of a Poisson process, Λ(t):
P1) Independent increments for some finite λ (arrivals/sec):
Numbers of arrivals in disjoint intervals, e.g., [t1, t2] and [t3, t4], are
independent random variables. The probability mass function of Λ(t) is
Pr[Λ(t) = k] = ((λt)^k / k!) e^{−λt} for k = 0, 1, . . .
P2) Stationary increments:
The number of events (or arrivals) in (t, t + h] is independent of t.
Using the probability generating function of Λ(t), i.e.,
E[z^Λ(t)] = Σ_{k=0}^∞ z^k Pr[Λ(t) = k] = e^{λt(z−1)},
we can write
E[z^Λ(t+h)] = E[z^Λ(t) · z^{Λ(t+h)−Λ(t)}] = E[z^Λ(t)] · E[z^{Λ(t+h)−Λ(t)}], due to P1.
⇒ E[z^{Λ(t+h)−Λ(t)}] = e^{λ(t+h)(z−1)} / e^{λt(z−1)} = e^{λh(z−1)},
which depends only on the interval length h, not on t.
3-3
Review on Poisson process II
P3) Interarrival (or inter-occurrence) times between Poisson arrivals are
exponentially distributed:
Suppose τ1, τ2, τ3, . . . are the epochs of the first, second, third, . . .
arrivals; then the interarrival times t1, t2 and t3 are given by
t1 = τ1, t2 = τ2 − τ1 and t3 = τ3 − τ2; generally, tn = τn − τn−1
with τ0 = 0.
1. For t1, we have Pr[t1 ≥ t] = Pr[Λ(t) = 0] = e^{−λt} for t ≥ 0, which
means that t1 is exponentially distributed with mean 1/λ.
2. For t2, we get
Pr[t2 > t | t1 = x] = Pr[Λ(t + x) − Λ(x) = 0] = Pr[Λ(t) = 0] = e^{−λt},
which also means that t2 is independent of t1 and has the same
distribution as t1. Similarly t3, t4, . . . are iid.
3-4
Review on Poisson process III
P4) The converse of P3 is true:
If the sequence of interarrival times {ti} is iid rv's with exponential
density λe^{−λt}, t ≥ 0, then the number of arrivals in the interval
[0, t], Λ(t), is a Poisson process.
Let Y denote the sum of j independent rv's with this exponential density;
then Y is Erlang-j distributed, fY(y) = λ(λy)^{j−1} e^{−λy} / (j−1)!:
Pr[Λ(t) = j] = ∫_0^t Pr[0 arrivals in (y, t] | Y = y] fY(y) dy
 = ∫_0^t e^{−λ(t−y)} · fY(y) dy = (λt)^j e^{−λt} / j!
3-5
Review on Poisson process IV
P5) For a short interval, the probability that an arrival occurs in an
interval is proportional to the interval size, i.e.,
lim_{h→0} Pr[Λ(h) = 1]/h = lim_{h→0} e^{−λh}(λh)/h = λ.
Or, we have Pr[Λ(h) = 1] = λh + o(h), where lim_{h→0} o(h)/h = 0
P6) The probability of two or more arrivals in an interval of length h
gets small as h → 0. For every t ≥ 0,
lim_{h→0} Pr[Λ(h) ≥ 2]/h = lim_{h→0} (1 − e^{−λh} − λh e^{−λh})/h = 0
3-6
Review on Poisson process V
P7) Merging: If the Λi(t)'s are mutually independent Poisson processes
with rates λi, the superposition process Λ(t) = Σ_{i=1}^k Λi(t) is
a Poisson process with rate λ = Σ_{i=1}^k λi
Note: If the interarrival times of the ith stream are a sequence of
iid rv's but not necessarily exponentially distributed, then Λ(t)
tends to a Poisson process as k → ∞. [D. Cox, Renewal Theory]
[Figure: k streams merging into one link; one stream splitting into branches]
P8) Splitting: If an arrival randomly chooses the ith branch with
probability πi, the arrival process at the ith branch, Λi(t), is
Poisson with rate λi(= πiλ). Moreover, Λi(t) is independent of
Λj(t) for any pair i ≠ j.
3-7
Poisson approximation to Binomial distribution
• The Poisson distribution approximates the binomial probabilities
– If n is large and p is small, keeping G = np fixed,
pk = C(n, k) p^k (1 − p)^{n−k} ≈ (G^k / k!) e^{−G} for k = 0, 1, . . .
– The probability that no events occur in n trials:
p0 = (1 − p)^n = (1 − G/n)^n → e^{−G} as n → ∞
– The remaining probabilities follow from the ratio (with q = 1 − p):
p_{k+1}/p_k = [C(n, k+1) p^{k+1} q^{n−k−1}] / [C(n, k) p^k q^{n−k}]
 = k!(n − k)! p / [(k + 1)!(n − k − 1)! q]
 = (n − k)p / ((k + 1)q)
 = (1 − k/n)G / ((k + 1)(1 − G/n)) → G/(k + 1) as n → ∞ ⇒ pk = (G^k/k!) e^{−G}
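As a quick numerical check, this approximation can be compared in a few lines of MATLAB (a minimal sketch; binopdf/poisspdf assume the Statistics Toolbox, which the simulation code later in these slides also uses for exprnd):
% Compare the binomial pmf with its Poisson approximation (G = n*p fixed)
n = 1000; p = 0.005; G = n*p;    % large n, small p
k = 0:15;
pb = binopdf(k, n, p);           % exact binomial probabilities
pp = poisspdf(k, G);             % Poisson approximation with mean G
disp(max(abs(pb - pp)))          % the maximum absolute error is tiny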
3-8
Random (or Stochastic) Processes
General notion
• Suppose a random experiment specified by the outcomes ζ from
some sample space S, and ζ ∈ S
• A random process (or stochastic process) is a mapping from ζ to a
function of time t: X(t, ζ)
– For fixed t, e.g., t1, t2, . . . : X(ti, ζ) is a random variable
– For fixed ζ: X(t, ζi) is a sample path or realization
[Figure: sample paths plotted against time 0, 1, 2, . . . , n, n+1, n+2, . . .]
– e.g., # of frames in a transmitter's queue, # of rickshaws at the IIT
main gate
3-9
Discrete-time Markov process I
A sequence of integer-valued random variables, Xn, n = 0, 1, . . . ,
is called a discrete-time Markov process
if the following Markov property holds:
Pr[Xn+1 = j | Xn = i, Xn−1 = i_{n−1}, . . . , X0 = i_0]
 = Pr[Xn+1 = j | Xn = i]
• State: the value of Xn at time n in the set S
• State space: the set S = {n | n = 0, 1, . . .}
– An integer-valued Markov process is called a Markov chain (MC)
Time-homogeneous, if for any n,
pij = Pr[Xn+1 = j | Xn = i] (independent of time n),
which is called the one-step (state) transition probability
3-10
Discrete-time Markov process II
State transition probability matrix:
P = [pij] =
[ p00 p01 p02 · · · ]
[ p10 p11 p12 · · · ]
[ · · ·             ]
[ pi0 pi1 pi2 · · · ]
[ · · ·             ]
which is called a stochastic matrix, with pij ≥ 0 and Σ_{j=0}^∞ pij = 1
n-step transition probability matrix:
P^n = [p^(n)_ij] with n-step transition probabilities
p^(n)_ij = Pr[X_{l+n} = j | X_l = i] for n ≥ 0 and i, j ≥ 0.
3-11
Discrete-time Markov process III
The Chapman-Kolmogorov equations:
p^(n+m)_ij = Σ_{k=0}^∞ p^(n)_ik p^(m)_kj for n, m ≥ 0, i, j ∈ S
Proof:
Pr[X_{n+m} = j | X0 = i]
 = Σ_{k∈S} Pr[X_{n+m} = j | X0 = i, Xn = k] Pr[Xn = k | X0 = i]
(Markov property) = Σ_{k∈S} Pr[X_{n+m} = j | Xn = k] Pr[Xn = k | X0 = i]
(Time homogeneity) = Σ_{k∈S} Pr[Xm = j | X0 = k] Pr[Xn = k | X0 = i]
In matrix form, P^{n+m} = P^n P^m ⇒ P^{n+1} = P^n P
3-12
Discrete-time Markov process IV
A mouse in a maze
[Figure: a 3×3 maze with cells numbered 1–3 (top row), 4–6 (middle row),
7–9 (bottom row); cells 7 and 9 are absorbing — one holds the cat, the
other the cheese]
• A mouse chooses the next cell to visit with
probability 1/k, where k is the number of
adjacent cells.
• The mouse does not move any more once it is
caught by the cat or it has the cheese.
P = (rows and columns indexed by cells 1–9)
      1    2    3    4    5    6    7    8    9
1  [  0   1/2   0   1/2   0    0    0    0    0  ]
2  [ 1/3   0   1/3   0   1/3   0    0    0    0  ]
3  [  0   1/2   0    0    0   1/2   0    0    0  ]
4  [ 1/3   0    0    0   1/3   0   1/3   0    0  ]
5  [  0   1/4   0   1/4   0   1/4   0   1/4   0  ]
6  [  0    0   1/3   0   1/3   0    0    0   1/3 ]
7  [  0    0    0    0    0    0    1    0    0  ]
8  [  0    0    0    0   1/3   0   1/3   0   1/3 ]
9  [  0    0    0    0    0    0    0    0    1  ]
3-13
Discrete-time Markov process V
In a place, the weather each day is classified as sunny, cloudy or rainy. The
next day’s weather depends only on the weather of the present day and not
on the weather of the previous days. If the present day is sunny, the next
day will be sunny, cloudy or rainy with respective probabilities 0.70, 0.10
and 0.20. The transition probabilities are 0.50, 0.25 and 0.25 when the
present day is cloudy; 0.40, 0.30 and 0.30 when the present day is rainy.
0.7
0.2
0.1
Sunny Cloudy Rainy
0.5
0.25
0.25
0.4
0.3
0.3
P =
     S     C     R
S [ 0.7   0.1   0.2  ]
C [ 0.5   0.25  0.25 ]
R [ 0.4   0.3   0.3  ]
– Using the n-step transition probability matrix,
P^3 =
[ 0.601 0.168 0.230 ]
[ 0.596 0.175 0.233 ]
[ 0.585 0.179 0.234 ]
and
P^12 =
[ 0.596 0.172 0.231 ]
[ 0.596 0.172 0.231 ]
[ 0.596 0.172 0.231 ]
= P^13
3-14
Discrete-time Markov process VI
State probabilities at time n
– π^(n)_i = Pr[Xn = i] and π^(n) = [π^(n)_0, . . . , π^(n)_i, . . .] (row vector)
– π^(0)_i: the initial state probability
Pr[Xn = j] = Σ_{i∈S} Pr[Xn = j | X0 = i] Pr[X0 = i], i.e.,
π^(n)_j = Σ_{i∈S} p^(n)_ij π^(0)_i
– In matrix notation: π^(n) = π^(0) P^n
Limiting distribution: Given an initial prob. distribution π^(0),
π = lim_{n→∞} π^(n), with components π_j = lim_{n→∞} p^(n)_ij
– As n → ∞: π^(n) = π^(n−1) P → π = πP and π · 1 = 1
– The system reaches “equilibrium" or “steady-state"
3-15
Discrete-time Markov process VII
Stationary distribution:
– zj and z = [zj] denote the probability of being in state j and its vector:
z = z · P and z · 1 = 1
• If zj is chosen as the initial distribution, i.e., π^(0)_j = zj for all j, we
have π^(n)_j = zj for all n
• A limiting distribution, when it exists, is always a stationary
distribution, but the converse is not true:
P = [0 1; 1 0], P² = [1 0; 0 1], P³ = [0 1; 1 0], . . .
(no limit exists, yet z = [1/2, 1/2] is stationary)
Global balance equation:
π = πP ⇒ (each row) πj Σ_i pji = Σ_i πi pij
3-16
Discrete-time Markov process VIII
Back to the weather example on page 3-14
• Using πP = π, we have
π0 = 0.7π0 + 0.5π1 + 0.4π2
π1 = 0.1π0 + 0.25π1 + 0.3π2
π2 = 0.2π0 + 0.25π1 + 0.3π2
– Note that one equation is always redundant
• Using 1 = π0 + π1 + π2, we have
[  0.3  −0.5  −0.4 ] [ π0 ]   [ 0 ]
[ −0.1  0.75  −0.3 ] [ π1 ] = [ 0 ]
[  1     1     1   ] [ π2 ]   [ 1 ]
π0 = 0.596, π1 = 0.1722, π2 = 0.2318
3-17
Discrete-time Markov process IX
Classes of states:
• State j is accessible from state i if p^(n)_ij > 0 for some n
• States i and j communicate if they are accessible to each other
• Two states belong to the same class if they communicate with
each other
• An MC having a single class is said to be irreducible
[Figure: a transition diagram on states 0, 1, 2, 3 illustrating communicating classes]
Recurrence property
• State j is recurrent if Σ_{n=1}^∞ p^(n)_jj = ∞
– Positive recurrent if πj > 0
– Null recurrent if πj = 0
• State j is transient if Σ_{n=1}^∞ p^(n)_jj < ∞
3-18
Discrete-time Markov process X
Periodicity and aperiodic:
• State i has period d if
p^(n)_ii = 0 when n is not a multiple of d,
where d is the largest integer with this property.
• State i is aperiodic if it has period d = 1.
• All states in a class have the same period
– An irreducible Markov chain is said to be aperiodic if the states
in its single class have period one
[State classification: a state is either transient or recurrent; a recurrent
state is either null recurrent or positive recurrent, and either periodic or
aperiodic; a positive recurrent, aperiodic state is called ergodic]
3-19
Discrete-time Markov process XI
In a place, a mosquito is produced every hour with prob. p, and dies
with prob. 1 − p
[State transition diagram: birth-death chain on states 0, 1, 2, 3, . . .]
• Using global balance equations,
p πi = (1 − p) πi+1 → πi+1 = (p/(1 − p)) πi = (p/(1 − p))^{i+1} π0
All states are positive recurrent if p < 1/2, null recurrent if
p = 1/2 (see Σ_{i=0}^∞ πi = 1), and transient if p > 1/2
3-20
Drift and Stability I
Suppose an irreducible, aperiodic, discrete-time MC
• The chain is ‘stable’ if πj > 0 for all j
• Drift is defined as
Di = E[Xn+1 − Xn | Xn = i] = Σ_{k=−i}^∞ k p_{i(i+k)}
– If Di > 0, the process tends to drift up to higher states from state i
– If Di < 0, the process tends to drift down to lower states from state i
– In the previous slide, Di = 1 · p − 1 · (1 − p) = 2p − 1
Pakes' lemma: Suppose
1) Di < ∞ for all i
2) for some scalar δ > 0 and integer i0 ≥ 0,
Di ≤ −δ for all i > i0
Then, the MC has a stationary distribution
3-21
Drift and Stability II
Proof: Let β = max_{i≤i0} Di (see page 264 in the textbook).
E[Xn | X0 = i] − i = E[Xn − Xn−1 + Xn−1 − Xn−2 + · · · + X1 − X0 | X0 = i]
 = Σ_{k=1}^n E[Xk − Xk−1 | X0 = i]
 = Σ_{k=1}^n Σ_{j=0}^∞ E[Xk − Xk−1 | Xk−1 = j] Pr[Xk−1 = j | X0 = i]
 ≤ Σ_{k=1}^n { β Σ_{j=0}^{i0} Pr[Xk−1 = j | X0 = i]
   − δ (1 − Σ_{j=0}^{i0} Pr[Xk−1 = j | X0 = i]) }
 = (β + δ) Σ_{k=1}^n Σ_{j=0}^{i0} Pr[Xk−1 = j | X0 = i] − nδ
3-22
Drift and Stability III
from which we can get
0 ≤ E[Xn | X0 = i] ≤ n { (β + δ) Σ_{j=0}^{i0} (1/n) Σ_{k=1}^n Pr[Xk−1 = j | X0 = i] − δ } + i,
where Pr[Xk−1 = j | X0 = i] = p^(k−1)_ij.
Dividing by n and letting n → ∞ yields
0 ≤ (β + δ) Σ_{j=0}^{i0} πj − δ
– πj = lim_{n→∞} (1/n) Σ_{k=1}^n p^(k)_ij (Cesàro limit) = lim_{n→∞} p^(n)_ij
Hence for some j ∈ {0, . . . , i0}, we have πj > 0
3-23
Computational methods I
Infinite-state MC
• Probability generating function (PGF) of a probability distribution:
G(z) = E[z^X] = Σ_{i=0}^∞ z^i Pr[X = i] (here Pr[X = i] = πi)
– G(1) = 1
– dG(z)/dz |_{z=1} = E[X], and d²G(z)/dz² |_{z=1} = E[X²] − E[X]
• In the mosquito example on slide 3-20,
G(z) = π0 Σ_{i=0}^∞ (pz/(1 − p))^i = π0 (1 − p)/((1 − p) − zp)
Finite-state MC
• Direct method
– π = πP → π(P − I) = 0 and π · 1 = 1 → π · E = 1,
where E is a matrix of all ones
π(P + E − I) = 1 ⇒ π = 1 · (P + E − I)^{−1}
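As a minimal MATLAB sketch of the direct method, applied to the weather chain of slide 3-14:
% Direct method: solve pi*(P + E - I) = 1 (a row vector of ones)
P = [0.7 0.1 0.2; 0.5 0.25 0.25; 0.4 0.3 0.3];  % weather transition matrix
N = size(P,1);
E = ones(N);                                    % matrix of all ones
pi_vec = ones(1,N) / (P + E - eye(N));          % mrdivide solves x*A = b
disp(pi_vec)                                    % approx [0.596 0.172 0.231]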
3-24
Computational methods II
• Successive overrelaxation
πi = Σ_{k=0}^N pki πk −→ πi = Σ_{k=0, k≠i}^N aki πk, where aki = pki/(1 − pii)
1. Choose π^(1)_i such that Σ_{i=0}^N π^(1)_i = 1, and 0 ≤ ω ≤ 2
2. For each iteration k, compute
π^(k)_i = (1 − ω) π^(k−1)_i + ω [ Σ_{j=0}^{i−1} aji π^(k)_j + Σ_{j=i+1}^N aji π^(k−1)_j ]
For ω = 1, this iteration becomes the Gauss-Seidel method
3. Check whether
Σ_{i=0}^N |π^(k)_i − π^(k−1)_i| ≤ ε Σ_{i=0}^N π^(k)_i
for a small tolerance ε; if satisfied, go to Step 4. Otherwise go to Step 2.
4. Compute the state probabilities as
π*_i = π^(k)_i / Σ_{j=0}^N π^(k)_j
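A compact MATLAB sketch of this iteration (a sketch only: the uniform initial guess and the stopping tolerance tol are illustrative choices, and states are indexed 1..N rather than 0..N):
function pi_star = sor_stationary(P, omega, tol)
% Successive overrelaxation for pi = pi*P on a finite-state MC
N = size(P,1);
pi_new = ones(1,N)/N;                     % initial guess, sums to 1
while true
    pi_old = pi_new;
    for i = 1:N
        s = pi_new*P(:,i) - pi_new(i)*P(i,i);  % sum over j ~= i of pi_j*p_ji
        pi_new(i) = (1-omega)*pi_old(i) + omega*s/(1 - P(i,i));
    end
    if sum(abs(pi_new - pi_old)) <= tol*sum(pi_new), break; end
end
pi_star = pi_new/sum(pi_new);             % Step 4: normalize
end
With omega = 1 this reduces to Gauss-Seidel; e.g., sor_stationary([0.7 0.1 0.2; 0.5 0.25 0.25; 0.4 0.3 0.3], 1, 1e-10) reproduces the weather-chain result.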
3-25
Speech model I
A Markov model for packet speech assumes that if the nth packet
contains silence, then the probability of silence in the next packet is
1 − α and the probability of speech activity is α. Similarly, if the nth
packet contains speech activity, then the probability of speech activity
in the next packet is 1 − β and the probability of silence is β.
≈
≈
talkspurt silence
A frame
time
• Find the state transition probability matrix, P
P =
             silence  talkspurt
silence   [  1 − α      α     ]
talkspurt [    β       1 − β  ]
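Solving πP = π with π0 + π1 = 1 for this two-state chain gives the closed form: the single independent balance equation is απ0 = βπ1, so
π0 (silence) = β/(α + β) and π1 (talkspurt) = α/(α + β).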
3-26
Speech model II
What if a frame is generated based on the previous Markov speech
model at a transmitter during talkspurts with probability while each
frame is successfully transmitted with probability p at each time slot?
Assume that ACK/NAK come before the beginning of the next slot
• Two-dimensional Markov chain
– {(k, i): k is # of frames and i is the source state}
– Queue length evolves as
Qn+1 = max(Qn − Tn, 0) + An
[Figure: two-dimensional state transition diagram of (queue length, source state)]
3-27
Simplified Google page rank model
A Web surfer browses pages in a five-page Web universe shown below.
The surfer selects the next page to view by selecting with equal
probability from the pages pointed to by the current page. If a page
has no outgoing link (e.g., page 2), then the surfer selects any of the
pages in the universe with equal probability. Find the probability that
the surfer views page i
[Figure: five-page Web graph; page 2 has no outgoing link]
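Since the figure's link structure does not survive extraction here, the adjacency below is purely hypothetical, chosen only to make the computation concrete (page 2 is kept dangling, as the problem states); a MATLAB sketch of the random-surfer chain:
% Random-surfer chain over 5 pages (hypothetical links, for illustration)
L = [0 1 1 1 0;    % page 1 -> 2, 3, 4 (assumed)
     0 0 0 0 0;    % page 2: no outgoing link (dangling, as in the problem)
     1 0 0 0 1;    % page 3 -> 1, 5 (assumed)
     0 0 1 0 1;    % page 4 -> 3, 5 (assumed)
     1 0 0 0 0];   % page 5 -> 1 (assumed)
P = L ./ sum(L,2);               % equal probability over out-links
P(sum(L,2)==0, :) = 1/5;         % dangling page jumps anywhere uniformly
pi_vec = ones(1,5) / (P + ones(5) - eye(5))   % direct method (slide 3-24)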
3-28
Summary of discrete-time MC (DTMC)
Stochastic modeling with a DTMC
Inspect whether Xn is a Markov process or not
Build a state transition probability matrix, P (or diagram)
Examine under what condition Xn is stable
– e.g., use the drift for n ≥ n ,
Dn(= expected input rate - expected output rate) < 0
Solve π · P = π and π · 1 = 1 or πj = i πipij
Compute other metrics with π, when necessary
3-29
Continuous-time Markov process I
Memoryless property of the exponential dist., F(x) = 1 − e^{−µx} for x ≥ 0:
Pr[X ≤ x0 + x | X > x0] = Pr[x0 < X ≤ x0 + x] / Pr[X > x0]
 = (Pr[X ≤ x0 + x] − Pr[X ≤ x0]) / (1 − Pr[X ≤ x0])
 = 1 − e^{−µx} = Pr[X ≤ x],
which is not a function of x0.
Example: Suppose X is the length of a telephone conversation,
exponentially distributed with mean 5 minutes (µ = 1/5). Given that the
conversation has been going on for 20 minutes (x0 = 20), the probability
that it continues for at most 10 more minutes (x = 10) is
Pr[X ≤ x0 + x | X > x0] = Pr[X ≤ x] = 1 − e^{−10/5}.
In fact, because of this property, we shall see that the Markovian queueing
system can be completely specified by the number of customers in the
system. A similar result holds for the geometric distribution.
3-30
Continuous-time Markov process II
A stochastic process is called continuous-time MC if it satisfies
Pr[X(tk+1) = xk+1|X(tk) = xk, X(tk−1) = xk−1, . . . , X(t1) = x1]
= Pr[X(tk+1) = xk+1|X(tk) = xk]
X(t) is a time-homogeneous continuous-time MC if
Pr[X(t + s) = j | X(s) = i] = pij(t) (independent of s),
which is analogous to pij in a discrete-time MC
[Figure: a sample path of a continuous-time MC through states 1, 2, 3, 4,
showing the sojourn time in each state and the instants of state change]
3-31
Continuous-time Markov process III
State occupancy time
– Let Ti be the sojourn (or occupancy) time of X(t) in state i before
making a transition to any other state. For all s ≥ 0 and t ≥ 0, due
to Markovian property of this process,
Pr[Ti > s + t|Ti > s] = Pr[Ti > t].
Only exponential dist. satisfies this property, i.e., Pr[Ti > t] = e−vi t
.
State transition rate
qii(δ) = Pr[the process remains in state i during δ sec]
 = Pr[Ti > δ] = e^{−vi δ} = 1 − viδ/1! + (viδ)²/2! − · · · = 1 − viδ + o(δ)
Or, equivalently, the rate at which the process moves out of state i is
lim_{δ→0} (1 − qii(δ))/δ = lim_{δ→0} (viδ + o(δ))/δ = vi
3-32
Continuous-time Markov Process IV
Comparison between discrete- and continuous time MC
[Figure: sample paths comparing a discrete-time Markov process (state
changes at unit time steps) and a continuous-time Markov process
(exponentially distributed sojourn times, state changes at random instants)]
3-33
Continuous-time Markov Process V
A discrete-time MC is embedded in a continuous-time MC.
[Figure: a sample path of a continuous-time MC visiting states 1, 2, 3, 4]
Each time a state, say i, is entered, an exponentially distributed state
occupancy time is selected. When the time is up, the next state j is
selected according to transition probabilities, ˜pij
When the process enters state j from state i,
qij(δ) = (1 − qii(δ)) p̃ij = vi p̃ij δ + o(δ) = γij δ + o(δ),
where γij = lim_{δ→0} qij(δ)/δ = vi p̃ij, i.e., the transition rate from state i to j.
3-34
Continuous-time Markov process VI
State probabilities πj(t) = Pr[X(t) = j]. For δ > 0,
πj(t + δ) = Pr[X(t + δ) = j]
 = Σ_i Pr[X(t + δ) = j | X(t) = i] Pr[X(t) = i], where Pr[X(t + δ) = j | X(t) = i] = qij(δ)
 = Σ_i qij(δ) πi(t) ⇐⇒ πi = Σ_j pji πj (DTMC analogue)
[Figure: transitions into state j from neighboring states over (t, t + δ)]
3-35
Continuous-time Markov process VII
Subtracting πj(t) from both sides,
πj(t + δ) − πj(t) = Σ_i qij(δ) πi(t) − πj(t)
 = Σ_{i≠j} qij(δ) πi(t) + (qjj(δ) − 1) πj(t)
Dividing both sides by δ and letting δ → 0,
lim_{δ→0} (πj(t + δ) − πj(t))/δ = dπj(t)/dt = Σ_{i≠j} γij πi(t) + γjj πj(t), with γjj = −vj,
which is a form of the Chapman-Kolmogorov equations:
dπj(t)/dt = Σ_i γij πi(t)
3-36
Continuous-time Markov process VIII
A queueing system alternates between two states. In state 0, the
system is idle and waiting for a customer to arrive. This idle time is an
exponential random variable with mean 1/α. In state 1, the system is
busy servicing a customer. The time in the busy state is an exponential
random variable with mean 1/β. Find the state probabilities π0(t) and
π1(t) in terms of the initial state probabilities π0(0) and π1(0).
• γ00 = −α, γ01 = α, γ10 = β, γ11 = −β
• From dπj(t)/dt = Σ_i γij πi(t),
dπ0(t)/dt = −απ0(t) + βπ1(t)
dπ1(t)/dt = απ0(t) − βπ1(t)
• Using π0(t) + π1(t) = 1, we have
dπ0(t)/dt = −απ0(t) + β(1 − π0(t)) with π0(0) = p0
• The general solution of the above is
π0(t) = β/(α + β) + C e^{−(α+β)t} with C = p0 − β/(α + β)
3-37
Continuous-time Markov process IX
As t → ∞, the system reaches ‘equilibrium’ or ‘steady-state’
dπj(t)/dt → 0 and πj(∞) = πj, so
0 = Σ_i γij πi, or vj πj = Σ_{i≠j} γij πi (using γjj = −vj = −Σ_{k≠j} γjk),
which is called the global balance equation, together with Σ_j πj = 1.
[Figure: state transition rate diagram of a general CTMC]
3-38
Continuous-time Markov process X
In matrix form,
dπ(t)/dt = π(t)Q and π(t) · 1 = 1,
whose solution is given by
π(t) = π(0) e^{Qt}
As t → ∞, π(∞) = π = [πi],
πQ = 0 with
Q =
[ −v0  γ01  γ02  γ03 . . . ]
[ γ10  −v1  γ12  γ13 . . . ]
[ γ20  γ21  −v2  γ23 . . . ]
[ . . .                    ]
and π · 1 = 1,
where Q is called the infinitesimal generator or rate matrix.
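For the two-state idle/busy example on slide 3-37, the solution π(t) = π(0)e^{Qt} can be checked numerically in MATLAB (a sketch; α = 1 and β = 2 are illustrative values):
% Transient solution pi(t) = pi(0)*expm(Q*t) for the two-state chain
alpha = 1; beta = 2;                 % illustrative rates
Q = [-alpha alpha; beta -beta];      % infinitesimal generator
pi0 = [1 0];                         % start in the idle state (p0 = 1)
t = 1.5;
pi_t = pi0*expm(Q*t);                % matrix-exponential solution
pi_cf = beta/(alpha+beta) + (pi0(1) - beta/(alpha+beta))*exp(-(alpha+beta)*t);
disp([pi_t(1) pi_cf])                % the two values agree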
3-39
Example: Barber shop I
Customers arrive at a barber shop according to a Poisson process with rate λ.
One barber serves those customers on a first-come first-served
basis. The service time Si is exponentially distributed with mean 1/µ (sec).
The number of customers in the system, N(t) for t ≥ 0, forms a
Markov chain:
N(t + τ) = max(N(t) − B(τ), 0) + A(τ)
State transition probabilities (see properties of the Poisson process):
Pr[0 arrivals (or departures) in (t, t + δ)] = 1 − λδ + o(δ) (or 1 − µδ + o(δ))
Pr[1 arrival (or departure) in (t, t + δ)] = λδ + o(δ) (or µδ + o(δ))
Pr[more than 1 arrival (or departure) in (t, t + δ)] = o(δ)
3-40
Example: Barber shop II
Find Pn(t) ≜ Pr[N(t) = n]. For n ≥ 1,
Pn(t + δ) = Pn(t) Pr[0 arrivals & 0 departures in (t, t + δ)]
 + Pn−1(t) Pr[1 arrival & 0 departures in (t, t + δ)]
 + Pn+1(t) Pr[0 arrivals & 1 departure in (t, t + δ)] + o(δ)
 = Pn(t)(1 − λδ)(1 − µδ) + Pn−1(t)(λδ)(1 − µδ)
 + Pn+1(t)(1 − λδ)(µδ) + o(δ).
Rearranging and dividing by δ,
(Pn(t + δ) − Pn(t))/δ = −(λ + µ)Pn(t) + λPn−1(t) + µPn+1(t) + o(δ)/δ
As δ → 0, for n > 0 we have
dPn(t)/dt = −(λ + µ)Pn(t) + λPn−1(t) + µPn+1(t),
where (λ + µ) is the rate out of state n, λ the rate from state n − 1 to n,
and µ the rate from state n + 1 to n.
3-41
Example: Barber shop III
For n = 0, we have
dP0(t)/dt = −λP0(t) + µP1(t).
As t → ∞, i.e., in steady state, we have Pn(∞) = πn with dPn(t)/dt = 0:
λπ0 = µπ1
(λ + µ)πn = λπn−1 + µπn+1 for n ≥ 1.
[Figure: birth-death state transition rate diagram with rates λ up and µ down]
The solution of the above equations is (ρ = λ/µ)
πn = ρ^n π0 and 1 = π0 (1 + Σ_{i=1}^∞ ρ^i) ⇒ π0 = 1 − ρ
3-42
Example: Barber shop IV
ρ: the server's utilization (< 1, i.e., λ < µ)
Mean number of customers in the system:
E[N] = Σ_{n=0}^∞ n πn = ρ/(1 − ρ) = ρ (in server) + ρ²/(1 − ρ) (in queue)
[Figure: for an M/M/1 system with 1/µ = 1, the number of customers in the
system and the mean system response time (sec) versus ρ; simulation and
analysis agree, both growing sharply as ρ → 1]
3-43
Example: Barbershop V
Recall the state transition rate matrix Q on page 3-39: πQ = 0 with π · 1 = 1.
– What are γij and vi in the M/M/1 queue?
γij = λ if j = i + 1;  µ if j = i − 1;  −(λ + µ) if j = i;  0 otherwise.
If a and b denote an interarrival time and a service time, respectively, then
the sojourn time in state i ≥ 1 is min(a, b), which is again exponentially
distributed, so vi = λ + µ (and v0 = λ).
What are p̃i,i+1 and p̃i+1,i?
p̃i,i+1 = Pr[a < b] = λ/(λ + µ) and p̃i+1,i = Pr[b < a] = µ/(λ + µ).
3-44
Example: Barbershop VI
Distribution of the sojourn time T:
T_N = S1 + S2 + · · · + SN (the N customers ahead) + S_{N+1}
An arriving customer finds N customers in the system (including the
customer in service).
– By the memoryless property of the exponential distribution, the
remaining service time of the customer in service is also exponentially
distributed. Conditioning on N, with πi = ρ^i(1 − ρ) the probability an
arrival finds i customers,
fT(t) = Σ_{i=0}^∞ µ ((µt)^i/i!) e^{−µt} πi
 = Σ_{i=0}^∞ µ ((µt)^i/i!) e^{−µt} ρ^i (1 − ρ) = µ(1 − ρ) e^{−µ(1−ρ)t},
which can also be obtained via the Laplace transform of the distribution of Si.
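Hence T is exponentially distributed with rate µ(1 − ρ) = µ − λ. As a consistency check that anticipates Little's theorem (slide 3-59),
E[T] = 1/(µ − λ), so λE[T] = λ/(µ − λ) = ρ/(1 − ρ) = E[N].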
3-45
Barbershop simulation I
Discrete event simulation
[Flowchart: start with sim_time = 0 and generate an arrival; repeatedly
schedule the next event by comparing the next interarrival time with the
service time; an arrival (next interarrival time < service time) does
Queue = Queue + 1, a departure (service time < next interarrival time)
does Queue = Queue − 1; if the queue becomes empty, schedule a fresh
arrival; in all cases advance sim_time by the scheduled event time]
3-46
Barbershop simulation II
clear
% Define variables
global arrival departure mservice_time
arrival = 1; departure = -1; mservice_time = 1;
% Set simulation parameters
sim_length = 30000; max_queue = 1000;
% To get delay statistics
system_queue = zeros(1,max_queue);
k = 0;
for arrival_rate = 0.1:0.025:0.97
k = k + 1;
% x(k) denotes utilization
x(k) = arrival_rate*mservice_time;
% initialize
sim_time = 0; num_arrivals = 0; num_system =0; upon_arrival = 0; total_delay = 0; num_served =0;
% Assuming that queue is empty
event = arrival; event_time = exprnd(1/arrival_rate);
sim_time = sim_time + event_time;
while (sim_time < sim_length),
% If an arrival occurs,
if event == arrival
num_arrivals = num_arrivals + 1;
num_system = num_system + 1;
% Record arrival time of the customer
system_queue(num_system) = sim_time;
upon_arrival = upon_arrival + num_system;
% To see whether one new arrival comes or new departure occurs
[event, event_time] = schedule_next_event(arrival_rate);
3-47
Barbershop simulation III
% If a departure occurs,
elseif event == departure
delay_per_arrival = sim_time - system_queue(1);
system_queue(1:max_queue-1) = system_queue(2:max_queue);
total_delay = total_delay + delay_per_arrival;
num_system = num_system - 1;
num_served = num_served + 1;
if num_system == 0
% nothing to serve, schedule an arrival
event = arrival;
event_time = exprnd(1/arrival_rate);
elseif num_system > 0
% still the system has customers to serve
[event, event_time] = schedule_next_event(arrival_rate);
end
end
sim_time = sim_time + event_time;
end
ana_queue_length(k) = (x(k)/(1-x(k)));
ana_response_time(k) = 1/(1/mservice_time-arrival_rate);
% Queue length seen by arrivals (PASTA: equals the time-average E[N])
sim_queue_length(k) = upon_arrival/num_arrivals;
sim_response_time(k) = total_delay/num_served;
end
3-48
Barbershop simulation IV
function [event, event_time] = schedule_next_event(arrival_rate)
global arrival departure mservice_time
minter_arrival = 1/arrival_rate;
inter_arrival = exprnd(minter_arrival);
service_time = exprnd(mservice_time);
if inter_arrival < service_time
event = arrival;
event_time = inter_arrival;
else
event = departure;
event_time = service_time;
end
3-49
Relation between DTMC and CTMC I
Recall an embedded MC: each time a state, say i, is entered, an
exponentially distributed state occupancy time is selected. When the time
is up, the next state j is selected according to transition probabilities, ˜pij
[Figure: a continuous-time sample path through states 1, 2, 3, 4 and the
embedded jump chain]
• Ni(n): the number of times state i occurs in the first n transitions
• Ti(j): the occupancy time the jth time state i occurs
The proportion of time spent by X(t) in state i after the first n transitions:
(time spent in state i)/(time spent in all states)
 = Σ_{j=1}^{Ni(n)} Ti(j) / Σ_i Σ_{j=1}^{Ni(n)} Ti(j)
3-50
Relation between DTMC and CTMC II
As n → ∞, using πi = lim_{n→∞} Ni(n)/n we have
[ (Ni(n)/n) · (1/Ni(n)) Σ_{j=1}^{Ni(n)} Ti(j) ] / [ Σ_i (Ni(n)/n) · (1/Ni(n)) Σ_{j=1}^{Ni(n)} Ti(j) ]
 → πi E[Ti] / Σ_i πi E[Ti] = φi, with E[Ti] = 1/vi,
where πi is the unique pmf solution to
πj = Σ_i πi p̃ij and Σ_j πj = 1 (∗)
The long-term proportion of time spent in state i approaches
φi = (πi/vi) / Σ_i (πi/vi) = c πi/vi → πi = vi φi / c
Substituting πi = vi φi / c into (∗) yields
vj φj / c = (1/c) Σ_i vi φi p̃ij → vj φj = Σ_i φi vi p̃ij = Σ_i φi γij
3-51
Relation between DTMC and CTMC III
Recall M/M/1 queue
[Figure: (a) the M/M/1 CTMC on states 0, 1, 2, 3, 4, . . . ; (b) its embedded
MC: a random walk that moves up w.p. p = λ/(λ + µ), down w.p.
q = µ/(λ + µ), and jumps from 0 to 1 w.p. 1]
In the embedded MC, we have the following global balance equations:
π0 = q π1
π1 = π0 + q π2
...
πi = p πi−1 + q πi+1
⇒ π1 = π0/q and πi = (p/q) πi−1 for i ≥ 2
3-52
Relation between DTMC and CTMC IV
Using the normalization condition Σ_{i=0}^∞ πi = 1,
πi = (p/q)^{i−1} (1/q) π0 for i ≥ 1, and π0 = (1 − 2p)/(2(1 − p))
Converting the embedded MC back into the CTMC,
φ0 = (c/v0) π0 = (c/λ) π0 and φi = c πi/vi = (c/(λ + µ)) πi for i ≥ 1
To determine c,
Σ_{i=0}^∞ φi = 1 → c [ π0/λ + (1/(λ + µ)) Σ_{i=1}^∞ πi ] = 1 → c = 2λ
Finally, we get φi = ρ^i (1 − ρ) for i = 0, 1, 2, . . .
3-53
Example of using embedded MC I
A stray dog, in front of the tea shop at the central library of IIT Delhi,
spends most of the daytime sleeping around the tea shop. When a
person comes to the tea shop, the dog greets him or her and wags her
tail for an average time of one minute. At the end of this period, this
dog is fed with probability 1/4, patted briefly with probability 5/8, or
taken for a walk with probability 1/8. If fed, she spends an average of
two minutes eating. The walks take 15 minutes on average. After
eating, being patted, or walking, she returns to sleep. Assume that
people come to the tea shop on average every hour.
1. Find a Markov chain model with four states, {sleep, greet, eat,
walk}: Specify the transition probabilities and rates
2. Find the steady state probabilities that you find the dog’s state
3-54
Example of using embedded MC II
• State transition diagram:
[Figure: sleep → greet w.p. 1; greet → eat w.p. 1/4, greet → walk w.p. 1/8,
greet → sleep w.p. 5/8; eat → sleep and walk → sleep w.p. 1]
P =
      S    G    E    W
S  [  0    1    0    0  ]
G  [ 5/8   0   1/4  1/8 ]
E  [  1    0    0    0  ]
W  [  1    0    0    0  ]
• From πP = π and π · 1 = 1, we have
π0 = π1 = 8/19, π2 = 2/19, π3 = 1/19
• From φi = (πi/vi)/(Σ_k πk/vk), with mean occupancy times
(1/v0, 1/v1, 1/v2, 1/v3) = (60, 1, 2, 15) minutes, we have, e.g.,
φ0 = 60π0 / (60π0 + π1 + 2π2 + 15π3) and φ2 = 2π2 / (60π0 + π1 + 2π2 + 15π3)
3-55
Queueing systems I
The arrival times, the size of demand for service, the service capacity
and the size of waiting room may be (random) variables.
Queueing discipline: specify which customer to pick next for service.
• First come first serve (FCFS, or FIFO)
• Last come first serve (LCFS, LIFO)
• Random order, Processor sharing (PS), Round robin (RR)
• Priority (preemptive:resume, non-resume; non-preemptive)
• Shortest job first (SJF) and Longest job first (LJF)
3-56
Queueing systems II
Customer behavior: jockeying, reneging, balking, etc.
Kendall's notation: A/B/m/K/N
• A: interarrival time distribution; B: service time distribution;
m: # of servers; K: queue (system) size (default ∞); N: population size (default ∞)
For A and B:
• M: Markovian, exponential dist.
• D: Deterministic
• GI: General independent
• Ek: Erlang-k
• Hk: Mixture of k exponentials
• PH: Phase-type distribution
E.g.: M/D/2, M/M/c, G/G/1, etc.; the barbershop is an M/M/1 queue.
3-57
Queueing system III
Performance measures:
• N(t) = Nq(t) + NS(t): number in system
• Nq(t): number in queue
• NS(t): number in service
• W: waiting time in queue
• T: total time (or response time) in the system
• τ: service time
• Throughput: γ ≜ mean # of customers served per unit time
1. γ for a non-blocking system = min(λ, mµ)
2. γ for a blocking system = (1 − PB)λ, PB = blocking probability
• Utilization: ρ ≜ fraction of time the server is busy
ρ = load/capacity = lim_{T→∞} λT/(µT) = λ/µ for a single-server queue
 = lim_{T→∞} λT/(mµT) = λ/(mµ) for an m-server queue
3-58
Little’s theorem I
Any queueing system in steady state: N = λT
[Figure: cumulative number of arrivals α(t) and departures β(t) versus time;
the area between the two staircases is the total time spent in the system
by all customers]
• N: average number of customers in the system
• λ: steady-state arrival rate, which need not be Poisson
• T: average delay per customer
Proof: For a system with N(0) = 0 and N(t) = 0, as t → ∞,
N̄t = (1/t) ∫_0^t N(τ) dτ = (1/t) Σ_{i=1}^{α(t)} Ti = (α(t)/t) · (Σ_{i=1}^{α(t)} Ti)/α(t) = λt · T̄t.
If N(t) ≠ 0, we have
(β(t)/t) · (Σ_{i=1}^{β(t)} Ti)/β(t) ≤ N̄t ≤ λt T̄t.
3-59
Little’s theorem II
As an alternative, for the cumulative processes,
N(t) = α(t) − β(t), whose time average over [0, t] is N̄t
– See the variable 'num_system' in the previous Matlab code; α(t)
corresponds to 'num_arrivals' (and t to 'sim_length'), so
λt = α(t)/t
– Response time per customer from 'total_delay':
T̄t = (total time in system)/α(t) = N̄t/λt
As t → ∞, we have
N = λT, and λT = λ(W + x̄) = Nq + ρ (with x̄ the mean service time),
valid for any queue (even with any service order) as long as the limits
of λt and T̄t exist as t → ∞
3-60
Little’s theorem III
[Figures: Little's theorem applied to a finite queue (blocked arrivals
excluded) and to a network of queues]
3-61
Increasing the arrival and transmission rates by the same factor
In a packet transmission system,
• the arrival rate (packets/sec) is increased from λ to Kλ for K > 1
• the packet length distribution remains the same (exponential),
with mean 1/µ bits
• the transmission capacity (C bps) is increased by a factor of K
Performance
• The average number of packets in the system remains the same:
N = ρ/(1 − ρ) with ρ = Kλ/(µKC) = λ/(µC)
• Average delay per packet (by Little's theorem):
N = (Kλ)T → T = N/(Kλ), i.e., K times smaller
Aggregation is better: increasing a transmission line by K times can
carry K times as many packets/sec with a K times smaller average
delay per packet
3-62
Statistical multiplexing vs TDMA or FDMA
Multiplexing: m Poisson packet streams each with λ/m (packets/sec)
are transmitted over a communication link with 1/µ exponentially
distributed packet transmission time
[Figure: (a) statistical multiplexing of the m streams onto one shared link;
(b) TDMA or FDMA dividing the link into m sub-channels]
T = 1/(µ − λ) (statistical multiplexing) < T = m/(µ − λ) (TDMA or FDMA)
When do we need TDMA or FDMA?
– In a multiplexer, packet generation times overlap, so that it must
buffer and delay some of the packets
3-63
Little’s theorem: example I
Estimating throughput in a time-sharing system
[Figure 3.4 (Bertsekas & Gallager, Data Networks): N terminals connected
to a time-sharing computer system with average reflection time R and
average job processing time P; to estimate the maximum attainable
throughput, a departing user is assumed to be immediately replaced by a
new user]
Suppose a time-sharing computer system with N terminals. A user logs
into the system through a terminal and after an initial reflection period of
average length R, submit a job that requires an average processing time P
at the computer. Jobs queue up inside the computer and are served by a
single CPU according to some unspecified priority or time-sharing rule.
What is the maximum of sustainable throughput by the system?
– Assume that there is always a user ready to take the place of a departing
user, so the number of users in the system is always N
3-64
Little’s theorem: example II
The average time a user spends in the system:
T = R + D → R + P ≤ T ≤ R + NP
– D: the average delay between the time a job is submitted to the
computer and the time its execution is completed, D ∈ [P, NP]
Combining this with λ = N/T,
N/(R + NP) ≤ λ ≤ min(1/P, N/(R + P))
– the throughput is bounded by 1/P, the maximum job execution rate
[Figure 3.5 (Bertsekas & Gallager, Data Networks): (a) bounds on
attainable throughput, where the bound induced by the limited number of
terminals meets the CPU-capacity bound 1/P near N = 1 + R/P; (b) bounds
on average user delay in a fully loaded system, which grows essentially in
proportion to the number of terminals N]
3-65
Little’s theorem: example III
Using T = N/λ, we can rewrite
max{NP, R + P} ≤ T ≤ R + NP
[Figure 3.5 repeated: bounds on throughput and on average user delay
versus the number of terminals N]
3-66
Poisson Arrivals See Time Average (PASTA) theorem I
Suppose a random process which spends its time in different states Ej
In equilibrium, we can associate with each state Ej two different
probabilities
• The probability of the state as seen by an outside random observer
– πj: prob. that the system is in the state Ej at a random instant
• The probability of the state seen by an arriving customer
– π∗j: prob. that the system is in the state Ej just before a
(randomly chosen) arrival
In general, we have πj ≠ π∗j.
When the arrival process is Poisson, however, we have
πj = π∗j
3-67
PASTA theorem II
For a stochastic process N ≡ {N(t), t ≥ 0} and an arbitrary set of states B:
U(t) = 1 if N(t) ∈ B, and 0 otherwise ⇒ V(t) = (1/t) ∫_0^t U(τ) dτ.
For a Poisson arrival process A(t),
Y(t) = ∫_0^t U(τ) dA(τ) ⇒ Z(t) = Y(t)/A(t)
Lack of Anticipation Assumption (LAA): For each t ≥ 0,
{A(t + u) − A(t), u ≥ 0} and {U(s), 0 ≤ s ≤ t} are independent:
future interarrival times and the service times of previously arrived
customers are independent.
Under LAA, as t → ∞, PASTA ensures
V(t) → V(∞) w.p. 1 if Z(t) → V(∞) w.p. 1
3-68
PASTA theorem
Proof:
• For sufficiently large n, Y(t) is approximated as
Yn(t) = Σ_{k=0}^{n−1} U(kt/n) [A((k + 1)t/n) − A(kt/n)]
• LAA decouples the expectation (each Poisson increment has mean λt/n):
E[Yn(t)] = λt E[ Σ_{k=0}^{n−1} U(kt/n)/n ]
• As n → ∞, if |Yn(t)| is bounded,
lim_{n→∞} E[Yn(t)] = E[Y(t)] = λt E[V(t)] = λ E[∫_0^t U(τ) dτ].
That is, the expected number of arrivals who find the system in state B
equals the arrival rate times the expected length of time it is there.
3-69
Systems where PASTA does not hold
Ex1) D/D/1 queue
• Deterministic arrivals every 10 msec
• Deterministic service times of 9 msec
[Figure: a sample path of the D/D/1 queue; arrivals at t = 0, 10, 20, . . .
each depart 9 msec later]
• Arrivals always find the system empty.
• Yet the server is busy a fraction 0.9 of the time.
Ex2) LAA violated: the service time of a current customer depends on
the interarrival time of a future customer
• Your own PC (one customer, one server)
• Your own PC is always free when you need it: π∗0 = 1
• π0 = proportion of time the PC is free (< 1)
3-70
M/M/1/K I
M/M/1/K: the system can accommodate K customers
[Figure: birth-death chain on states 0, 1, . . . , K; waiting customers plus
one in service]
• State balance equations:
λπ0 = µπ1
(λ + µ)πi = λπi−1 + µπi+1 for 1 ≤ i ≤ K − 1, and µπK = λπK−1
After rearranging, we have
λπi−1 = µπi for 1 ≤ i ≤ K
• For i ∈ {0, 1, . . . , K}, the steady-state probabilities are
πn = ρ^n π0 and Σ_{n=0}^K πn = 1 ⇒ π0 = (1 − ρ)/(1 − ρ^{K+1})
3-71
M/M/1/K II
• πK: the probability that an arriving customer finds the system full.
By PASTA, this is the blocking probability:
πK = ρ^K (1 − ρ)/(1 − ρ^{K+1})
• Blocking probability in simulation:
PB = (total # of arrivals blocked at their arrival instants) / (total # of arrivals at the system)
[Figure: PB versus ρ for K = 5 and K = 10; analysis and simulation coincide]
3-72
M/M/1/K Simulation I
clear
% Define variables
global arrival departure mservice_time
arrival = 1; departure = -1; mservice_time = 1;
% Define simulation parameters
sim_length = 30000; K = 10; system_queue = zeros(1,K);
k = 0; max_iter = 5;
for arrival_rate = 0.1:0.025:0.97
k = k + 1;
x(k) = arrival_rate*mservice_time;
for iter = 1:max_iter
% initialize (statistics must be reset for each independent run)
sim_time=0; num_arrivals=0; num_system=0; upon_arrival=0; total_delay=0; num_served=0; dropped=0;
% Assuming that the queue is empty, schedule the first arrival
event = arrival; event_time = exprnd(1/arrival_rate);
sim_time = sim_time + event_time;
while (sim_time < sim_length),
% If an arrival occurs,
if event == arrival
num_arrivals = num_arrivals + 1;
if num_system == K
dropped = dropped + 1;
else
num_system = num_system + 1;
system_queue(num_system) = sim_time;
upon_arrival = upon_arrival + num_system;
end
% To see whether one new arrival comes or new departure occurs
[event, event_time] = schedule_next_event(arrival_rate);
3-73
M/M/1/K Simulation II
% If a departure occurs,
elseif event == departure
delay_per_arrival = sim_time - system_queue(1);
system_queue(1:K-1) = system_queue(2:K);
total_delay = total_delay + delay_per_arrival;
num_system = num_system - 1;
num_served = num_served + 1;
if num_system == 0
% nothing to serve, schedule an arrival
event = arrival;
event_time = exprnd(1/arrival_rate);
elseif num_system > 0
% still the system has customers to serve
[event, event_time] = schedule_next_event(arrival_rate);
end
end
sim_time = sim_time + event_time;
end
Pd_iter(iter)=dropped/num_arrivals;
end
piK(k) = x(k)^K*(1-x(k))./(1-x(k)^(K+1));
Pd(k) = mean(Pd_iter);
end
%%%%%%%%%%%
%% use the previous schedule_next_event function
3-74
M/M/m queue I
M/M/m: there are m parallel servers, whose service times are
exponentially distributed with mean 1/µ.
[Figure: state transition rate diagram of M/M/m; the departure rate is nµ
in state n ≤ m and mµ beyond]
When m servers are busy, the time until the next departure, X, is
X = min(τ1, τ2, . . . , τm) ⇒ Pr[X > t] = Pr[min(τ1, τ2, . . . , τm) > t]
 = Π_{i=1}^m Pr[τi > t] = e^{−mµt} (i.i.d.)
Global balance equations:
λπ0 = µπ1
(λ + min(n, m)µ)πn = λπn−1 + min(n + 1, m)µπn+1 for n ≥ 1
3-75
M/M/m queue II
The previous global balance equations can be rewritten as
λπn−1 = min(n, m)µπn for n ≥ 1
Using a = λ/µ and ρ = λ/(mµ),
πn = (a^n/n!) π0 for n ≤ m, and πn = ρ^{n−m} (a^m/m!) π0 for n ≥ m
From the normalization condition, π0 is obtained:
1 = Σ_{i=0}^∞ πi = π0 [ Σ_{i=0}^{m−1} a^i/i! + (a^m/m!) Σ_{i=m}^∞ ρ^{i−m} ]
Erlang C formula, C(m, a):
C(m, a) = Pr[W > 0] = Pr[N ≥ m] = Σ_{i=m}^∞ πi = ((mρ)^m/m!) · π0/(1 − ρ)
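These formulas translate directly into a few lines of MATLAB (direct summation; adequate for moderate m):
function C = erlang_c(m, a)
% Erlang C: probability that an arrival must wait in M/M/m (a = lambda/mu)
rho = a/m;                           % requires rho < 1
k = 0:m-1;
p0 = 1/(sum(a.^k./factorial(k)) + (a^m/factorial(m))/(1-rho));
C = (a^m/factorial(m))*p0/(1-rho);
end
For example, erlang_c(3, 2) returns 4/9: with three servers and two Erlangs offered, an arrival waits with probability about 0.444.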
3-76
M/M/c/c I
c servers, and only c customers can be accommodated
[Figure: birth-death chain on states 0, 1, . . . , c with death rate nµ in state n]
The balance equations are (a = λ/µ, the offered traffic in Erlangs):
λπn−1 = nµπn ⇒ πn = (a/n) πn−1 = (a^n/n!) π0
Using Σ_{n=0}^c πn = 1, we have
πn = (a^n/n!) / Σ_{i=0}^c (a^i/i!)
Erlang B formula: B(c, a) = πc
– valid for the M/G/c/c system; note that the result depends only on the
mean of the service time distribution
3-77
M/M/c/c II
Erlang capacity: Telephone systems with c channels
[Figures: B(c, a) versus offered traffic intensity a on log scales, for
c = 1, 2, . . . , 10 and for c = 10, 20, . . . , 100; and PB versus a for c = 3
and c = 5, where analysis and simulation coincide]
3-78
M/M/c/c Simulation I
clear
global arrival departure mservice_time
arrival = 1; departure = -1; mservice_time = 1;
sim_length = 50000; n_iter = 5;
K = 5; % number of servers
k = 0;
for arrival_rate = 0.05:0.025:0.95
k = k + 1;
for iter = 1:n_iter
sim_time = 0;
num_busy_servers =0;
block = 0;
num_arrival = 0;
event = arrival;
event_time = exprnd(1/arrival_rate);
while (sim_time < sim_length),
if event == arrival
num_arrival = num_arrival + 1;
%% All servers are working,
if num_busy_servers == K
block = block + 1;
end
%% increase the number of busy servers by 1
num_busy_servers = min(K,num_busy_servers + 1);
[event, event_time] = schedule_next_event_multi(arrival_rate,num_busy_servers);
3-79
M/M/c/c Simulation II
elseif event == departure
num_busy_servers = num_busy_servers - 1;
if num_busy_servers == 0
event = arrival; event_time = exprnd(1/arrival_rate);
else
[event, event_time] = schedule_next_event_multi(arrival_rate,num_busy_servers);
end
end
sim_time = sim_time + event_time;
end
simb(iter) = block/num_arrival;
end
rho = arrival_rate/mservice_time; x(k) = rho;
anab(k) = (rho^K/factorial(K))/(sum((rho.^[0:K])./factorial([0:K])));
Pb(k) = mean(simb);
end
%%%%%%%%%%%% A new function starts here %%%%%%%%%%%%
function [event, event_time] = schedule_next_event_multi(arrival_rate,num_busy_servers)
global arrival departure mservice_time
inter_arrival = exprnd(1/arrival_rate);
multi_service = exprnd(mservice_time,[1 num_busy_servers]);
service_time = min(multi_service);
if inter_arrival < service_time
event = arrival; event_time = inter_arrival;
else
event = departure;
event_time = service_time;
end
3-80
Example: a system with blocking I
In Select-city shopping mall, customers arrive at its underground
parking lot according to a Poisson process with a rate of 60 cars
per hour. Parking time follows a Weibull distribution with mean 2.5
hours and the parking lot can accommodate 150 cars. When the
parking lot is full, an arriving customer has to park his car somewhere
else. Find the fraction of customers finding all places occupied upon
arrival
[Figure: two densities with the same mean (2.5 hours):
Weibull (α = 2.7228, k = 5): f(x) = (k/α)(x/α)^{k−1} e^{−(x/α)^k}
Exponential: f(x) = (1/α) e^{−x/α}]
– Mean of the Weibull distribution: αΓ(1 + 1/k), where
Γ(x) = ∫_0^∞ t^{x−1} e^{−t} dt is called the gamma function
3-81
Example: a system with blocking II
• c = 150 and a = λ/µ = 60 × 2.5 = 150
B(c, a)|_{c=150, a=150} = (a^c/c!) / Σ_{i=0}^c (a^i/i!)
• Divide the numerator and denominator by Σ_{n=0}^{c−1} a^n/n!:
B(c, a) = (a^c/c!) / (Σ_{i=0}^{c−1} a^i/i! + a^c/c!)
 = [(a^c/c!)/Σ_{n=0}^{c−1} a^n/n!] / [1 + (a^c/c!)/Σ_{n=0}^{c−1} a^n/n!]
 = (a/c)B(c − 1, a) / (1 + (a/c)B(c − 1, a))
 = aB(c − 1, a) / (c + aB(c − 1, a))
with B(0, a) = 1
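The recursion is numerically stable even for c = 150, where the direct form with 150! would overflow; a MATLAB sketch:
function B = erlang_b(c, a)
% Erlang B via the recursion B(c,a) = a*B(c-1,a)/(c + a*B(c-1,a))
B = 1;                 % B(0, a) = 1
for n = 1:c
    B = a*B/(n + a*B);
end
end
Calling erlang_b(150, 150) gives the fraction of customers who find the parking lot full.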
3-82
Finite source population: M/M/C/C/K system I
Consider the loss system (no waiting places) in the case where the
arrivals originate from a finite population of sources: the total number
of customers is K
[Figure: K sources, each alternating between thinking and calling, offered
to C channels]
• The time to the next call attempt by a customer, so called thinking
time (idle time) of the customer obeys an exponential distribution
with mean 1/λ (sec)
• Blocked calls are lost
- does not lead to reattempts; starts a new thinking time, again.
The time to the next attempt is also the same exponential
distribution with 1/λ
- the call holding time is exponentially distributed with 1/µ
3-83
M/M/C/C/K system II
If C ≥ K, each customer has its own server, i.e., there is no blocking.
• Each user alternates between two states: active with mean 1/µ and
idle with mean 1/λ
• The probability that a user is idle or active is
π0 = (1/λ)/(1/λ + 1/µ) and π1 = (1/µ)/(1/λ + 1/µ)
• Call arrival rate: π0λ; offered load: π1 = a/(1 + a), with a = λ/µ
If C < K, this system can be described as a birth-death chain:
[Figure: birth-death chain with birth rate (K − i)λ and death rate iµ in state i]
((K − i)λ + iµ)πi = (K − i + 1)λ πi−1 + (i + 1)µ πi+1
3-84
M/M/C/C/K system III
• For j = 1, 2, . . . , C, we have
(K − j + 1)λ πj−1 = jµ πj ⇒ πj = [K!/(j!(K − j)!)] a^j π0.
• Applying Σ_{j=0}^C πj = 1,
πj = [K!/(j!(K − j)!)] a^j / Σ_{k=0}^C [K!/(k!(K − k)!)] a^k
Time blocking (or congestion): the proportion of time the system
spends in the state C; the equilibrium probability of the state C is
PB = πC
– The probability of all resources being busy in a given observational period
– Insensitivity: Like Erlang B formula, this result is insensitive to the form
of the holding time distribution (though the derivation above was explicitly
based on the assumption of exponential holding time distribution)
3-85
M/M/C/C/K system IV
Call blocking: the probability that an arriving call is blocked, i.e., PL
• Arrival rate is state-dependent, i.e., (K − N(t))λ: Not Poisson.
• PASTA does not hold: Time blocking, PB can’t represent PL
• λT: the average call arrival rate,
λT ∝ Σ_{i=0}^C (K − i)λ πi
– PL: the probability that a call finds the system blocked
– If λT = 10000 and PL = 0.01, then λT PL = 100 calls are lost
• λC: the call arrival rate when the system is blocked,
λC ∝ (K − C)λ
– PB λC: blocked calls at the arrival instants
PL λT = PB λC
– Among all arrivals, those that find the system blocked must equal
the call arrivals that see the busy system 3-86
M/M/C/C/K system V
• Call blocking PL can be obtained by
PL λT = PB λC → PL = (λC/λT) PB ≤ PB
• Engset formula:
PL(K) = (K − C)λ πC / Σ_{i=0}^C (K − i)λ πi
 = (K − C)[K!/(C!(K − C)!)] a^C / Σ_{i=0}^C (K − i)[K!/(i!(K − i)!)] a^i
 = [(K − 1)!/(C!(K − 1 − C)!)] a^C / Σ_{i=0}^C [(K − 1)!/(i!(K − 1 − i)!)] a^i
– The state distribution seen by an arriving customer is the same as the
equilibrium distribution in a system with one less customer. It is as if the
arriving customer were an "outside observer"
– PL(K) = PB(K − 1): as K → ∞, PL → PB
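The Engset formula is easy to evaluate with binomial weights; a MATLAB sketch (assuming a = λ/µ per source, as above):
function PL = engset(K, C, a)
% Engset call blocking: PL(K) = PB(K-1), binomial weights over states 0..C
i = 0:C;
w = arrayfun(@(j) nchoosek(K-1, j), i).*a.^i;
PL = w(end)/sum(w);
end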
3-87
Probability generating function
For a discrete random variable X with gk = Pr[X = k], the PGF is
defined as
G(z) = E[z^X] = Σ_k z^k gk,
where z is a complex variable, and gk = (1/k!) d^k G(z)/dz^k at z = 0.
For |z| ≤ 1, G(z) is convergent:
|G(z)| ≤ Σ_k |z^k| gk ≤ Σ_k gk = 1
G(z) is analytic for |z| < 1, since it is
• differentiable infinitely often in that domain, or equivalently
• expressed as a power series, i.e., Σ_k z^k gk
3-88
Bulk queues: Bulk arrival I
An arrival brings a batch of i customers with gi = Pr[bulk size = i]
[Figure: state transition diagram with batch arrivals of size i at rate λgi
and single departures at rate µ]
Global balance equations:
λπ0 = µπ1
(λ + µ)πk = µπk+1 + Σ_{i=0}^{k−1} πi λ g_{k−i} for k ≥ 1
Using the definition of the PGF, Π(z) = Σ_{k=0}^∞ z^k πk,
(λ + µ) Σ_{k=1}^∞ πk z^k = (µ/z) Σ_{k=1}^∞ πk+1 z^{k+1} + Σ_{k=1}^∞ Σ_{i=0}^{k−1} πi λ g_{k−i} z^k
3-89
Bulk queues: Bulk arrival II
The term Σ_{k=1}^∞ Σ_{i=0}^{k−1} πi g_{k−i} z^k can be written out as
k = 1: π0 g1 z
k = 2: (π0 g2 + π1 g1) z²
k = 3: (π0 g3 + π1 g2 + π2 g1) z³
...
k = i: (π0 gi + π1 g_{i−1} + · · · + π_{i−1} g1) z^i
which yields π0 G(z) + π1 z G(z) + π2 z² G(z) + · · · = Π(z) G(z), with
G(z) = Σ_{k=1}^∞ gk z^k.
Substituting this into the previous equation,
(λ + µ)(Π(z) − π0) = (µ/z)(Π(z) − π0 − zπ1) + λ Π(z) G(z) (using λπ0 = µπ1)
3-90
Bulk queues: Bulk arrival III
After some manipulation, we have
Π(z) = µπ0(1 − z) / (µ(1 − z) − λz[1 − G(z)]) = N(z)/D(z)
To determine π0, we use Π(1) = 1:
Π(1) = N(1)/D(1) = 0/0 −→ (L'Hôpital's rule) N′(1)/D′(1) = 1,
which yields
π0 = 1 − λG′(1)/µ
The mean number of customers in the system:
N̄ = Π′(1) = [N′(z)D(z) − N(z)D′(z)] / D(z)² evaluated at z = 1,
where L'Hôpital's rule must be applied again
3-91
Bulk queues: Bulk service I
Serve a group of size r (some variations are possible):
[Figure: state transition diagram; arrivals of rate λ move up by one,
services of rate µ move down by r]
λπ0 = µ(π1 + π2 + · · · + πr)
(λ + µ)πk = µπ_{k+r} + λπ_{k−1} for k ≥ 1
Using the definition of the PGF,
(λ + µ)[Π(z) − π0] = (µ/z^r) [Π(z) − Σ_{k=0}^r πk z^k] + λzΠ(z)
Solving for Π(z), we have
Π(z) = [µ Σ_{k=0}^r πk z^k − (λ + µ)π0 z^r] / (λz^{r+1} − (λ + µ)z^r + µ)
3-92
Bulk queues: Bulk service II
The boundary condition λπ0 = µ(π1 + π2 + · · · + πr) can be rewritten as
−z^r(λπ0 + µπ0) = −z^r µ(π0 + π1 + π2 + · · · + πr),
so we can rewrite Π(z) as
Π(z) = Σ_{k=0}^{r−1} πk(z^k − z^r) / (rρz^{r+1} − (1 + rρ)z^r + 1) with ρ = λ/(µr)
– Can we determine πk for k ∈ {0, 1, . . . , r − 1} using Π(1) = 1 alone?
Rouché's theorem: If f(z) and g(z) are analytic functions of z inside
and on a closed contour C, and also |g(z)| < |f(z)| on C, then f(z)
and f(z) + g(z) have the same number of zeros inside C
• Let f(z) = −(1 + rρ)z^r and g(z) = rρz^{r+1} + 1
• On the closed contour C centered at the origin with radius 1 + δ (|z| = 1 + δ),
|f(z)| = |−(1 + rρ)z^r| = (1 + rρ)(1 + δ)^r
|g(z)| = |rρz^{r+1} + 1| ≤ rρ(1 + δ)^{r+1} + 1
3-93
Bulk queues: Bulk service III
• On the contour C,
|f(z)| − |g(z)| ≥ (1 + rρ)(1 + δ)^r − rρ(1 + δ)^{r+1} − 1
 = (1 + δ)^r (1 − rρδ) − 1
 ≥ (1 + rδ)(1 − rρδ) − 1 (using (1 + δ)^r ≥ 1 + rδ)
 = rδ(1 − ρ − rρδ) > 0 for 0 < δ < (1 − ρ)/(rρ)
• Letting δ → 0, the denominator has r roots with |z| ≤ 1
• The denominator, of degree r + 1, has one additional root for |z| > 1
Since Π(z) is analytic for |z| ≤ 1, the r roots of the denominator inside or
on the unit disk must be canceled by the numerator (otherwise Π(z)
would not be analytic)
– We can therefore rewrite the numerator as
Σ_{k=0}^{r−1} πk(z^k − z^r) = K(z − 1) Π_{k=1}^{r−1} (z − z*_k),
where K is a proportionality constant and the z*_k are the roots inside the
unit disk
3-94
Bulk queues: Bulk service IV
By canceling the roots inside and on the unit disk between the numerator
and denominator, we have
Π(z) = K(1 − z) Π_{k=1}^{r−1}(z − z*_k) / (rρz^{r+1} − (1 + rρ)z^r + 1) = K / (1 − z/z0)
– Using Π(1) = 1, we have K = 1 − 1/z0
– πk = (1 − 1/z0)(1/z0)^k for k = 0, 1, 2, . . .
Our bulk-service system reduces to the M/M/1 queue if r = 1
• Comparison with the M/M/1 queue:
πi = (1 − ρ)ρ^i for i = 0, 1, 2, . . . → Π(z) = (1 − ρ)/(1 − ρz),
where we find z0 = 1/ρ > 1
3-95
Bulk queues: Bulk service V
Real roots outside the unit disk
[Figure: D(z) = rρz^{r+1} − (1 + rρ)z^r + 1 plotted for z ∈ [1, 2.2], with
ρ̂ = λ/µ and ρ = ρ̂/r, for (ρ̂ = 0.75; r = 1, 2, 3) and (ρ̂ = 0.95, r = 3);
for r = 1 the root outside the unit disk is z0 = 1/0.75 = 1.333]
• For r = 1, we have the M/M/1 queue, z0 = 1/ρ̂
• Mean number of customers in the system:
L = Π′(1) = 1/(z0 − 1)
– As r increases for fixed ρ̂, z0 increases → L decreases
– As ρ̂ increases for fixed r, z0 gets closer to 1 → L increases
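The root z0 can be found numerically from the denominator polynomial; a MATLAB sketch (coefficients in descending powers of z; note z = 1 is always a root of D(z) and is excluded):
% Find z0, the unique root of D(z) outside the unit disk, and L = 1/(z0-1)
rho_hat = 0.75; r = 3; rho = rho_hat/r;
coeff = [r*rho, -(1+r*rho), zeros(1,r-1), 1];  % r*rho*z^(r+1) - (1+r*rho)*z^r + 1
z = roots(coeff);
z0 = z(abs(z) > 1 + 1e-9 & abs(imag(z)) < 1e-9);  % the real root outside
L = 1/(z0 - 1)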
3-96
Where are we?
Elementary queueing models
– M/M/1, M/M/C, M/M/C/C/K, ... and bulk queues
– either product-form solutions or use PGF
Intermediate queueing models (product-form solution)
– Time-reversibility of Markov process
– Detailed balance equations of time-reversible MCs
– Multidimensional Birth-death processes
– Network of queues: open- and closed networks
Advanced queueing models
– M/G/1 type queue: Embedded MC and Mean-value analysis
– M/G/1 with vacations and Priority queues
– G/M/m queue
More advanced queueing models (omitted)
– Algorithmic approaches to get steady-state solutions
3-97
Time Reversibility of discrete-time MC I
For an irreducible, aperiodic, discrete-time MC (Xn, Xn+1, . . .) having
transition probabilities pij and stationary distribution πi for all i:
The time-reversed MC is defined as X∗n = X_{τ−n} for an arbitrary τ > 0
[Figure: the forward process and the time-reversed process]
1) Transition probabilities of X∗n:
p∗ij = πj pji / πi
2) Xn and X∗n have the same stationary distribution πi, given 1) and if
Σ_{j=0}^∞ pij = Σ_{j=0}^∞ p∗ij
3-98
Time Reversibility of discrete-time MC II
• Proof for 1) p∗ij = πj pji/πi:
p∗ij = Pr[Xm = j | Xm+1 = i, Xm+2 = i2, . . . , Xm+k = ik]
 = Pr[Xm = j, Xm+1 = i, Xm+2 = i2, . . . , Xm+k = ik] / Pr[Xm+1 = i, Xm+2 = i2, . . . , Xm+k = ik]
 = Pr[Xm = j, Xm+1 = i] Pr[Xm+2 = i2, . . . | Xm = j, Xm+1 = i] / (Pr[Xm+1 = i] Pr[Xm+2 = i2, . . . | Xm+1 = i])
 = Pr[Xm = j, Xm+1 = i] / Pr[Xm+1 = i]
 = Pr[Xm+1 = i | Xm = j] Pr[Xm = j] / Pr[Xm+1 = i]
 = pji πj / πi
• Proof for 2) Using the above result,
Σ_{i∈S} πi p∗ij = Σ_{i∈S} πi (πj pji / πi) = πj Σ_{i∈S} pji = πj
3-99
Time Reversibility of discrete-time MC III
A Markov process Xn is said to be reversible if the transition
probabilities of the forward and reversed chains are the same:
p∗ij = Pr[Xm = j | Xm+1 = i] = pij = Pr[Xm+1 = j | Xm = i]
• Time reversibility ⇔ detailed balance equations (DBEs) hold:
πi p∗ij = πj pji → πi pij = πj pji (detailed balance eq.)
What types of Markov processes satisfy this detailed balance
equation? Discrete-time birth-death (BD) processes:
• Transitions occur only between neighboring states: pij = 0 for |i − j| > 1
[Figure: BD chain on states 0, 1, 2, . . .]
3-100
Time Reversibility of discrete-time MC IV
A transmitter's queue with stop-and-wait ARQ (θ = qr) in Mid-term I
• Is this process reversible?
[Figure: birth-death-like chain on states 0, 1, 2, . . . with self-loops]
• Global balance equations (GBEs):
π0 = (1 − p)π0 + (1 − p)θπ1
π1 = pπ0 + (pθ + (1 − p)(1 − θ))π1 + (1 − p)θπ2
For i = 2, 3, . . . , we have
πi = p(1 − θ)πi−1 + (pθ + (1 − p)(1 − θ))πi + (1 − p)θπi+1
• Instead, we can use DBEs, or simplify the GBEs using DBEs across a
cut between states i and i + 1, e.g.,
p(1 − θ)πi = (1 − p)θπi+1 ↔ Σ_{j=0}^n Σ_{i=n+1}^∞ πj pji = Σ_{j=0}^n Σ_{i=n+1}^∞ πi pij
3-101
Time Reversibility of discrete-time MC V
Kolmogorov criterion
• A discrete-time Markov chain is reversible if and only if
p_{i1 i2} p_{i2 i3} · · · p_{i_{n−1} i_n} p_{i_n i1} = p_{i1 i_n} p_{i_n i_{n−1}} · · · p_{i3 i2} p_{i2 i1}
for any finite sequence of states i1, i2, . . . , in and any n
Proof:
• For a reversible chain, the detailed balance equations hold around any loop
[Figure: a four-state loop 0 → 1 → 2 → 3 → 0]
• Fixing two states, i1 = i and in = j, and multiplying over all intermediate states,
p_{i i2} p_{i2 i3} · · · p_{i_{n−1} j} p_{ji} = p_{ij} p_{j i_{n−1}} · · · p_{i3 i2} p_{i2 i}
3-102
Time Reversibility of discrete-time MC VI
• From the Kolmogorov criterion, summing over all intermediate states gives
p_{i i2} p_{i2 i3} · · · p_{i_{n−1} j} p_{ji} = p_{ij} p_{j i_{n−1}} · · · p_{i3 i2} p_{i2 i}
⇒ p^(n−1)_ij p_ji = p_ij p^(n−1)_ji
As n → ∞, we have
lim_{n→∞} p^(n−1)_ij p_ji = lim_{n→∞} p_ij p^(n−1)_ji → πj pji = πi pij
Inspect whether the following two-state MC is reversible:
P = [  0    1  ]
    [ 0.5  0.5 ]
– It is a small BD process
– Using the state probabilities π0 = 1/3 and π1 = 2/3,
π0 p01 = (1/3) · 1 = 1/3 = π1 p10 = (2/3) · (1/2)
3-103
Time Reversibility of discrete-time MC VII
Inspect whether the following three-state MC is reversible
P =
[  0   0.6  0.4 ]
[ 0.1  0.8  0.1 ]
[ 0.5   0   0.5 ]
• Using the Kolmogorov criterion,
p12 p23 p31 = 0.6 × 0.1 × 0.5 = 0.03 ≠ p13 p32 p21 = 0.4 × 0 × 0.1 = 0,
so the chain is not reversible
• Inspecting the state transition diagram, it is not a BD process
If the state transition diagram of a Markov process is a tree, then the
process is time reversible
– A generalization of BD processes: at each cut boundary, a DBE is
satisfied 3-104
Continuous-time reversible MC I
For a continuous-time MC X(t) with stationary state probabilities θi,
there is a discrete-time embedded Markov chain whose stationary pmf
and state transition probabilities are πi and p̃ij.
[Figure: forward and reverse sample paths of the embedded Markov process]
There is a reversed embedded MC with πi p̃ij = πj p̃∗ji for all i ≠ j.
[Figure: the M/M/1 CTMC on states 0, 1, 2, 3, 4, . . . and its embedded
MC, a BD process]
3-105
Continuous-time reversible MC II
Recall the state occupancy time of the forward process:
Pr[Ti > t + s | Ti > t] = Pr[Ti > s] = e^{−vi s}
If X(t) = i, the probability that the reversed process remains in state
i for an additional s seconds is
Pr[X(t′) = i, t − s ≤ t′ ≤ t | X(t) = i] = e^{−vi s}
[Figure: the embedded Markov process; forward and reverse processes in time]
3-106
Continuous-time reversible MC III
A continuous-time MC with stationary probability θi of state i and state
transition rate γji from j to i has a reversed MC with state transition
rates γ∗ij, where γ∗ij satisfies
γ∗ij = vi p̃∗ij = vi (πj p̃ji / πi) (from the embedded MC, with p̃ji = γji/vj)
 = vi πj γji / (πi vj) = θj γji / θi
– p̃∗ij: state transition probability of the reversed embedded MC
– A continuous-time MC whose state occupancy times are exponentially
distributed is reversible if its embedded MC is reversible
Additionally, we have vj = v∗j:
Σ_{i≠j} θi γ∗ij = Σ_{i≠j} θj γji = θj vj = θj v∗j ⇒ Σ_{j≠i} γij = Σ_{j≠i} γ∗ij
3-107
Continuous-time reversible MC IV
Detailed balance equations hold for continuous-time reversible MCs:
θj γji (input rate to i) = θi γij (output rate from i), e.g., for j = i + 1
– birth-death systems, with γij = 0 for |i − j| > 1
– Since the embedded MC is reversible,
πi p̃ij = πj p̃ji → (vi θi/c) p̃ij = (vj θj/c) p̃ji → θi γij = θj γji
If there exists a set of positive numbers θi that sum to 1 and satisfy
θi γij = θj γji for i ≠ j,
then the MC is reversible and θi is the unique stationary distribution
– e.g., birth and death processes: M/M/1, M/M/c, M/M/∞
Kolmogorov criterion for continuous-time MCs:
– A continuous-time Markov chain is reversible if and only if
γ_{i1 i2} γ_{i2 i3} · · · γ_{i_n i1} = γ_{i1 i_n} γ_{i_n i_{n−1}} · · · γ_{i3 i2} γ_{i2 i1}
– The proof is the same as in the discrete-time reversible MC case
3-108
M/M/2 queue with heterogeneous servers I
Servers A and B with service rates µA and µB. When the system is
empty, arrivals go to A with probability p and to B with probability
1 − p. Otherwise, the head of the queue takes the first free server.
0
1A
2B
2 3
Under what condition is this system time-reversible?
• For n = 2, 3, . . .,
πn = π2 (λ/(µA + µA))
n−2
• Global balance equations along the cuts
λπ0 = µAπ1,A + µBπ1,B
(µA + µB)π2 = λ(π1,A + π1,B)
(µA + λ)π1,A = pλπ0 + µBπ2
3-109
M/M/2 queue with heterogeneous servers II
After some manipulations,
π_{1,A} = π_0 (λ/µ_A) · (λ + p(µ_A + µ_B)) / (2λ + µ_A + µ_B)
π_{1,B} = π_0 (λ/µ_B) · (λ + (1 − p)(µ_A + µ_B)) / (2λ + µ_A + µ_B)
π_2 = π_0 (λ²/(µ_A µ_B)) · (λ + (1 − p)µ_A + pµ_B) / (2λ + µ_A + µ_B)
π_0 can be determined from π_0 + π_{1,A} + π_{1,B} + Σ_{n=2}^{∞} π_n = 1
• If the chain is reversible, the detailed balance equations must hold across
the 0 ↔ 1_A and 0 ↔ 1_B cuts; comparing with the general solution above,
this is consistent exactly when p = 1/2, giving
(1/2)λπ_0 = µ_A π_{1,A} → π_{1,A} = 0.5(λ/µ_A)π_0
(1/2)λπ_0 = µ_B π_{1,B} → π_{1,B} = 0.5(λ/µ_B)π_0
π_2 = (0.5λ²/(µ_A µ_B)) π_0
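A numerical sanity check of the reversibility condition (our own sketch;
the function name, the truncation level N, and the parameter values are
assumptions, not from the slides):

```python
import numpy as np

def mm2_hetero_pi(lam, muA, muB, p, N=60):
    """Stationary distribution of the heterogeneous M/M/2 on the state
    space [0, 1A, 1B, 2, ..., N] (truncated at N for the numerics)."""
    size = 3 + (N - 1)                 # slots: 0 -> 0, 1 -> 1A, 2 -> 1B, 3.. -> n >= 2
    s = lambda n: 3 + (n - 2)          # slot of state n >= 2
    Q = np.zeros((size, size))
    Q[0, 1], Q[0, 2] = p * lam, (1 - p) * lam
    Q[1, s(2)], Q[1, 0] = lam, muA
    Q[2, s(2)], Q[2, 0] = lam, muB
    Q[s(2), 1], Q[s(2), 2] = muB, muA  # one server finishes in state 2
    for n in range(2, N):
        Q[s(n), s(n + 1)] = lam
        if n > 2:
            Q[s(n), s(n - 1)] = muA + muB
    Q[s(N), s(N - 1)] = muA + muB
    np.fill_diagonal(Q, -Q.sum(axis=1))
    A = np.vstack([Q.T, np.ones(size)])
    b = np.zeros(size + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

pi = mm2_hetero_pi(lam=1.0, muA=2.0, muB=3.0, p=0.5)
print(0.5 * 1.0 * pi[0], 2.0 * pi[1])  # p*lam*pi_0 vs muA*pi_{1A}: equal at p = 1/2
```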
3-110
Multidimensional Markov chains I
Suppose that X1(t) and X2(t) are independent reversible MCs
• Then, X(t) = (X1(t), X2(t)) is a reversible MC
• Two independent M/M/1 queues, where the arrival and service rates at
queue i are λ_i and µ_i
– (N1(t), N2(t)) forms an MC
Example: two independent M/M/1 queues
– Stationary distribution:
p(n_1, n_2) = (λ_1/µ_1)^{n_1} (1 − λ_1/µ_1) (λ_2/µ_2)^{n_2} (1 − λ_2/µ_2)
– Detailed balance equations:
µ_1 p(n_1 + 1, n_2) = λ_1 p(n_1, n_2)
µ_2 p(n_1, n_2 + 1) = λ_2 p(n_1, n_2)
– Verify that the Markov chain is reversible via the Kolmogorov criterion
(Figure: two-dimensional lattice of states (n_1, n_2); rates λ_1, µ_1 in the
n_1 direction and λ_2, µ_2 in the n_2 direction)
– Is this a reversible MC?
3-111
Multidimensional Markov chains II
– Owing to time-reversibility, detailed balance equations hold:
µ_1 π(n_1 + 1, n_2) = λ_1 π(n_1, n_2)
µ_2 π(n_1, n_2 + 1) = λ_2 π(n_1, n_2)
– Stationary state distribution:
π(n_1, n_2) = (1 − λ_1/µ_1)(λ_1/µ_1)^{n_1} · (1 − λ_2/µ_2)(λ_2/µ_2)^{n_2}
• Can be generalized for any number of independent queues, e.g.,
M/M/1, M/M/c or M/M/∞
π(n1, n2, . . . , nK ) = π1(n1)π2(n2) · · · πK (nK )
– ’Product form’ distribution
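The product form is worth making concrete (a minimal sketch; the
function name is ours):

```python
def product_form(ns, rhos):
    """pi(n_1, ..., n_K) = prod_i (1 - rho_i) * rho_i**n_i for independent M/M/1 queues."""
    p = 1.0
    for n, rho in zip(ns, rhos):
        p *= (1.0 - rho) * rho ** n
    return p

print(product_form((2, 0), (0.5, 0.25)))  # 0.5**2 * 0.5 * 0.75 = 0.09375
```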
3-112
Truncation of a Reversible Markov chain I
X(t) is a reversible Markov process with state space S and stationary
distribution, πj for j ∈ S.
– Truncated to a set E ⊂ S such that the resulting chain Y (t) is
irreducible. Then, Y (t) is reversible and has the stationary
distribution
π̂_j = π_j / Σ_{k∈E} π_k,   j ∈ E
– This is the conditional prob. that, in steady state, the original
process is at state j, given that it is somewhere in E
Proof:
π̂_j q_ji = π̂_i q_ij ⇒ (π_j / Σ_{k∈E} π_k) q_ji = (π_i / Σ_{k∈E} π_k) q_ij ⇒ π_j q_ji = π_i q_ij
Σ_{k∈E} π̂_k = Σ_{j∈E} π_j / Σ_{k∈E} π_k = 1
3-113
Truncation of a Reversible Markov chain II
Markov processes for M/M/1 and M/M/C are reversible
• State probabilities of the M/M/1/K queue, with ρ = λ/µ:
π_i = (1 − ρ)ρ^i / Σ_{i=0}^{K} (1 − ρ)ρ^i = (1 − ρ)ρ^i / (1 − ρ^{K+1})
– Truncated version of M/M/1/∞ queue
• State probabilities of the M/M/c/c queue
– From the M/M/c/∞ queue with ρ = λ/(cµ) and a = λ/µ:
π_n = ρ^{max(0,n−c)} (a^{min(n,c)}/min(n, c)!) π_0
– Truncated version of the M/M/c/∞ queue:
π̂_n = π_n / Σ_{n=0}^{c} π_n = (a^n/n!) / Σ_{i=0}^{c} (a^i/i!)
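The truncated distribution evaluated at n = c is the Erlang-B blocking
probability; a minimal sketch (function name ours):

```python
import math

def erlang_b(c, a):
    """Erlang-B: hat_pi_c = (a**c / c!) / sum_{i=0}^{c} a**i / i!."""
    terms = [a ** i / math.factorial(i) for i in range(c + 1)]
    return terms[-1] / sum(terms)

print(erlang_b(c=4, a=2.0))  # about 0.0952
```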
3-114
Truncation of a Reversible Markov chain III
Two independent M/M/1 queues of the previous example share a
common buffer of size B (=2)
• An arriving customer who finds B customers waiting is blocked
– The theorem specifies the joint distribution up to the normalization
constant; calculation of the normalization constant is often tedious
(State diagram for B = 2: the (n_1, n_2) lattice truncated to the set E below)
• State space: E = {(n_1, n_2) : (n_1 − 1)^+ + (n_2 − 1)^+ ≤ B}
• Stationary state distribution of the truncated MC:
π(n_1, n_2) = π(0, 0) ρ_1^{n_1} ρ_2^{n_2} for (n_1, n_2) ∈ E
• π(0, 0) is obtained from π(0, 0) = 1 / Σ_{(n_1,n_2)∈E} ρ_1^{n_1} ρ_2^{n_2}
3-115
Truncation of a Reversible Markov chain IV
Two session classes in a circuit switching system with preferential
treatment for one class for a total of C channels
• Type 1: Poisson arrivals with rate λ_1 requiring exponentially distributed
service times with rate µ_1 – admissible only up to K sessions
• Type 2: Poisson arrivals with rate λ_2 requiring exponentially distributed
service times with rate µ_2 – can be accepted until all C channels are used up
S = {(n_1, n_2) | 0 ≤ n_1 ≤ K, n_1 + n_2 ≤ C}
(Fig. 2, IEEE Trans. Veh. Technol., vol. 51, no. 2, Mar. 2002: transition
diagram for the new-call bounding scheme; the detailed balance equations
yield the product-form state probabilities below)
3-116
Truncation of a Reversible Markov chain V
• The state probabilities can be obtained as
P(n_1, n_2) = (ρ_1^{n_1}/n_1!)(ρ_2^{n_2}/n_2!) P(0, 0) for 0 ≤ n_1 ≤ K, n_1 + n_2 ≤ C, n_2 ≥ 0
– P(0, 0) can be determined from Σ_{n_1,n_2} P(n_1, n_2) = 1
• Blocking probability of type 1:
Pb_1 = [ Σ_{n_2=0}^{C−K} (ρ_1^K/K!)(ρ_2^{n_2}/n_2!) + Σ_{n_1=0}^{K−1} (ρ_1^{n_1}/n_1!)(ρ_2^{C−n_1}/(C−n_1)!) ]
       / [ Σ_{n_1=0}^{K} (ρ_1^{n_1}/n_1!) Σ_{n_2=0}^{C−n_1} (ρ_2^{n_2}/n_2!) ]
• Blocking probability of type 2:
Pb_2 = [ Σ_{n_1=0}^{K} (ρ_1^{n_1}/n_1!)(ρ_2^{C−n_1}/(C−n_1)!) ]
       / [ Σ_{n_1=0}^{K} (ρ_1^{n_1}/n_1!) Σ_{n_2=0}^{C−n_1} (ρ_2^{n_2}/n_2!) ]
For this kind of system, the blocking probabilities remain valid for a broad
class of holding time distributions (insensitivity)
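Evaluating these expressions directly (a sketch; parameter values are
illustrative only):

```python
import math

def blocking_probs(rho1, rho2, K, C):
    """Pb1, Pb2 for the two-class system: type 1 blocked when n1 = K or
    n1 + n2 = C; type 2 blocked when n1 + n2 = C."""
    f = lambda r, n: r ** n / math.factorial(n)
    G = sum(f(rho1, n1) * sum(f(rho2, n2) for n2 in range(C - n1 + 1))
            for n1 in range(K + 1))
    pb1 = (sum(f(rho1, K) * f(rho2, n2) for n2 in range(C - K + 1))
           + sum(f(rho1, n1) * f(rho2, C - n1) for n1 in range(K))) / G
    pb2 = sum(f(rho1, n1) * f(rho2, C - n1) for n1 in range(K + 1)) / G
    return pb1, pb2

print(blocking_probs(rho1=2.0, rho2=3.0, K=3, C=6))
```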
3-117
Networks of queues
Two queues in tandem (BG, p.210)
• Assume that service time is proportional to the packet length
(Figure: timelines of queues 1 and 2; queue 2 is empty when each packet
from queue 1 arrives, since service time is proportional to packet length)
Arrivals at queue 2 get bursty
– Interarrival times at the second queue are strongly correlated with
the packet length at the first queue or the service time!
• The first queue is an M/M/1, but the second queue cannot be
considered as an M/M/1
3-118
Kleinrock’s Independence Approximation I
In real networks, many queues interact with each other
– a traffic stream departing from one or more queues enters one or
more other queues, even after merging with other streams departing
from yet other queues
• Packet interarrival times are correlated with packet lengths.
• Service times at the various queues are not independent, e.g., under
state-dependent flow control.
Kleinrock’s independence approximation:
• M/M/1 queueing model works for each link: merging several
packet streams on a transmission line makes interarrival times and
packet lengths independent
• Good approximation when:
* Poisson arrivals at entry points of the network
* Packet transmission times ‘nearly’ exponential
* Several packet streams merged on each link
* Densely connected network and moderate to heavy traffic load
3-119
Kleinrock’s Independence Approximation II
Suppose several packet streams, each following a unique path through
the network: appropriate for virtual circuit network, e.g., ATM
• xs: arrival rate of packet stream s
• fij(s): the fraction of the packets of stream s through link (i, j)
• Total arrival rate at link (i, j):
λ_ij = Σ_{all packet streams s crossing link (i,j)} f_ij(s) x_s
3-120
Kleinrock’s Independence Approximation III
Based on the M/M/1 model (with Kleinrock's independence
approximation), the average number of packets in queue or in service at
link (i, j) is
N_ij = λ_ij / (µ_ij − λ_ij)
– 1/µ_ij is the average packet transmission time on link (i, j)
• The average number of packets over all queues and the average delay
per packet are
N = Σ_{(i,j)} N_ij and T = (1/γ) Σ_{(i,j)} N_ij
– γ = Σ_s x_s: total arrival rate into the system
• As a generalization with processing & propagation delay d_ij, the average
delay of a packet stream traversing a path p is
T_p = Σ_{(i,j) on path p} [ λ_ij/(µ_ij(µ_ij − λ_ij))   (queueing delay) + 1/µ_ij + d_ij ]
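Under the approximation, computing N and T takes only a few lines (a
sketch; the link data below are made up):

```python
def network_metrics(links, gamma):
    """links: dict {(i, j): (lam_ij, mu_ij)}; returns (N, T) per the formulas above."""
    N = sum(lam / (mu - lam) for lam, mu in links.values())
    return N, N / gamma

links = {("A", "B"): (300.0, 350.0), ("B", "C"): (250.0, 350.0)}
print(network_metrics(links, gamma=400.0))
```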
3-121
Kleinrock’s Independence Approximation IV
In datagram networks including multiple path routing for some
origin-destination pairs, M/M/1 approx. often fails
• Node A sends traffic to node B along two links, each with service rate µ
(BG, Fig. 3.29 and Example 3.17): packets arrive at A according to a
Poisson process with rate λ packets/sec, packet transmission times are
exponentially distributed and independent of interarrival times, and the
arriving traffic is to be divided equally among the two links. How should
this division be implemented?
– Random splitting: the queue at each link behaves like an M/M/1:
T_R = 1/(µ − λ/2)
– Metering: arriving packets are assigned to the queue with the smallest
backlog → approximated as an M/M/2 with a common queue:
T_M = 2/((2µ − λ)(1 + ρ)) < T_R
∗ Metering reduces delay but destroys the M/M/1 approximation
3-122
Burke’s theorem I
For M/M/1, M/M/c, M/M/∞ with arrival rate λ (without bulk
arrivals and service):
B1. Departure process is Poisson with rate λ.
(Figure: forward and reverse processes; the arrivals of the forward process
are the departures of the reverse process)
• The arrival process in the forward process corresponds to the
departure process in the reverse process
• Since the arrivals in forward time form a Poisson process, the
departures in backward time form a Poisson process
• Since the backward process is statistically the same as the forward
process, the (forward) departure process is Poisson
3-123
Burke’s theorem II
B2. The state (packets in system) left by a forward departure in the
forward process is independent of the past departures.
(Figure: a departure prior to time t in the forward process is an arrival
after t in the reverse process; future arrivals do not depend on the current
number in the system)
– In the reverse process, the state is independent of future arrivals.
3-124
Two M/M/1 Queues in Tandem
The service times of a customer at the first and the second queues are
mutually independent as well as independent of the arrival process.
Queue 1 Queue 2
• Based on Burke’s theorem B1, queue 2 in isolation is an M/M/1
– Pr[m at queue 2] = ρ_2^m (1 − ρ_2)
• B2: the number of customers presently in queue 1 is independent of the
sequence of departure times prior to t (the earlier arrivals at queue 2)
– hence independent of the number of customers presently in queue 2:
Pr[n at queue 1 and m at queue 2]
= Pr[n at queue 1] · Pr[m at queue 2] = ρ_1^n (1 − ρ_1) ρ_2^m (1 − ρ_2)
3-125
Open queueing networks
Consider a network of K first-come first-served, single-server queues, each
of which has unlimited queue size and exponentially distributed service
times with rate µ_k.
(Figure: external arrivals entering the network and routing paths between queues)
• Traffic equation with routing probabilities p_ij, or matrix P = [p_ij]:
λ_i = α_i + Σ_{j=1}^{K} λ_j p_ji, with Σ_{i=0}^{K} p_ji = 1
– p_i0: fraction of the flow going to the outside
– the λ_i can be uniquely determined by solving
λ = α + λP ⇒ λ = α(I − P)^{−1}
3-126
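Solving the traffic equations numerically (a sketch; the routing matrix
below is illustrative):

```python
import numpy as np

P = np.array([[0.0, 0.5],     # P[i, j]: routing prob. from queue i to queue j
              [0.3, 0.0]])    # row sums < 1; the remainder p_i0 exits the network
alpha = np.array([1.0, 0.5])  # external arrival rates
lam = alpha @ np.linalg.inv(np.eye(2) - P)   # lambda = alpha (I - P)^{-1}
print(lam)
```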
Open queueing networks II
Let n = (n_1, . . . , n_K) denote a state (row) vector of the network. The
limiting queue length distribution is
π(n) = lim_{t→∞} Pr[X_1(t) = n_1, . . . , X_K(t) = n_K]
Global balance equation (GBE): total rate out of n = total rate into n:
(α + Σ_{i=1}^{K} µ_i) π(n) = Σ_{i=1}^{K} α_i π(n − e_i)   [external arrivals]
 + Σ_{i=1}^{K} p_i0 µ_i π(n + e_i)   [going outside from i]
 + Σ_{i=1}^{K} Σ_{j=1}^{K} p_ji µ_j π(n + e_j − e_i)   [from j to i]
– e_i = (0, . . . , 1, . . . , 0), i.e., the 1 is in the ith position
– π(n − e_i) denotes π(n_1, n_2, . . . , n_i − 1, . . . , n_K)
3-127
Jackson’s theorem I
Using time-reversibility, guess detailed balance equations (DBEs):
λ_i π(n − e_i) = µ_i π(n), λ_i π(n) = µ_i π(n + e_i),
and λ_j π(n − e_i) = µ_j π(n + e_j − e_i)
Substituting the DBEs into the GBE gives
RHS = π(n) [ Σ_{i=1}^{K} (α_i µ_i/λ_i) + Σ_{i=1}^{K} p_i0 λ_i + Σ_{i=1}^{K} ((Σ_{j=1}^{K} p_ji λ_j)/λ_i) µ_i ]
    = π(n) [ Σ_{i=1}^{K} p_i0 λ_i   (= α) + Σ_{i=1}^{K} ((α_i + Σ_{j=1}^{K} p_ji λ_j)/λ_i) µ_i ]
– in the numerator: λ_i = α_i + Σ_{j=1}^{K} p_ji λ_j, so the RHS reduces to
π(n)(α + Σ_{i=1}^{K} µ_i), which equals the LHS of the GBE
3-128
Jackson’s theorem II
From the DBEs, we have
π(n_1, . . . , n_i, . . . , n_K) = (λ_i/µ_i) π(n_1, . . . , n_i − 1, . . . , n_K)
and
π(n_1, . . . , n_i − 1, . . . , n_K) = (λ_i/µ_i) π(n_1, . . . , n_i − 2, . . . , n_K),
which telescopes to
π(n_1, . . . , n_i, . . . , n_K) = (λ_i/µ_i)^{n_i} π(n_1, . . . , 0, . . . , n_K)
Repeating for i = 1, 2, . . . , K,
π(n) = π(0) Π_{i=1}^{K} (λ_i/µ_i)^{n_i}
– π(0) = [ Π_{i=1}^{K} Σ_{n_i=0}^{∞} ρ_i^{n_i} ]^{−1} = Π_{i=1}^{K} (1 − ρ_i), with ρ_i = λ_i/µ_i
3-129
Jackson’s theorem: proof of DBEs I
Proving the DBEs based on time-reversibility:
• Construct a routing matrix P* = [p*_ij] for the reversed process
• The rate from node i to j must be the same in the forward and reverse
directions:
(forward process) λ_i p_ij = λ_j p*_ji (reverse process)
– λ_j p*_ji: the output rate of server j is λ_j, and p*_ji is the probability
of moving from j to i; α*_i = λ_i p_i0; p*_i0 = α_i/λ_i
We need to show (recall θ_i γ_ij = θ_j γ*_ji)
π(n) v_{n,m} = π(m) v*_{m,n} and Σ_m v_{n,m} = Σ_m v*_{n,m}
– v_{n,m} and v*_{n,m} denote the state transition rates of the forward and
reversed processes
3-130
Jackson’s theorem: proof of DBEs II
We need to consider the following three cases:
• An arrival to server i from outside the network in the forward process
corresponds to a departure out of the network from server i in the
reversed process:
π(n) v_{n,n+e_i} = π(n + e_i) v*_{n+e_i,n}
• A departure to the outside in the forward process corresponds to an
arrival from the outside in the reversed process:
π(n) v_{n,n−e_i} = π(n − e_i) v*_{n−e_i,n}
• Leaving queue i and joining queue j in the forward process
(v_{n,n−e_i+e_j} = µ_i p_ij) corresponds to leaving queue j and joining
queue i in the reversed process (v*_{n−e_i+e_j,n} = µ_j p*_ji = λ_i p_ij µ_j/λ_j):
π(n) v_{n,n−e_i+e_j} = π(n − e_i + e_j) v*_{n−e_i+e_j,n}
3-131
Jackson’s theorem: proof of DBEs III
1) π(n) v_{n,n+e_i} = π(n + e_i) v*_{n+e_i,n}:
An arrival to server i from outside the network in the forward process
corresponds to a departure out of the network from server i in the
reversed process, i.e.,
v*_{n+e_i,n} = µ_i (1 − Σ_{j=1}^{K} p*_ij)   [1 − Σ_j p*_ij = p*_i0]
 = µ_i (1 − Σ_{j=1}^{K} λ_j p_ji/λ_i)   [using p*_ij = λ_j p_ji/λ_i]
 = (µ_i/λ_i)(λ_i − Σ_{j=1}^{K} λ_j p_ji)   [λ_i = α_i + Σ_{j=1}^{K} λ_j p_ji]
 = α_i µ_i/λ_i = α_i/ρ_i   (= v*_{n,n−e_i}).
Substituting this into 1) (with v_{n,n+e_i} = α_i, arrival to server i from
outside) and the product-form guess π(n) = Π_i π_i(n_i):
Π_{i'=1}^{K} π_{i'}(n_{i'}) α_i = π_i(n_i + 1) Π_{j=1,j≠i}^{K} π_j(n_j) · α_i/ρ_i
3-132
Jackson’s theorem: proof of DBEs IV
Rearranging the previous eqn. yields
π_i(n_i) α_i Π_{j=1,j≠i}^{K} π_j(n_j) = π_i(n_i + 1)(α_i/ρ_i) Π_{j=1,j≠i}^{K} π_j(n_j)
After canceling, we have
π_i(n_i + 1) = ρ_i π_i(n_i) ⇒ π_i(n) = ρ_i^n (1 − ρ_i)
2) π(n) v_{n,n−e_i} = π(n − e_i) v*_{n−e_i,n}: a departure to the outside in the
forward process corresponds to an arrival from the outside in the reversed
process:
v*_{n−e_i,n} = α*_i = λ_i − Σ_{j=1}^{K} λ_j p*_ji   [traffic eqn. for the reversed process]
 = λ_i − Σ_{j=1}^{K} λ_j (λ_i p_ij/λ_j)
 = λ_i (1 − Σ_{j=1}^{K} p_ij) = λ_i p_i0   (= v*_{n,n+e_i}).
3-133
Jackson’s theorem: proof of DBEs V
Substituting this with v_{n,n−e_i} = µ_i p_i0 (departure to the outside),
(1 − ρ_i) ρ_i^{n_i} Π_{k=1,k≠i}^{K} π_k(n_k) µ_i p_i0 = (1 − ρ_i) ρ_i^{n_i−1} Π_{k=1,k≠i}^{K} π_k(n_k) λ_i p_i0
3) π(n) v_{n,n−e_i+e_j} = π(n − e_i + e_j) v*_{n−e_i+e_j,n}: leaving queue i and
joining queue j in the forward process (v_{n,n−e_i+e_j} = µ_i p_ij) corresponds
to leaving queue j and joining queue i in the reversed process, i.e.,
v*_{n−e_i+e_j,n} = µ_j p*_ji = λ_i p_ij µ_j/λ_j:
(1 − ρ_i) ρ_i^{n_i} (1 − ρ_j) ρ_j^{n_j} Π_{k=1,k≠i,j}^{K} π_k(n_k) µ_i p_ij
 = (1 − ρ_i) ρ_i^{n_i−1} (1 − ρ_j) ρ_j^{n_j+1} Π_{k=1,k≠i,j}^{K} π_k(n_k) µ_j p*_ji   [p*_ji = λ_i p_ij/λ_j]
3-134
Jackson’s theorem: proof of DBEs VI
Summary of transition rates of the forward and reverse processes:

Transition         | Forward v_{n,m}           | Reverse v*_{n,m}          | Comment
n → n + e_i        | α_i                       | λ_i (1 − Σ_{j=1}^K p_ij)  | all i
n → n − e_i        | µ_i (1 − Σ_{j=1}^K p_ij)  | α_i µ_i/λ_i               | all i: n_i > 0
n → n − e_i + e_j  | µ_i p_ij                  | λ_j p_ji µ_i/λ_i          | all i: n_i > 0, all j

4) Finally, we verify the total rate equation Σ_m v_{n,m} = Σ_m v*_{n,m}:
Σ_m v*_{n,m} = Σ_i λ_i (1 − Σ_{j=1}^K p_ij)   [= Σ_i λ_i − Σ_i Σ_j λ_i p_ij]
 + Σ_{i: n_i>0} (α_i µ_i/λ_i + Σ_j λ_j p_ji µ_i/λ_i)
= Σ_i λ_i − Σ_j (λ_j − α_j)   [λ_j = α_j + Σ_{i=1}^K λ_i p_ij]
 + Σ_{i: n_i>0} (α_i µ_i/λ_i + (µ_i/λ_i)(λ_i − α_i))
= Σ_i α_i + Σ_{i: n_i>0} µ_i = Σ_m v_{n,m}.
3-135
Open queueing networks: Extension I
The product-form solution of Jackson’s theorem is valid for the
following network of queues
• State-dependent service rates
– 1/µ_i(n_i): the mean of queue i's exponentially distributed service
time when n_i is the number of customers in the ith queue just before
the customer's departure
ρ_i(n_i) = λ_i/µ_i(n_i), i = 1, . . . , K, n_i = 1, 2, . . .
– λ_i: total arrival rate at queue i, determined by the traffic eqn.
– Define P̂_j(n_j) as
P̂_j(n_j) = 1 if n_j = 0, and P̂_j(n_j) = ρ_j(1)ρ_j(2) · · · ρ_j(n_j) if n_j > 0
3-136
Open queueing networks: Extension II
– For every state n = (n_1, . . . , n_K):
P(n) = P̂_1(n_1) P̂_2(n_2) · · · P̂_K(n_K) / G,
where G = Σ_{n_1=0}^{∞} · · · Σ_{n_K=0}^{∞} P̂_1(n_1) · · · P̂_K(n_K)
• Multiple classes of customers
– Provided that the service time distribution at each queue is the same
for all customer classes, the product-form solution is valid for the
system with different classes of customers, i.e.,
λ_j(c) = α_j(c) + Σ_{i=1}^{K} λ_i(c) p_ij(c)
– α_j(c): rate of the external arrivals of class c at queue j; p_ij(c): the
routing probabilities of class c – see pp. 230–231 in the textbook for
more details
3-137
Open queueing networks: Performance measure
Performance measure
• State probability distribution has been derived
• Mean # of hops traversed, h, is
h =
λ
α
=
K
i=1 λi
K
i=1 αi
• Throughput of queue i: λi
• Total throughput of the queueing network: α
• Mean number of customers at queue i (ρi = λi/µi)
Ni = ρi/(1 − ρi)
• System response time T
T =
N
α
=
1
α
K
i=1
Ni =
1
α
K
i=1
λiTi =
1
α
K
i=1
λi
µi − λi
3-138
Open queueing networks: example A-I
New programs arrive at a CPU according to a Poisson process of rate α. A
program spends an exponentially distributed execution time of mean 1/µ1
in the CPU. At the end of this service time, the program execution is
complete with probability p or it requires retrieving additional information
from secondary storage with probability 1 − p. Suppose that the retrieval of
information from secondary storage requires an exponentially distributed
amount of time with mean 1/µ2. Find the mean time that each program
spends in the system.
3-139
Open queueing networks: example A-II
Find the mean arrival rates:
• Arrival rates into each queue: λ_1 = α + λ_2 and λ_2 = (1 − p)λ_1, so
λ_1 = α/p and λ_2 = (1 − p)α/p
• Each queue behaves like an M/M/1 system, so
E[N_1] = ρ_1/(1 − ρ_1) and E[N_2] = ρ_2/(1 − ρ_2),
where ρ_1 = λ_1/µ_1 and ρ_2 = λ_2/µ_2
Using Little's result, the total time spent in the system is
E[T] = E[N_1 + N_2]/α = (1/α) [ ρ_1/(1 − ρ_1) + ρ_2/(1 − ρ_2) ]
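Plugging in numbers (a sketch of the formula above; the parameter values
are made up):

```python
def cpu_mean_time(alpha, mu1, mu2, p):
    """E[T] for the CPU/storage network: lam1 = alpha/p, lam2 = (1-p) alpha/p."""
    lam1, lam2 = alpha / p, (1 - p) * alpha / p
    rho1, rho2 = lam1 / mu1, lam2 / mu2
    assert rho1 < 1 and rho2 < 1, "each queue must be stable"
    return (rho1 / (1 - rho1) + rho2 / (1 - rho2)) / alpha

print(cpu_mean_time(alpha=1.0, mu1=4.0, mu2=3.0, p=0.5))  # 1.5
```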
3-140
Open queueing networks: example B-I
Consider the following network with three nodes:
N = Σ_{i=1}^{M} N_i = Σ_{i=1}^{M} ρ_i/(1 − ρ_i);  T = N/γ = Σ_{i=1}^{M} (λ_i/γ) T_i
e.g., consider the following network with three routers A, B, C, external
arrivals γ_A, γ_B, γ_C, and links L1, L2, L3, L4 (figure omitted):
• External packet arrivals: Poisson processes with γ_A = 350
(packets/sec), γ_B = 150, γ_C = 150
• Packet length: exponentially distributed with mean 50 kbits/packet
Assumptions:
(a) Packets moving along a path from source to destination have their
lengths selected independently at each outgoing link
→ Kleinrock’s independence assumption
(b) Channel capacity of link i : Ci= 17.5 Mbps for i = 1, 2, 3, 4
→ Service rate at link i: exponentially distributed with rate
µi = Ci/50000 = 350 packets/sec.
3-141
Open queueing networks: example B-II
• Traffic matrix (packets per second):

  from \ to |  A  |  B  |  C
  A         |  –  | 150 | 200 (50% through B, 50% directly to C)
  B         |  50 |  –  | 100
  C         | 100 |  50 |  –

• Find the mean delay from A to C
• First, we need the traffic on each link; matching the link totals, the
routing is (inferred) L1: A→B, L2: A→C (direct), L3: B→C, L4: C→A,
with B→A routed via C and C→B routed via A:

  traffic type |  L1 |  L2 |  L3 |  L4
  A → B        | 150 |     |     |
  A → C        | 100 | 100 | 100 |
  B → A        |     |     |  50 |  50
  B → C        |     |     | 100 |
  C → A        |     |     |     | 100
  C → B        |  50 |     |     |  50
  total        | λ1 = 300 | λ2 = 100 | λ3 = 250 | λ4 = 200
3-142
Open queueing networks: example B-II (cont.)
• Since α = 650 and λ = 850, the mean number of hops is
h̄ = 850/650 ≈ 1.3077
• We get the link utilization, mean number, and response time as

           L1               L2               L3               L4
  ρ_i      300/350 ≈ 0.857  100/350 ≈ 0.286  250/350 ≈ 0.714  200/350 ≈ 0.571
  N_i      300/50 = 6       100/250 = 0.4    250/100 = 2.5    200/150 ≈ 1.33
  T_i (s)  1/50 = 0.02      1/250 = 0.004    1/100 = 0.01     1/150 ≈ 0.0067

– N_i = ρ_i/(1 − ρ_i) = λ_i/(µ_i − λ_i) and T_i = N_i/λ_i = 1/(µ_i − λ_i)
• Mean delay from A to C (half of the traffic goes via B over L1 and L3,
half directly over L2):
T_AC = (T_1 + T_3) × 0.5 + T_2 × 0.5 = 0.017 (sec)
– propagation delay is ignored
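The per-link delays and T_AC in a few lines (a sketch reproducing the
numbers above):

```python
mu = 350.0
lam = {"L1": 300.0, "L2": 100.0, "L3": 250.0, "L4": 200.0}
T = {l: 1.0 / (mu - x) for l, x in lam.items()}    # M/M/1 delay per link
T_AC = 0.5 * (T["L1"] + T["L3"]) + 0.5 * T["L2"]   # half via B, half direct
print(T_AC)   # 0.017 s
```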
3-143
Closed queueing networks I
Consider a network of K first-come first-served, single-server queues, each
of which has unlimited queue size and exponentially distributed service
times with rate µ_k. In addition, a fixed number of customers, say M,
circulate endlessly in the closed network of queues.
• Traffic eqn.: no external arrivals!
λ_i = Σ_{j=1}^{K} λ_j p_ji, with Σ_{i=1}^{K} p_ji = 1
3-144
Closed queueing networks II
• Using π = π · P and π · 1 = 1, we have
λ_i = λ(M) π_i
– λ(M): a constant of proportionality, the sum of the arrival rates of
all the queues in the network
– Σ_{i=1}^{K} π_i = 1
Assuming ρ_i = λ_i/µ_i < 1 for i = 1, . . . , K, we have for all n_i ≥ 0
π(n) = (1/G(M)) Π_{i=1}^{K} ρ_i^{n_i}, where G(M) = Σ_{n_1+···+n_K=M} Π_{i=1}^{K} ρ_i^{n_i}
• ρi is no longer the actual utilization due to λ(M)
• The particular value chosen for λ(M) does not change the results
• Since there are M customers, the maximum queue size of each
queue is M
3-145
Closed queueing networks III
Proof: as in Jackson's theorem for open queueing networks.
• Use time-reversibility: construct the routing matrix of the reversed process
• For a state transition between n and n' = n − e_i + e_j:
π(n') v*_{n',n} = π(n) v_{n,n'}   (∗)
• As in open queueing networks, we have
v*_{n−e_i+e_j,n} = µ_j p*_ji = µ_j (λ_i p_ij/λ_j)
v_{n,n−e_i+e_j} = µ_i p_ij for n_i > 0
• Substituting these into (∗), we have
ρ_i π(n_1, . . . , n_i − 1, . . . , n_j + 1, . . . , n_K) = ρ_j π(n_1, . . . , n_K)
• The proof of Σ_m v_{n,m} = Σ_m v*_{n,m} is given on page 235
3-146
Closed queueing networks IV
Computing G(M, K) for M customers and K queues iteratively:
G(m, k) = G(m, k − 1) + ρ_k G(m − 1, k),
with boundary conditions G(m, 1) = ρ_1^m for m = 0, 1, . . . , M, and
G(0, k) = 1 for k = 1, 2, . . . , K
• For m > 0 and k > 1, split the sum into two disjoint sums:
G(m, k) = Σ_{n_1+···+n_k=m} ρ_1^{n_1} ρ_2^{n_2} · · · ρ_k^{n_k}
 = Σ_{n_1+···+n_k=m, n_k=0} ρ_1^{n_1} ρ_2^{n_2} · · · ρ_{k−1}^{n_{k−1}}   [= G(m, k − 1)]
 + Σ_{n_1+···+n_k=m, n_k>0} ρ_1^{n_1} ρ_2^{n_2} · · · ρ_k^{n_k}
3-147
Closed queueing networks V
• Since n_k > 0, substitute n_k = n'_k + 1 with n'_k ≥ 0:
Σ_{n_1+···+n_k=m, n_k>0} ρ_1^{n_1} ρ_2^{n_2} · · · ρ_k^{n_k}
 = Σ_{n_1+···+n'_k+1=m, n'_k≥0} ρ_1^{n_1} ρ_2^{n_2} · · · ρ_k^{n'_k+1}
 = ρ_k Σ_{n_1+···+n'_k=m−1, n'_k≥0} ρ_1^{n_1} ρ_2^{n_2} · · · ρ_k^{n'_k}
 = ρ_k G(m − 1, k)
In a closed Jackson network with M customers, the steady-state
probability that the number of customers in station j is greater than or
equal to m is
Pr[x_j ≥ m] = ρ_j^m G(M − m)/G(M) for 0 ≤ m ≤ M
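The recursion above is Buzen's convolution algorithm; a minimal sketch
(names and example utilizations are ours):

```python
def buzen_G(rhos, M):
    """G[m] = G(m, K) via G(m, k) = G(m, k-1) + rho_k * G(m-1, k)."""
    G = [1.0] + [0.0] * M          # G(m, 0) = 1 if m == 0 else 0
    for rho in rhos:               # fold in one queue at a time
        for m in range(1, M + 1):
            G[m] += rho * G[m - 1]
    return G

rhos, M = [0.5, 0.8, 1.0], 3       # relative utilizations lambda_i / mu_i
G = buzen_G(rhos, M)
j, m = 1, 1
print(rhos[j] ** m * G[M - m] / G[M])   # Pr[x_j >= m]
```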
3-148
Closed queueing networks VI
• Proof: substitute n_j = n'_j + m with n'_j ≥ 0:
Pr[x_j ≥ m] = Σ_{n_1+···+n_j+···+n_K=M, n_j≥m} ρ_1^{n_1} · · · ρ_j^{n_j} · · · ρ_K^{n_K} / G(M)
 = Σ_{n_1+···+(n'_j+m)+···+n_K=M, n'_j≥0} ρ_1^{n_1} · · · ρ_j^{n'_j+m} · · · ρ_K^{n_K} / G(M)
 = (ρ_j^m/G(M)) Σ_{n_1+···+n'_j+···+n_K=M−m, n'_j≥0} ρ_1^{n_1} · · · ρ_j^{n'_j} · · · ρ_K^{n_K}
 = (ρ_j^m/G(M)) G(M − m)
• Pr[x_j = m] = Pr[x_j ≥ m] − Pr[x_j ≥ m + 1]
 = ρ_j^m (G(M − m) − ρ_j G(M − m − 1))/G(M)
3-149
Closed queueing networks VII
In a closed Jackson network with M customers, the average number of
customers at queue j is
N_j(M) = Σ_{m=1}^{M} Pr[x_j ≥ m] = Σ_{m=1}^{M} ρ_j^m G(M − m)/G(M)
In a closed Jackson network with M customers, the average throughput
of queue j is
γ_j(M) = µ_j Pr[x_j ≥ 1] = µ_j ρ_j G(M − 1)/G(M) = λ_j G(M − 1)/G(M)
– The average throughput is the average rate at which customers are
serviced at the queue: for a single-server queue, the service rate is µ_j
when there are one or more customers in the queue, and 0 when the
queue is empty
3-150
Closed queueing networks: example I
Suppose that the computer system given in the open queueing network is
now operated so that there are always I programs in the system. Note that
the feedback loop around the CPU signifies the completion of one job and
its instantaneous replacement by another one. Find the steady state pmf of
the system. Find the rate at which programs are completed.
• Using λ_i = λ(I)π_i with π = πP:
π_1 = pπ_1 + π_2, π_2 = (1 − p)π_1, and π_1 + π_2 = 1,
we have
λ_1 = λ(I)π_1 = λ(I)/(2 − p) and λ_2 = λ(I)π_2 = λ(I)(1 − p)/(2 − p)
3-151
Closed queueing networks: example II
• For 0 ≤ i ≤ I, with ρ_1 = λ_1/µ_1 and ρ_2 = λ_2/µ_2:
Pr[N_1 = i, N_2 = I − i] = (1 − ρ_1)ρ_1^i (1 − ρ_2)ρ_2^{I−i} / S(I)
• The normalization constant S(I) is obtained from
S(I) = (1 − ρ_1)(1 − ρ_2) Σ_{i=0}^{I} ρ_1^i ρ_2^{I−i}
     = (1 − ρ_1)(1 − ρ_2) ρ_2^I (1 − (ρ_1/ρ_2)^{I+1})/(1 − ρ_1/ρ_2)
• We then have, for 0 ≤ i ≤ I,
Pr[N_1 = i, N_2 = I − i] = ((1 − β)/(1 − β^{I+1})) β^i,
where β = ρ_1/ρ_2 = µ_2/((1 − p)µ_1)
• The program completion rate is pλ_1, where λ_1 = µ_1 Pr[N_1 > 0]
 = µ_1(1 − Pr[N_1 = 0])
3-152
Arrival theorem for closed networks I
Theorem: In a closed Jackson network with M customers, the
occupancy distribution seen by a customer upon arrival at queue j is
the same as the occupancy distribution in a closed network with the
arriving customer removed
• In a closed network with M customers, the expected number of
customers found upon arrival by a customer at queue j is equal to
the average number of customers at queue j, when the total
number of customers in the closed network is M − 1
• An arriving customer sees the system at a state that does not
include itself
Proof:
• X(t) = [X_1(t), X_2(t), . . . , X_K(t)]: state of the network at time t
• T_ij(t): the event that a customer moves from queue i to j at time t+
3-153
Arrival theorem for closed networks II
• For any state n with n_i > 0, the conditional probability that a customer
moving from node i to j finds the network at state n is
α_ij(n) = Pr[X(t) = n | T_ij(t)] = Pr[X(t) = n, T_ij(t)] / Pr[T_ij(t)]
 = Pr[T_ij(t) | X(t) = n] Pr[X(t) = n] / Σ_{m: m_i>0} Pr[T_ij(t) | X(t) = m] Pr[X(t) = m]
 = π(n) µ_i p_ij / Σ_{m: m_i>0} π(m) µ_i p_ij
 = ρ_1^{n_1} · · · ρ_i^{n_i} · · · ρ_K^{n_K} / Σ_{m: m_i>0} ρ_1^{m_1} · · · ρ_i^{m_i} · · · ρ_K^{m_K}
– Changing m_i = m'_i + 1 with m'_i ≥ 0:
α_ij(n) = ρ_1^{n_1} · · · ρ_i^{n_i} · · · ρ_K^{n_K}
 / (ρ_i Σ_{m_1+···+m'_i+···+m_K=M−1, m'_i≥0} ρ_1^{m_1} · · · ρ_i^{m'_i} · · · ρ_K^{m_K})
 = ρ_1^{n_1} · · · ρ_i^{n_i−1} · · · ρ_K^{n_K} / G(M − 1)
3-154
Mean Value Analysis I
Performance measure for closed networks with M customers
• Nj(M): average number of customers in queue j
• Tj(M): average time a customer spends (per visit) in queue j
• γj(M): average throughput of queue j
Mean-Value Analysis: Calculates Nj(M) and Tj(M) directly, without
first computing G(M) or deriving the stationary distribution of the
network
a) The queue length observed by an arriving customer is the same as
the queue length in a closed network with one less customer
b) Little’s result is applicable throughout the network
1. Based on a),
T_j(s) = (1/µ_j)(1 + N_j(s − 1)) for j = 1, . . . , K, s = 1, . . . , M
– T_j(0) = N_j(0) = 0 for j = 1, . . . , K
3-155
Mean Value Analysis II
2. Based on b), we first have when there are s customers in the
network
E[Nj(s)] = λj(s)E[Tj(s)] = λ(s)πjE[Tj(s)]
step 2-b
and
s =
K
j=1
E[Nj(s)] = λ(s)
K
j=1
πjE[Tj(s)] → λ(s) =
s
K
j=1 πjE[Tj(s)]
step 2-a
This will be iteratively done for s = 0, 1, . . . , M
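The full iteration in code (a sketch, assuming single-server FCFS queues;
names ours):

```python
def mva(visit, mus, M):
    """Exact MVA for a closed Jackson network.
    visit[j]: relative visit ratios pi_j (pi = pi P); mus[j]: service rates."""
    K = len(mus)
    N = [0.0] * K                                         # N_j(0) = 0
    for s in range(1, M + 1):
        T = [(1.0 + N[j]) / mus[j] for j in range(K)]     # arrival theorem
        lam = s / sum(visit[j] * T[j] for j in range(K))  # normalize via Little
        N = [lam * visit[j] * T[j] for j in range(K)]     # Little per queue
    return N, T, lam

print(mva(visit=[1.0, 0.5], mus=[4.0, 3.0], M=5))
```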
3-156
Where are we?
Elementary queueing models
– M/M/1, M/M/C, M/M/C/C/K, ... and bulk queues
– either product-form solutions or use PGF
Intermediate queueing models (product-form solution)
– Time-reversibility of Markov process
– Detailed balance equations of time-reversible MCs
– Multidimensional Birth-death processes
– Network of queues: open- and closed networks
Advanced queueing models
– M/G/1 type queue: Embedded MC and Mean-value analysis
– M/G/1 with vacations and Priority queues
– G/M/m queue
More advanced queueing models (omitted)
– Algorithmic approaches to get steady-state solutions
3-157
Residual life time∗ I
Hitchhiker’s paradox:
Cars pass a point on a road according to a Poisson process with rate
λ = 1/10 per minute, i.e., mean interarrival time 10 min.
A hitchhiker arrives at the roadside point at a random instant of time.
(Figure: previous car, hitchhiker's arrival, next car on the time axis)
What is his mean waiting time for the next car?
1. Since he arrives randomly in an interval, it would be 5 min.
2. Due to the memoryless property of the exponential distribution, it
would be another 10 min.
∗
L. Kleinrock, Queueing systems, vol.1: theory
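A short Monte Carlo experiment settles the paradox in favor of answer 2
(our own sketch, with assumed names and parameters):

```python
import bisect, random

random.seed(1)
lam = 0.1                            # cars per minute (mean gap: 10 min)
t, arrivals = 0.0, []
while t < 1e6:
    t += random.expovariate(lam)
    arrivals.append(t)

waits = []
for _ in range(100_000):             # hitchhikers at uniformly random instants
    u = random.uniform(0.0, arrivals[-2])
    k = bisect.bisect_right(arrivals, u)   # first car after time u
    waits.append(arrivals[k] - u)
print(sum(waits) / len(waits))       # close to 10 min, not 5
```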
3-158
Residual life time II
The distribution of the interval that the hitchhiker captures depends on
both x and f_X(x). Writing X̃ for the captured interval,
f_X̃(x) = C x f_X(x), with C a proportionality constant
Since ∫_0^∞ f_X̃(x) dx = 1, we have C = 1/E[X] = 1/X̄:
f_X̃(x) = x f_X(x)/X̄
Since Pr[R < y | X̃ = x] = y/x for 0 ≤ y ≤ x, the joint pdf of X̃ and R is
Pr[y < R < y + dy, x < X̃ < x + dx] = (dy/x) · (x f_X(x) dx/X̄) = f_X(x) dy dx/X̄
Unconditioning over X̃,
f_R(y) dy = (dy/X̄) ∫_y^∞ f_X(x) dx = ((1 − F_X(y))/X̄) dy ⇒ f_R(y) = (1 − F_X(y))/X̄
3-159
Residual life time III
If we take the Laplace transform of the pdf of R, conditioned on X̃ = x,
for 0 ≤ R ≤ x:
E[e^{−sR} | X̃ = x] = ∫_0^x (e^{−sy}/x) dy = (1 − e^{−sx})/(sx)
Unconditioning over X̃, we have R*(s) and its moments:
R*(s) = (1 − F*_X(s))/(sX̄) ⇒ E[R^n] = E[X^{n+1}]/((n + 1)X̄),
where F*_X(s) = ∫_0^∞ e^{−sx} f_X(x) dx.
Surprisingly, the distribution of the elapsed waiting time, X̃ − R, is
identical to that of the remaining waiting time.
3-160
M/G/1 queue: Embedded MC I
Recall that, as a continuous-time Markov process, the M/G/1 queue is
described by (n, r):
• n: number of customers in the system.
• r: attained or remaining service time of the customer in service.
Due to r, (n, r) is not a countable state space. How can we get rid of r?
What if we observe the system at the end of each service?
X_{n+1} = max(X_n − 1, 0) + Y_{n+1}
X_n: number of customers in the system left behind by a departure.
Y_n: number of arrivals that occur during the service time of the
departing customer.
Question: is X_n equal to the queue length seen by an arriving customer
(the queue length just before an arrival)? Recall PASTA.
3-161
Distribution Upon Arrival or Departure
α(t), β(t): number of arrivals and departures (respectively) in (0, t)
Un(t): number of times the system goes from n to n + 1 in (0, t);
number of times an arriving customer finds n customers in the system
Vn(t): number of times that the system goes from n + 1 to n;
number of times a departing customer leaves n.
– The transition n → n + 1 cannot reoccur until the number in the
system drops to n once more (i.e., until the transition n + 1 → n
reoccurs), so U_n(t) and V_n(t) differ by at most one: |U_n(t) − V_n(t)| ≤ 1.
lim_{t→∞} U_n(t)/t = lim_{t→∞} V_n(t)/t
⇒ lim_{t→∞} (U_n(t)/α(t)) · (α(t)/t) = lim_{t→∞} (V_n(t)/β(t)) · (β(t)/t),
i.e., the fraction of arrivals that find n in the system equals the fraction of
departures that leave n behind.
3-162
M/G/1 queue: Embedded MC II
Defining the probability generating function of the distribution of X_{n+1},
Q_{n+1}(z) ≜ E[z^{X_{n+1}}] = E[z^{max(X_n−1,0)+Y_{n+1}}] = E[z^{max(X_n−1,0)}] E[z^{Y_{n+1}}]
Let U_{n+1}(z) = E[z^{Y_{n+1}}]; as n → ∞, U_{n+1}(z) = U(z) (independent
of n). Then we have
Q_{n+1}(z) = U(z) Σ_{k=0}^{∞} z^k Pr[max(X_n − 1, 0) = k]
 = U(z) ( z^0 Pr[X_n = 0] + Σ_{k=1}^{∞} z^{k−1} Pr[X_n = k] )
 = U(z) ( Pr[X_n = 0] + z^{−1} (Q_n(z) − Pr[X_n = 0]) )
As n → ∞, we have Q_{n+1}(z) = Q_n(z) = Q(z) and Pr[X_n = 0] = q_0, so
Q(z) = (U(z)(z − 1)/(z − U(z))) q_0.
3-163
M/G/1 queue: Embedded MC III
We need to find U(z) and q_0. Using U(z | x_i = x) = e^{λx(z−1)},
U(z) = ∫_0^∞ U(z | x_i = x) b(x) dx = B*(λ(1 − z)).
Since Q(1) = 1, we have q_0 = 1 − U'(1) = 1 − λE[X] = 1 − ρ.
The transform version of the P-K formula is
Q(z) = (B*(λ(1 − z))(z − 1)/(z − B*(λ(1 − z)))) (1 − ρ).
Letting q̄ = Q'(1), one gets W = q̄/λ − E[X].
Sojourn time distribution of an M/G/1 system with FIFO service:
if a customer spends T_j sec in the system, the number of customers it
leaves behind in the system is the number of customers that arrive during
these T_j sec, due to FIFO.
3-164
M/G/1 Queue: Embedded MC IV
Let f_T(t) be the probability density function of T, i.e., the total delay. Then
Q(z) = Σ_{k=0}^{∞} z^k ∫_0^∞ ((λt)^k/k!) e^{−λt} f_T(t) dt = T*(λ(1 − z)),
where T*(s) is the Laplace transform of f_T(t). We have
T*(λ(1 − z)) = (B*(λ(1 − z))(z − 1)/(z − B*(λ(1 − z)))) (1 − ρ)
Letting s = λ(1 − z), one gets
T*(s) = (1 − ρ) s B*(s)/(s − λ + λB*(s)) = W*(s) B*(s)
⇒ W*(s) = (1 − ρ)s/(s − λ + λB*(s))
In an M/M/1 system, we have B*(s) = µ/(s + µ):
W*(s) = (1 − ρ)(1 + λ/(s + µ − λ))
3-165
Delay analysis of an ARQ system
Consider a Go-Back-N ARQ system with window size W, where a packet
is successfully transmitted with probability 1 − p
• Packet arrivals to the transmitter's queue follow a Poisson process with
rate λ (packets/slot)
(Figure 3.17, BG: illustration of the effective service times of packets in
the ARQ system of Example 3.15 — e.g., packet 2 has an effective service
time of 1 + W slots because there was an error in the first attempt to
transmit it following the last transmission of packet 1, but no error in the
second attempt.)
• We need the first two moments of the service time to use P-K
formula
E[X] = Σ_{k=0}^{∞} (1 + kW)(1 − p)p^k = 1 + Wp/(1 − p)
E[X²] = Σ_{k=0}^{∞} (1 + kW)²(1 − p)p^k = 1 + 2Wp/(1 − p) + W²(p + p²)/(1 − p)²
3-166
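These moments feed straight into the P-K formula derived below (a
sketch; the parameter values are illustrative):

```python
def arq_wait(lam, p, W):
    """Mean queueing delay (slots) for Go-Back-N ARQ via P-K:
    effective service time is 1 + k*W slots after k errored attempts."""
    X1 = 1 + W * p / (1 - p)                                           # E[X]
    X2 = 1 + 2 * W * p / (1 - p) + W * W * (p + p * p) / (1 - p) ** 2  # E[X^2]
    rho = lam * X1
    assert rho < 1, "queue must be stable"
    return lam * X2 / (2 * (1 - rho))

print(arq_wait(lam=0.3, p=0.1, W=4))
```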
M/G/1 queue: Mean value analysis I
For queueing systems with a general and independent service time
distribution, G, a continuous-time MC is described by (n, r):
• n: number of customers in the system.
• r: attained or remaining service time of the customer in service.
W_i = R_i + Σ_{j=i−N_i}^{i−1} X_j,
where W_i, R_i and X_j are the waiting time in queue of customer i, the
residual service time seen by customer i, and the service time of customer
j. Taking expectations and using the independence among the X_j,
E[W_i] ≜ W = E[R_i] + E[ Σ_{j=i−N_i}^{i−1} E[X_j | N_i] ] = R + (1/µ) N_q
Since N_q = λW and E[R_i] = R for all i, we have W = R/(1 − ρ).
3-167
M/G/1 queue: Mean value analysis II
The time-averaged residual service time r(τ) over the interval [0, t] is
R(t) = (1/t) ∫_0^t r(τ) dτ = (1/t) Σ_{i=1}^{M(t)} X_i²/2
     = (1/2) (M(t)/t) (Σ_{i=1}^{M(t)} X_i² / M(t))
– M(t) is the number of service completions within [0, t].
(Figure: upon a new service of duration X_i, r(τ) jumps to X_i and decays
linearly to 0 over X_i time units)
As t → ∞, R(∞) = R = λE[X²]/2.
From the hitchhiker's paradox, E[R'] = E[X²]/(2E[X]), so
R = 0 · Pr[N(t) = 0] + E[R'] Pr[N(t) > 0] = (E[X²]/(2E[X])) · λE[X] = λE[X²]/2.
3-168
M/G/1 queue: Mean value analysis III
Pollaczek–Khinchin (P-K) formula for the mean waiting time in queue:
W = λE[X²]/(2(1 − ρ))   [E[X²] = σ_X² + E[X]²]
 = λ(σ_X² + E[X]²)/(2(1 − ρ)) = ((1 + C_x²)/2) (ρ/(1 − ρ)) E[X] = ((1 + C_x²)/2) W_{M/M/1}
– C_x² = σ_X²/E[X]² is the squared coefficient of variation of the service
time.
E.g., since C_x = 1 in an M/M/1 and C_x = 0 in an M/D/1,
W_{M/M/1} = (ρ/(1 − ρ)) E[X] > W_{M/D/1} = (ρ/(2(1 − ρ))) E[X]
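The comparison in code (a sketch; names and values ours):

```python
def pk_wait(lam, mean_x, scv):
    """P-K: W = (1 + C_x^2)/2 * rho/(1 - rho) * mean_x."""
    rho = lam * mean_x
    assert rho < 1
    return 0.5 * (1.0 + scv) * rho / (1.0 - rho) * mean_x

lam, mx = 0.8, 1.0
print(pk_wait(lam, mx, scv=1.0))  # M/M/1: 4.0
print(pk_wait(lam, mx, scv=0.0))  # M/D/1: 2.0
```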
3-169
M/G/1 Queue with vacations I
The server takes a vacation at the end of each busy period
• It takes an additional vacation if no customers are found at the end of
a vacation: V_1, V_2, . . . are the durations of the successive vacations
• A customer that finds the system idle (on vacation) waits for the end
of the vacation period
(Figure 3.12, BG: an M/G/1 system with vacations — at the end of a busy
period, the server goes on vacation for time V with first and second
moments V̄ and E[V²]; if the system is empty at the end of a vacation,
the server takes a new vacation, and an arrival to an empty system must
wait until the end of the current vacation to get service. Figure 3.13:
residual service times, with busy periods alternating with vacation periods.)
• Residual service time including vacation periods:
(1/t) ∫_0^t r(τ) dτ = (1/t) Σ_{i=1}^{M(t)} X_i²/2 + (1/t) Σ_{i=1}^{L(t)} V_i²/2
– M(t): number of services completed by time t
– L(t): number of vacations completed by time t
3-170
M/G/1 Queue with vacations II
• The residual time including vacation periods is rewritten as
(1/t) ∫_0^t r(τ) dτ   [→ R as t → ∞]
 = (M(t)/t)[→ λ] · (Σ_{i=1}^{M(t)} X_i²/2)/M(t) + (L(t)/t)[→ (1 − ρ)/V̄] · (Σ_{i=1}^{L(t)} V_i²/2)/L(t)
 = λE[X²]/2 + (1 − ρ)E[V²]/(2V̄) = R
• Using W = R/(1 − ρ), we have
W = λE[X²]/(2(1 − ρ)) + E[V²]/(2V̄)
– the sum of the waiting time in an M/G/1 queue and the mean residual
vacation time
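Adding the residual-vacation term to the P-K sketch above (values
illustrative):

```python
def mg1_vacation_wait(lam, x1, x2, v1, v2):
    """W = lam E[X^2] / (2 (1 - rho)) + E[V^2] / (2 E[V])."""
    rho = lam * x1
    assert rho < 1
    return lam * x2 / (2 * (1 - rho)) + v2 / (2 * v1)

print(mg1_vacation_wait(lam=0.5, x1=1.0, x2=2.0, v1=2.0, v2=5.0))  # 2.25
```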
3-171
M/G/1 Queue with vacations: Embedded Markov
chain approach
Observing the queue either at the end of a vacation or at the end of a
service period:
• α_k: probability that k customers are found at the end of a vacation
period:
α_k = ∫_0^∞ ((λx)^k/k!) e^{−λx} v(x) dx
– v(x) is the pdf of the length of a vacation period
• a_k: probability that k customers are found at the end of a service time:
a_k = ∫_0^∞ ((λx)^k/k!) e^{−λx} b(x) dx
• Combining the above, we have the GBE
π_k = π_0 Σ_{m=1}^{k+1} α_m a_{k−m+1} + Σ_{j=1}^{k+1} π_j a_{k−j+1}
3-172
Queueing theory
Queueing theory

More Related Content

What's hot

Exponential probability distribution
Exponential probability distributionExponential probability distribution
Exponential probability distributionMuhammad Bilal Tariq
 
ENTROPY STABLE ENO SCHEMES
ENTROPY STABLE ENO SCHEMESENTROPY STABLE ENO SCHEMES
ENTROPY STABLE ENO SCHEMESHamed Zakerzadeh
 
Random Number Generation
Random Number GenerationRandom Number Generation
Random Number GenerationRaj Bhatt
 
Network Layer design Issues.pptx
Network Layer design Issues.pptxNetwork Layer design Issues.pptx
Network Layer design Issues.pptxAcad
 
Gomory's cutting plane method
Gomory's cutting plane methodGomory's cutting plane method
Gomory's cutting plane methodRajesh Piryani
 
Queuing model
Queuing model Queuing model
Queuing model goyalrama
 
Probability Concept and Bayes Theorem
Probability Concept and Bayes TheoremProbability Concept and Bayes Theorem
Probability Concept and Bayes TheoremCherryBerry2
 
Bayseian decision theory
Bayseian decision theoryBayseian decision theory
Bayseian decision theorysia16
 
Unit.5. transportation and assignment problems
Unit.5. transportation and assignment problemsUnit.5. transportation and assignment problems
Unit.5. transportation and assignment problemsDagnaygebawGoshme
 
Time advance mehcanism
Time advance mehcanismTime advance mehcanism
Time advance mehcanismNikhil Sharma
 
9 fault-tolerance
9 fault-tolerance9 fault-tolerance
9 fault-tolerance4020132038
 
Probability distribution in R
Probability distribution in RProbability distribution in R
Probability distribution in RAlichy Sowmya
 
CS9222 ADVANCED OPERATING SYSTEMS
CS9222 ADVANCED OPERATING SYSTEMSCS9222 ADVANCED OPERATING SYSTEMS
CS9222 ADVANCED OPERATING SYSTEMSKathirvel Ayyaswamy
 
Random number generators
Random number generatorsRandom number generators
Random number generatorsBob Landstrom
 
ELEMENTS OF TRANSPORT PROTOCOL
ELEMENTS OF TRANSPORT PROTOCOLELEMENTS OF TRANSPORT PROTOCOL
ELEMENTS OF TRANSPORT PROTOCOLShashank Rustagi
 
Forecasting exponential smoothing
Forecasting exponential smoothingForecasting exponential smoothing
Forecasting exponential smoothingDoiyan
 
Distributed Mutual Exclusion and Distributed Deadlock Detection
Distributed Mutual Exclusion and Distributed Deadlock DetectionDistributed Mutual Exclusion and Distributed Deadlock Detection
Distributed Mutual Exclusion and Distributed Deadlock DetectionSHIKHA GAUTAM
 

What's hot (20)

Exponential probability distribution
Exponential probability distributionExponential probability distribution
Exponential probability distribution
 
ENTROPY STABLE ENO SCHEMES
ENTROPY STABLE ENO SCHEMESENTROPY STABLE ENO SCHEMES
ENTROPY STABLE ENO SCHEMES
 
Random Number Generation
Random Number GenerationRandom Number Generation
Random Number Generation
 
Network Layer design Issues.pptx
Network Layer design Issues.pptxNetwork Layer design Issues.pptx
Network Layer design Issues.pptx
 
Gomory's cutting plane method
Gomory's cutting plane methodGomory's cutting plane method
Gomory's cutting plane method
 
Queuing model
Queuing model Queuing model
Queuing model
 
Probability Concept and Bayes Theorem
Probability Concept and Bayes TheoremProbability Concept and Bayes Theorem
Probability Concept and Bayes Theorem
 
Bayseian decision theory
Bayseian decision theoryBayseian decision theory
Bayseian decision theory
 
Unit.5. transportation and assignment problems
Unit.5. transportation and assignment problemsUnit.5. transportation and assignment problems
Unit.5. transportation and assignment problems
 
Time advance mehcanism
Time advance mehcanismTime advance mehcanism
Time advance mehcanism
 
9 fault-tolerance
9 fault-tolerance9 fault-tolerance
9 fault-tolerance
 
Probability distribution in R
Probability distribution in RProbability distribution in R
Probability distribution in R
 
Bayes Theorem
Bayes TheoremBayes Theorem
Bayes Theorem
 
CS9222 ADVANCED OPERATING SYSTEMS
CS9222 ADVANCED OPERATING SYSTEMSCS9222 ADVANCED OPERATING SYSTEMS
CS9222 ADVANCED OPERATING SYSTEMS
 
Data Dissemination and Synchronization
Data Dissemination and SynchronizationData Dissemination and Synchronization
Data Dissemination and Synchronization
 
Random number generators
Random number generatorsRandom number generators
Random number generators
 
The medium access sublayer
 The medium  access sublayer The medium  access sublayer
The medium access sublayer
 
ELEMENTS OF TRANSPORT PROTOCOL
ELEMENTS OF TRANSPORT PROTOCOLELEMENTS OF TRANSPORT PROTOCOL
ELEMENTS OF TRANSPORT PROTOCOL
 
Forecasting exponential smoothing
Forecasting exponential smoothingForecasting exponential smoothing
Forecasting exponential smoothing
 
Distributed Mutual Exclusion and Distributed Deadlock Detection
Distributed Mutual Exclusion and Distributed Deadlock DetectionDistributed Mutual Exclusion and Distributed Deadlock Detection
Distributed Mutual Exclusion and Distributed Deadlock Detection
 

Viewers also liked

solution manual of goldsmith wireless communication
solution manual of goldsmith wireless communicationsolution manual of goldsmith wireless communication
solution manual of goldsmith wireless communicationNIT Raipur
 
Probability Theory and Mathematical Statistics in Tver State University
Probability Theory and Mathematical Statistics in Tver State UniversityProbability Theory and Mathematical Statistics in Tver State University
Probability Theory and Mathematical Statistics in Tver State Universitymetamath
 
Stochastic modelling and its applications
Stochastic modelling and its applicationsStochastic modelling and its applications
Stochastic modelling and its applicationsKartavya Jain
 
Theories of Dissolution
Theories of DissolutionTheories of Dissolution
Theories of DissolutionPRASHANT DEORE
 
Queuing theory
Queuing theoryQueuing theory
Queuing theoryAmit Sinha
 
Queuing Theory - Operation Research
Queuing Theory - Operation ResearchQueuing Theory - Operation Research
Queuing Theory - Operation ResearchManmohan Anand
 
QUEUING THEORY
QUEUING THEORYQUEUING THEORY
QUEUING THEORYavtarsingh
 
cellular concepts in wireless communication
cellular concepts in wireless communicationcellular concepts in wireless communication
cellular concepts in wireless communicationasadkhan1327
 
T Test For Two Independent Samples
T Test For Two Independent SamplesT Test For Two Independent Samples
T Test For Two Independent Samplesshoffma5
 

Viewers also liked (11)

Availability
AvailabilityAvailability
Availability
 
solution manual of goldsmith wireless communication
solution manual of goldsmith wireless communicationsolution manual of goldsmith wireless communication
solution manual of goldsmith wireless communication
 
Probability Theory and Mathematical Statistics in Tver State University
Probability Theory and Mathematical Statistics in Tver State UniversityProbability Theory and Mathematical Statistics in Tver State University
Probability Theory and Mathematical Statistics in Tver State University
 
Stochastic modelling and its applications
Stochastic modelling and its applicationsStochastic modelling and its applications
Stochastic modelling and its applications
 
Theories of Dissolution
Theories of DissolutionTheories of Dissolution
Theories of Dissolution
 
Queuing theory
Queuing theoryQueuing theory
Queuing theory
 
Queuing Theory - Operation Research
Queuing Theory - Operation ResearchQueuing Theory - Operation Research
Queuing Theory - Operation Research
 
QUEUING THEORY
QUEUING THEORYQUEUING THEORY
QUEUING THEORY
 
cellular concepts in wireless communication
cellular concepts in wireless communicationcellular concepts in wireless communication
cellular concepts in wireless communication
 
Queueing theory
Queueing theoryQueueing theory
Queueing theory
 
T Test For Two Independent Samples
T Test For Two Independent SamplesT Test For Two Independent Samples
T Test For Two Independent Samples
 

Similar to Queueing theory

Stat150 spring06 markov_cts
Stat150 spring06 markov_ctsStat150 spring06 markov_cts
Stat150 spring06 markov_ctsmbfrosh
 
lecture 10 formal methods in software enginnering.pptx
lecture 10 formal methods in software enginnering.pptxlecture 10 formal methods in software enginnering.pptx
lecture 10 formal methods in software enginnering.pptxSohaibAlviWebster
 
Proceedings A Method For Finding Complete Observables In Classical Mechanics
Proceedings A Method For Finding Complete Observables In Classical MechanicsProceedings A Method For Finding Complete Observables In Classical Mechanics
Proceedings A Method For Finding Complete Observables In Classical Mechanicsvcuesta
 
Recursive State Estimation AI for Robotics.pdf
Recursive State Estimation AI for Robotics.pdfRecursive State Estimation AI for Robotics.pdf
Recursive State Estimation AI for Robotics.pdff20220630
 
12 Machine Learning Supervised Hidden Markov Chains
12 Machine Learning  Supervised Hidden Markov Chains12 Machine Learning  Supervised Hidden Markov Chains
12 Machine Learning Supervised Hidden Markov ChainsAndres Mendez-Vazquez
 
Theoryofcomp science
Theoryofcomp scienceTheoryofcomp science
Theoryofcomp scienceRaghu nath
 
Papoulis%20 solutions%20manual%204th%20edition 1
Papoulis%20 solutions%20manual%204th%20edition 1Papoulis%20 solutions%20manual%204th%20edition 1
Papoulis%20 solutions%20manual%204th%20edition 1awsmajeedalawadi
 
Block Cipher vs. Stream Cipher
Block Cipher vs. Stream CipherBlock Cipher vs. Stream Cipher
Block Cipher vs. Stream CipherAmirul Wiramuda
 
Phase-Type Distributions for Finite Interacting Particle Systems
Phase-Type Distributions for Finite Interacting Particle SystemsPhase-Type Distributions for Finite Interacting Particle Systems
Phase-Type Distributions for Finite Interacting Particle SystemsStefan Eng
 
DissertationSlides169
DissertationSlides169DissertationSlides169
DissertationSlides169Ryan White
 
Effects of poles and zeros affect control system
Effects of poles and zeros affect control systemEffects of poles and zeros affect control system
Effects of poles and zeros affect control systemGopinath S
 
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithmsRao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithmsChristian Robert
 

Similar to Queueing theory (20)

Stat150 spring06 markov_cts
Stat150 spring06 markov_ctsStat150 spring06 markov_cts
Stat150 spring06 markov_cts
 
lecture 10 formal methods in software enginnering.pptx
lecture 10 formal methods in software enginnering.pptxlecture 10 formal methods in software enginnering.pptx
lecture 10 formal methods in software enginnering.pptx
 
Probable
ProbableProbable
Probable
 
Proceedings A Method For Finding Complete Observables In Classical Mechanics
Proceedings A Method For Finding Complete Observables In Classical MechanicsProceedings A Method For Finding Complete Observables In Classical Mechanics
Proceedings A Method For Finding Complete Observables In Classical Mechanics
 
stochastic processes assignment help
stochastic processes assignment helpstochastic processes assignment help
stochastic processes assignment help
 
Mcqmc talk
Mcqmc talkMcqmc talk
Mcqmc talk
 
Jere Koskela slides
Jere Koskela slidesJere Koskela slides
Jere Koskela slides
 
Recursive State Estimation AI for Robotics.pdf
Recursive State Estimation AI for Robotics.pdfRecursive State Estimation AI for Robotics.pdf
Recursive State Estimation AI for Robotics.pdf
 
Microeconomics-Help-Experts.pptx
Microeconomics-Help-Experts.pptxMicroeconomics-Help-Experts.pptx
Microeconomics-Help-Experts.pptx
 
Semi markov process
Semi markov processSemi markov process
Semi markov process
 
12 Machine Learning Supervised Hidden Markov Chains
12 Machine Learning  Supervised Hidden Markov Chains12 Machine Learning  Supervised Hidden Markov Chains
12 Machine Learning Supervised Hidden Markov Chains
 
Theoryofcomp science
Theoryofcomp scienceTheoryofcomp science
Theoryofcomp science
 
Papoulis%20 solutions%20manual%204th%20edition 1
Papoulis%20 solutions%20manual%204th%20edition 1Papoulis%20 solutions%20manual%204th%20edition 1
Papoulis%20 solutions%20manual%204th%20edition 1
 
Block Cipher vs. Stream Cipher
Block Cipher vs. Stream CipherBlock Cipher vs. Stream Cipher
Block Cipher vs. Stream Cipher
 
Markov chain
Markov chainMarkov chain
Markov chain
 
Phase-Type Distributions for Finite Interacting Particle Systems
Phase-Type Distributions for Finite Interacting Particle SystemsPhase-Type Distributions for Finite Interacting Particle Systems
Phase-Type Distributions for Finite Interacting Particle Systems
 
DissertationSlides169
DissertationSlides169DissertationSlides169
DissertationSlides169
 
Effects of poles and zeros affect control system
Effects of poles and zeros affect control systemEffects of poles and zeros affect control system
Effects of poles and zeros affect control system
 
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithmsRao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
Rao-Blackwellisation schemes for accelerating Metropolis-Hastings algorithms
 
Stochastic Processes Assignment Help
Stochastic Processes Assignment HelpStochastic Processes Assignment Help
Stochastic Processes Assignment Help
 

Recently uploaded

VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130Suhani Kapoor
 
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escortsranjana rawat
 
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur Escorts
Call Girls Service Nagpur Tanvi Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )Tsuyoshi Horigome
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)Suman Mia
 
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...
Queueing theory

Random (or Stochastic) Processes

General notion
• Suppose a random experiment specified by the outcomes ζ from some sample space S, with ζ ∈ S
• A random (or stochastic) process is a mapping of ζ to a function of time t: X(t, ζ)
– For fixed t, e.g., t1, t2, . . .: X(ti, ζ) is a random variable
– For fixed ζ: X(t, ζi) is a sample path or realization
– e.g., # of frames in a transmitter’s queue, # of rickshaws at the IIT main gate
3-9

Discrete-time Markov process I

A sequence of integer-valued random variables, Xn, n = 0, 1, . . ., is called a discrete-time Markov process if the following Markov property holds:
Pr[X_{n+1} = j | X_n = i, X_{n−1} = i_{n−1}, . . . , X_0 = i_0] = Pr[X_{n+1} = j | X_n = i]
• State: the value of X_n at time n in the set S
• State space: the set S = {n | n = 0, 1, . . .}
– An integer-valued Markov process is called a Markov chain (MC)
Time-homogeneous, if for any n, p_ij = Pr[X_{n+1} = j | X_n = i] (independent of time n), which is called the one-step (state) transition probability.
3-10

Discrete-time Markov process II

State transition probability matrix: P = [p_ij], whose row i is (p_i0, p_i1, p_i2, . . .). P is called a stochastic matrix, with p_ij ≥ 0 and Σ_{j=0}^∞ p_ij = 1.
n-step transition probability matrix: P^n = [p_ij^(n)] with n-step transition probabilities
p_ij^(n) = Pr[X_{l+n} = j | X_l = i] for n ≥ 0 and i, j ≥ 0.
3-11

Discrete-time Markov process III

The Chapman-Kolmogorov equations:
p_ij^(n+m) = Σ_{k=0}^∞ p_ik^(n) p_kj^(m) for n, m ≥ 0, i, j ∈ S
Proof:
Pr[X_{n+m} = j | X_0 = i]
= Σ_{k∈S} Pr[X_{n+m} = j | X_0 = i, X_n = k] Pr[X_n = k | X_0 = i]   (Markov property)
= Σ_{k∈S} Pr[X_{n+m} = j | X_n = k] Pr[X_n = k | X_0 = i]   (time homogeneity)
= Σ_{k∈S} Pr[X_m = j | X_0 = k] Pr[X_n = k | X_0 = i]
In matrix form, P^{n+m} = P^n P^m ⇒ P^{n+1} = P^n P.
3-12
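As a quick numerical check of the Chapman-Kolmogorov equations, the following MATLAB sketch (ours, with an assumed 3-state stochastic matrix) verifies P^(n+m) = P^n P^m:

% Verify the Chapman-Kolmogorov equations for an assumed stochastic matrix
P = [0.2 0.5 0.3; 0.1 0.6 0.3; 0.4 0.4 0.2];
n = 2; m = 3;
lhs = P^(n+m);              % (n+m)-step transition probabilities
rhs = P^n * P^m;            % composing n-step and m-step transitions
max(max(abs(lhs - rhs)))    % numerically zero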
Discrete-time Markov process IV

A mouse in a maze of nine cells (states 7 and 9 are absorbing: one holds the cat, the other the cheese)
• The mouse chooses the next cell to visit with probability 1/k, where k is the number of adjacent cells.
• The mouse does not move any more once it is caught by the cat or it has the cheese.
P (rows/columns ordered 1, . . . , 9):
1: [  0  1/2   0  1/2   0    0    0    0    0 ]
2: [ 1/3   0  1/3   0  1/3   0    0    0    0 ]
3: [  0  1/2   0    0    0  1/2   0    0    0 ]
4: [ 1/3   0    0    0  1/3   0  1/3   0    0 ]
5: [  0  1/4   0  1/4   0  1/4   0  1/4   0 ]
6: [  0    0  1/3   0  1/3   0    0    0  1/3 ]
7: [  0    0    0    0    0    0    1    0    0 ]
8: [  0    0    0    0  1/3   0  1/3   0  1/3 ]
9: [  0    0    0    0    0    0    0    0    1 ]
3-13

Discrete-time Markov process V

In a place, the weather each day is classified as sunny, cloudy or rainy. The next day’s weather depends only on the weather of the present day and not on the weather of the previous days. If the present day is sunny, the next day will be sunny, cloudy or rainy with respective probabilities 0.70, 0.10 and 0.20. The transition probabilities are 0.50, 0.25 and 0.25 when the present day is cloudy, and 0.40, 0.30 and 0.30 when the present day is rainy.
P =
      S     C     R
S  [ 0.7   0.1   0.2  ]
C  [ 0.5   0.25  0.25 ]
R  [ 0.4   0.3   0.3  ]
– Using the n-step transition probability matrix,
P^3 =
[ 0.601 0.168 0.230 ]
[ 0.591 0.176 0.233 ]
[ 0.585 0.180 0.235 ]
and P^12 =
[ 0.596 0.172 0.231 ]
[ 0.596 0.172 0.231 ]
[ 0.596 0.172 0.231 ]
= P^13
3-14

Discrete-time Markov process VI

State probabilities at time n
– π_i^(n) = Pr[X_n = i] and π^(n) = [π_0^(n), . . . , π_i^(n), . . .] (row vector)
– π_i^(0): the initial state probability
Pr[X_n = j] = Σ_{i∈S} Pr[X_n = j | X_0 = i] Pr[X_0 = i], i.e., π_j^(n) = Σ_{i∈S} p_ij^(n) π_i^(0)
– In matrix notation: π^(n) = π^(0) P^n
Limiting distribution: given an initial probability distribution π^(0),
π = lim_{n→∞} π^(n), i.e., π_j = lim_{n→∞} p_ij^(n)
– As n → ∞: π^(n) = π^(n−1) P → π = πP and π·1 = 1
– The system reaches “equilibrium” or “steady-state”
3-15

Discrete-time Markov process VII

Stationary distribution:
– Let z_j denote the probability of being in state j and z = [z_j] its vector, with z = z·P and z·1 = 1
• If z_j is chosen as the initial distribution, i.e., π_j^(0) = z_j for all j, then π_j^(n) = z_j for all n
• A limiting distribution, when it exists, is always a stationary distribution, but the converse is not true:
P = [0 1; 1 0], P² = [1 0; 0 1], P³ = [0 1; 1 0], . . .
Global balance equations: π = πP ⇒ (for each row) π_j Σ_i p_ji = Σ_i π_i p_ij
3-16
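A small MATLAB sketch of the weather example, showing the rows of P^n converging to the limiting distribution:

% Weather chain from page 3-14: rows of P^n converge to the limiting pmf
P = [0.7 0.1 0.2; 0.5 0.25 0.25; 0.4 0.3 0.3];
P3  = P^3      % rows still differ
P12 = P^12     % all rows ~ [0.596 0.172 0.231]
pi0 = [1 0 0];          % start from a sunny day
pi12 = pi0 * P12        % state probabilities after 12 days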
Discrete-time Markov process VIII

Back to the weather example (page 3-14)
• Using πP = π, we have
π0 = 0.7π0 + 0.5π1 + 0.4π2
π1 = 0.1π0 + 0.25π1 + 0.3π2
π2 = 0.2π0 + 0.25π1 + 0.3π2
– Note that one equation is always redundant
• Replacing one equation by 1 = π0 + π1 + π2, we have
[  0.3  −0.5  −0.4 ] [π0]   [0]
[ −0.1  0.75  −0.3 ] [π1] = [0]
[  1     1     1   ] [π2]   [1]
⇒ π0 = 0.596, π1 = 0.1722, π2 = 0.2318
3-17

Discrete-time Markov process IX

Classes of states:
• State j is accessible from state i if p_ij^(n) > 0 for some n
• States i and j communicate if they are accessible from each other
• Two states belong to the same class if they communicate with each other
• An MC having a single class is said to be irreducible
Recurrence property:
• State j is recurrent if Σ_{n=1}^∞ p_jj^(n) = ∞
– Positive recurrent if π_j > 0
– Null recurrent if π_j = 0
• State j is transient if Σ_{n=1}^∞ p_jj^(n) < ∞
3-18

Discrete-time Markov process X

Periodicity:
• State i has period d if p_ii^(n) = 0 whenever n is not a multiple of d, where d is the largest integer with this property.
• State i is aperiodic if it has period d = 1.
• All states in a class have the same period
– An irreducible Markov chain is said to be aperiodic if the states in its single class have period one
A state that is positive recurrent and aperiodic is called ergodic.
3-19

Discrete-time Markov process XI

In a place, a mosquito is produced every hour with probability p, and dies with probability 1 − p.
• Using the global balance equations,
pπ_i = (1 − p)π_{i+1} → π_{i+1} = (p/(1 − p)) π_i, i.e., π_i = (p/(1 − p))^i π_0
All states are positive recurrent if p < 1/2, null recurrent if p = 1/2 (see Σ_{i=0}^∞ π_i = 1), and transient if p > 1/2.
3-20
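The bordered linear system above can be solved directly; a minimal MATLAB sketch for the weather chain:

% Solve the stationary equations with one balance equation replaced by
% the normalization condition (page 3-17)
A = [0.3 -0.5 -0.4; -0.1 0.75 -0.3; 1 1 1];   % last row: pi0+pi1+pi2 = 1
b = [0; 0; 1];
p = (A \ b).'        % ~[0.5960 0.1722 0.2318]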
Drift and Stability I

Suppose an irreducible, aperiodic, discrete-time MC.
• The chain is ‘stable’ if π_j > 0 for all j
• Drift is defined as D_i = E[X_{n+1} − X_n | X_n = i] = Σ_{k=−i}^∞ k p_{i(i+k)}
– If D_i > 0, the process tends to move up to higher states from state i
– If D_i < 0, the process tends to visit lower states from state i
– In the mosquito example on the previous slide, D_i = 1·p − 1·(1 − p) = 2p − 1
Pakes’ lemma: if
1) D_i < ∞ for all i, and
2) for some scalar δ > 0 and integer ī ≥ 0, D_i ≤ −δ for all i > ī,
then the MC has a stationary distribution.
3-21

Drift and Stability II

Proof: let β = max_{i≤ī} D_i (see page 264 in the textbook).
E[X_n | X_0 = i] − i = E[X_n − X_{n−1} + X_{n−1} − X_{n−2} + · · · + X_1 − X_0 | X_0 = i]
= Σ_{k=1}^n E[X_k − X_{k−1} | X_0 = i]
= Σ_{k=1}^n Σ_{j=0}^∞ E[X_k − X_{k−1} | X_{k−1} = j] Pr[X_{k−1} = j | X_0 = i]
≤ Σ_{k=1}^n { β Σ_{j=0}^{ī} Pr[X_{k−1} = j | X_0 = i] − δ (1 − Σ_{j=0}^{ī} Pr[X_{k−1} = j | X_0 = i]) }
= (β + δ) Σ_{k=1}^n Σ_{j=0}^{ī} Pr[X_{k−1} = j | X_0 = i] − nδ
3-22

Drift and Stability III

from which we can get
0 ≤ E[X_n | X_0 = i] ≤ n { (β + δ) Σ_{j=0}^{ī} (1/n) Σ_{k=1}^n p_ij^(k−1) − δ } + i
Dividing by n and letting n → ∞ yields
0 ≤ (β + δ) Σ_{j=0}^{ī} π_j − δ
– π_j = lim_{n→∞} (1/n) Σ_{k=1}^n p_ij^(k) (Cesàro limit) = lim_{n→∞} p_ij^(n)
Hence π_j > 0 for some j ∈ {0, . . . , ī}.
3-23

Computational methods I

Infinite-state MC
• Probability generating function (PGF) of a probability distribution:
G(z) = E[z^X] = Σ_{i=0}^∞ z^i Pr[X = i], with Pr[X = i] = π_i
– G(1) = 1
– dG(z)/dz |_{z=1} = E[X], and d²G(z)/dz² |_{z=1} = E[X²] − E[X]
• For the mosquito example on page 3-20,
G(z) = π_0 Σ_{i=0}^∞ (pz/(1 − p))^i = π_0 (1 − p)/((1 − p) − zp)
Finite-state MC
• Direct method
– π = πP → π(P − I) = 0 and π·1 = 1 → π·E = 1, where E is a matrix of all ones
– π(P + E − I) = 1 ⇒ π = 1(P + E − I)^{−1}
3-24
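The direct method is a one-liner in MATLAB; a minimal sketch on the weather chain (here 1 is a row vector of ones):

% Direct method, page 3-24: pi = 1(P + E - I)^(-1)
P = [0.7 0.1 0.2; 0.5 0.25 0.25; 0.4 0.3 0.3];
n = size(P,1);
E = ones(n);                        % matrix of all ones
p = ones(1,n) / (P + E - eye(n))    % ~[0.596 0.172 0.232]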
Computational methods II

• Successive overrelaxation (SOR)
π_i = Σ_{k=0}^N p_ki π_k  →  π_i = Σ_{k=0, k≠i}^N a_ki π_k, with a_ki = p_ki/(1 − p_ii)
1. Choose π_i^(1) such that Σ_{i=0}^N π_i^(1) = 1, and 0 ≤ ω ≤ 2
2. For each iteration k, compute
π_i^(k) = (1 − ω) π_i^(k−1) + ω ( Σ_{j=0}^{i−1} a_ji π_j^(k) + Σ_{j=i+1}^N a_ji π_j^(k−1) )
For ω = 1 this iteration becomes the Gauss-Seidel method
3. Check whether Σ_{i=0}^N |π_i^(k) − π_i^(k−1)| ≤ ε Σ_{i=0}^N π_i^(k) for a small tolerance ε; if satisfied, go to Step 4, otherwise go to Step 2
4. Compute the state probabilities as π_i* = π_i^(k) / Σ_{j=0}^N π_j^(k)
3-25

Speech model I

A Markov model for packet speech assumes that if the nth packet contains silence, then the probability of silence in the next packet is 1 − α and the probability of speech activity is α. Similarly, if the nth packet contains speech activity, then the probability of speech activity in the next packet is 1 − β and the probability of silence is β.
• Find the state transition probability matrix P:
              silence  talkspurt
silence     [  1 − α      α     ]
talkspurt   [    β      1 − β   ]
3-26

Speech model II

What if, at a transmitter, a frame is generated during talkspurts (with some given probability) according to the previous Markov speech model, while each frame is successfully transmitted with probability p in each time slot? Assume that ACK/NAK arrive before the beginning of the next slot.
• Two-dimensional Markov chain
– {(k, i): k is the # of frames and i is the source state}
– The queue length evolves as Q_{n+1} = max(Q_n − T_n, 0) + A_n
3-27

Simplified Google page rank model

A Web surfer browses pages in a five-page Web universe. The surfer selects the next page to view by choosing with equal probability among the pages pointed to by the current page. If a page has no outgoing link (e.g., page 2), the surfer selects any of the pages in the universe with equal probability. Find the probability that the surfer views page i.
3-28
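A minimal MATLAB sketch of the SOR iteration above, run on the weather chain for concreteness; the relaxation factor ω = 1.2 and the tolerance ε are assumed values (ω = 1 reduces to Gauss-Seidel):

% SOR for pi = pi*P (pages 3-17 and 3-25)
P = [0.7 0.1 0.2; 0.5 0.25 0.25; 0.4 0.3 0.3];
N = size(P,1); omega = 1.2; epsilon = 1e-12;
A = P ./ (1 - diag(P)).';          % a_ki = p_ki/(1 - p_ii)
A(1:N+1:end) = 0;                  % keep only the k ~= i terms
p = ones(1,N)/N;                   % initial guess summing to one
while true
    p_old = p;
    for i = 1:N                    % in-place sweep: entries 1..i-1 are new
        p(i) = (1-omega)*p_old(i) + omega*(p*A(:,i));
    end
    if sum(abs(p - p_old)) <= epsilon*sum(p), break; end
end
p = p / sum(p)                     % normalize: ~[0.596 0.172 0.232]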
Summary of discrete-time MC (DTMC)

Stochastic modeling with a DTMC:
• Inspect whether X_n is a Markov process or not
• Build a state transition probability matrix P (or diagram)
• Examine under what condition X_n is stable
– e.g., use the drift: for n ≥ n̄, D_n (= expected input rate − expected output rate) < 0
• Solve π·P = π and π·1 = 1, or π_j = Σ_i π_i p_ij
• Compute other metrics with π, when necessary
3-29

Continuous-time Markov process I

Memoryless property of the exponential distribution, F(x) = 1 − e^{−µx} for x ≥ 0:
Pr[X ≤ x0 + x | X > x0] = Pr[x0 < X ≤ x0 + x] / Pr[X > x0]
= (Pr[X ≤ x0 + x] − Pr[X ≤ x0]) / (1 − Pr[X ≤ x0])
= 1 − e^{−µx} = Pr[X ≤ x],
which is not a function of x0.
Example: Suppose X is the length of a telephone conversation, exponentially distributed with mean 5 minutes (µ = 1/5). Given that the conversation has been going on for 20 minutes (x0 = 20), the probability that it continues for at most 10 more minutes (x = 10) is
Pr[X ≤ x0 + x | X > x0] = Pr[X ≤ x] = 1 − e^{−10/5}.
Because of this property, we shall see that a Markovian queueing system can be completely specified by the number of customers in the system. A similar result holds for the geometric distribution.
3-30

Continuous-time Markov process II

A stochastic process is called a continuous-time MC if it satisfies
Pr[X(t_{k+1}) = x_{k+1} | X(t_k) = x_k, X(t_{k−1}) = x_{k−1}, . . . , X(t_1) = x_1] = Pr[X(t_{k+1}) = x_{k+1} | X(t_k) = x_k]
X(t) is a time-homogeneous continuous-time MC if
Pr[X(t + s) = j | X(s) = i] = p_ij(t) (independent of s),
which is analogous to p_ij in a discrete-time MC.
(Figure: a sample path of a continuous-time MC; marks indicate the times of state change, and the intervals between them are the sojourn times in each state.)
3-31

Continuous-time Markov process III

State occupancy time
– Let T_i be the sojourn (or occupancy) time of X(t) in state i before making a transition to any other state. For all s ≥ 0 and t ≥ 0, by the Markov property of this process,
Pr[T_i > s + t | T_i > s] = Pr[T_i > t].
Only the exponential distribution satisfies this property, i.e., Pr[T_i > t] = e^{−v_i t}.
State transition rate:
q_ii(δ) = Pr[the process remains in state i during δ sec] = Pr[T_i > δ] = e^{−v_i δ} = 1 − v_i δ + (v_i δ)²/2! − · · · = 1 − v_i δ + o(δ)
Or, equivalently, the rate at which the process moves out of state i is
lim_{δ→0} (1 − q_ii(δ))/δ = lim_{δ→0} (v_i δ + o(δ))/δ = v_i
3-32
Continuous-time Markov process IV

Comparison between discrete-time and continuous-time MCs:
(Figure: in a discrete-time MC, state changes occur at fixed unit time steps; in a continuous-time MC, state changes occur after exponentially distributed sojourn times.)
3-33

Continuous-time Markov process V

A discrete-time MC is embedded in a continuous-time MC.
Each time a state, say i, is entered, an exponentially distributed state occupancy time is selected. When the time is up, the next state j is selected according to transition probabilities p̃_ij.
When the process moves from state i to state j,
q_ij(δ) = (1 − q_ii(δ)) p̃_ij = v_i p̃_ij δ + o(δ) = γ_ij δ + o(δ),
where γ_ij = lim_{δ→0} q_ij(δ)/δ = v_i p̃_ij, i.e., the transition rate from state i to j.
3-34

Continuous-time Markov process VI

State probabilities: π_j(t) = Pr[X(t) = j]. For δ > 0,
π_j(t + δ) = Pr[X(t + δ) = j] = Σ_i Pr[X(t + δ) = j | X(t) = i] Pr[X(t) = i] = Σ_i q_ij(δ) π_i(t)
(compare with π_i = Σ_j p_ji π_j for a DTMC: all transitions into state j are collected)
3-35

Continuous-time Markov process VII

Subtracting π_j(t) from both sides,
π_j(t + δ) − π_j(t) = Σ_i q_ij(δ) π_i(t) − π_j(t) = Σ_{i≠j} q_ij(δ) π_i(t) + (q_jj(δ) − 1) π_j(t)
Dividing both sides by δ and letting δ → 0,
dπ_j(t)/dt = Σ_{i≠j} γ_ij π_i(t) − v_j π_j(t) = Σ_i γ_ij π_i(t), with γ_jj = −v_j,
which is a form of the Chapman-Kolmogorov equations:
dπ_j(t)/dt = Σ_i γ_ij π_i(t)
3-36
Continuous-time Markov process VIII

A queueing system alternates between two states. In state 0, the system is idle and waiting for a customer to arrive; this idle time is an exponential random variable with mean 1/α. In state 1, the system is busy servicing a customer; the time in the busy state is an exponential random variable with mean 1/β. Find the state probabilities π_0(t) and π_1(t) in terms of the initial state probabilities π_0(0) and π_1(0).
• γ_00 = −α, γ_01 = α, γ_10 = β, γ_11 = −β
• From dπ_j(t)/dt = Σ_i γ_ij π_i(t),
π_0′(t) = −α π_0(t) + β π_1(t)
π_1′(t) = α π_0(t) − β π_1(t)
• Using π_0(t) + π_1(t) = 1, we have π_0′(t) = −α π_0(t) + β(1 − π_0(t)), with π_0(0) = p_0
• The general solution of the above is
π_0(t) = β/(α + β) + C e^{−(α+β)t}, with C = p_0 − β/(α + β)
3-37

Continuous-time Markov process IX

As t → ∞, the system reaches ‘equilibrium’ or ‘steady-state’:
dπ_j(t)/dt → 0 and π_j(∞) = π_j
0 = Σ_i γ_ij π_i, or v_j π_j = Σ_{i≠j} γ_ij π_i, with γ_jj = −v_j = −Σ_{i≠j} γ_ji,
which is called the global balance equation, together with Σ_j π_j = 1.
3-38

Continuous-time Markov process X

In matrix form,
dπ(t)/dt = π(t)Q and π(t)·1 = 1,
whose solution is given by π(t) = π(0) e^{Qt}. As t → ∞, π(∞) = π = [π_i], with πQ = 0 and
Q =
[ −v_0   γ_01   γ_02   γ_03  . . . ]
[ γ_10   −v_1   γ_12   γ_13  . . . ]
[ γ_20   γ_21   −v_2   γ_23  . . . ]
[  ...    ...    ...    ...        ]
and π·1 = 1, where Q is called the infinitesimal generator or rate matrix.
3-39

Example: Barber shop I

Customers arrive at a barber shop according to a Poisson process with rate λ. One barber serves customers on a first-come first-served basis. The service time S_i is exponentially distributed with mean 1/µ (sec). The number of customers in the system, N(t) for t ≥ 0, forms a Markov chain:
N(t + τ) = max(N(t) − B(τ), 0) + A(τ)
State transition probabilities (see the properties of the Poisson process):
Pr[0 arrival (or departure) in (t, t + δ)] = 1 − λδ + o(δ) (or 1 − µδ + o(δ))
Pr[1 arrival (or departure) in (t, t + δ)] = λδ + o(δ) (or µδ + o(δ))
Pr[more than 1 arrival (or departure) in (t, t + δ)] = o(δ)
3-40
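The transient solution π(t) = π(0)e^{Qt} can be evaluated with MATLAB’s matrix exponential; a minimal sketch for the two-state chain on page 3-37, with assumed rates α = 2, β = 1:

% Transient state probabilities of the idle/busy chain (assumed rates)
alpha = 2; beta = 1;
Q = [-alpha alpha; beta -beta];     % infinitesimal generator
pi0 = [1 0];                        % start in the idle state
t = 1.5;
pi_t = pi0 * expm(Q*t)              % matrix-exponential solution pi(0)e^{Qt}
% Closed form for comparison (page 3-37)
C = pi0(1) - beta/(alpha+beta);
pi0_closed = beta/(alpha+beta) + C*exp(-(alpha+beta)*t)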
Example: Barber shop II

Find P_n(t) = Pr[N(t) = n]. For n ≥ 1,
P_n(t + δ) = P_n(t) Pr[0 arrival & 0 departure in (t, t + δ)]
+ P_{n−1}(t) Pr[1 arrival & 0 departure in (t, t + δ)]
+ P_{n+1}(t) Pr[0 arrival & 1 departure in (t, t + δ)] + o(δ)
= P_n(t)(1 − λδ)(1 − µδ) + P_{n−1}(t)(λδ)(1 − µδ) + P_{n+1}(t)(1 − λδ)(µδ) + o(δ).
Rearranging and dividing by δ,
(P_n(t + δ) − P_n(t))/δ = −(λ + µ)P_n(t) + λP_{n−1}(t) + µP_{n+1}(t) + o(δ)/δ
As δ → 0, for n > 0 we have
dP_n(t)/dt = −(λ + µ)P_n(t) + λP_{n−1}(t) + µP_{n+1}(t),
where (λ + µ) is the rate out of state n, λ the rate from state n − 1 to n, and µ the rate from state n + 1 to n.
3-41

Example: Barber shop III

For n = 0, we have dP_0(t)/dt = −λP_0(t) + µP_1(t).
As t → ∞, i.e., in steady state, we have P_n(∞) = π_n with dP_n(t)/dt = 0:
λπ_0 = µπ_1
(λ + µ)π_n = λπ_{n−1} + µπ_{n+1} for n ≥ 1
The solution of the above equations is (with ρ = λ/µ)
π_n = ρⁿ π_0 and 1 = π_0 (1 + Σ_{i=1}^∞ ρ^i) ⇒ π_0 = 1 − ρ
3-42

Example: Barber shop IV

ρ: the server’s utilization (< 1, i.e., λ < µ)
Mean number of customers in the system:
E[N] = Σ_{n=0}^∞ n π_n = ρ/(1 − ρ) = ρ (in server) + ρ²/(1 − ρ) (in queue)
(Figure: for an M/M/1 system with 1/µ = 1, number of customers in the system and mean system response time vs. ρ; simulation matches analysis.)
3-43

Example: Barbershop V

Recall the state transition rate matrix Q on page 3-39, with πQ = 0 and π·1 = 1.
– What are γ_ij and v_i in the M/M/1 queue?
γ_ij = λ if j = i + 1; µ if j = i − 1; −(λ + µ) if j = i ≥ 1 (and γ_00 = −λ); 0 otherwise
If a and b denote the interarrival and service time, respectively, then the sojourn time in state i ≥ 1 is min(a, b), which is exponentially distributed with mean 1/v_i = 1/(λ + µ).
What are p_{i,i+1} and p_{i+1,i}?
p_{i,i+1} = Pr[a < b] = λ/(λ + µ) and p_{i+1,i} = Pr[b < a] = µ/(λ + µ)
3-44
Example: Barbershop VI

Distribution of the sojourn time T:
T_N = S_1 + S_2 + · · · + S_N (customers ahead) + S_{N+1}
An arriving customer finds N customers in the system (including the customer in service).
– By the memoryless property of the exponential distribution, the remaining service time of the customer in service is exponentially distributed:
f_T(t) = Σ_{i=0}^∞ µ ((µt)^i/i!) e^{−µt} π_i = Σ_{i=0}^∞ µ ((µt)^i/i!) e^{−µt} ρ^i (1 − ρ) = µ(1 − ρ) e^{−µ(1−ρ)t},
which can also be obtained via the Laplace transform of the distribution of S_i.
3-45

Barbershop simulation I

Discrete-event simulation flow: start with sim_time = 0 and an empty queue, generate an arrival, and advance sim_time by the interarrival time. On an arrival, increase the queue by one; on a departure, decrease it by one. At each step, schedule the next event: if the next interarrival time is smaller than the service time, the next event is an arrival; otherwise it is a departure. When the queue is empty, only an arrival can be scheduled.
3-46

Barbershop simulation II

clear
% Define variables
global arrival departure mservice_time
arrival = 1; departure = -1; mservice_time = 1;
% Set simulation parameters
sim_length = 30000; max_queue = 1000;
% To get delay statistics
system_queue = zeros(1,max_queue);
k = 0;
for arrival_rate = 0.1:0.025:0.97
    k = k + 1;
    % x(k) denotes utilization
    x(k) = arrival_rate*mservice_time;
    % initialize
    sim_time = 0; num_arrivals = 0; num_system = 0;
    upon_arrival = 0; total_delay = 0; num_served = 0;
    % Assuming that the queue is empty
    event = arrival;
    event_time = exprnd(1/arrival_rate);
    sim_time = sim_time + event_time;
    while (sim_time < sim_length)
        % If an arrival occurs,
        if event == arrival
            num_arrivals = num_arrivals + 1;
            num_system = num_system + 1;
            % Record the arrival time of the customer
            system_queue(num_system) = sim_time;
            upon_arrival = upon_arrival + num_system;
            % See whether a new arrival or a departure occurs next
            [event, event_time] = schedule_next_event(arrival_rate);
3-47

Barbershop simulation III

        % If a departure occurs,
        elseif event == departure
            delay_per_arrival = sim_time - system_queue(1);
            system_queue(1:max_queue-1) = system_queue(2:max_queue);
            total_delay = total_delay + delay_per_arrival;
            num_system = num_system - 1;
            num_served = num_served + 1;
            if num_system == 0
                % nothing to serve, schedule an arrival
                event = arrival;
                event_time = exprnd(1/arrival_rate);
            elseif num_system > 0
                % the system still has customers to serve
                [event, event_time] = schedule_next_event(arrival_rate);
            end
        end
        sim_time = sim_time + event_time;
    end
    ana_queue_length(k) = x(k)/(1-x(k));
    ana_response_time(k) = 1/(1/mservice_time - arrival_rate);
    % Mean queue length seen by arrivals
    sim_queue_length(k) = upon_arrival/num_arrivals;
    sim_response_time(k) = total_delay/num_served;
end
3-48
Barbershop simulation IV

function [event, event_time] = schedule_next_event(arrival_rate)
global arrival departure mservice_time
minter_arrival = 1/arrival_rate;
inter_arrival = exprnd(minter_arrival);
service_time = exprnd(mservice_time);
if inter_arrival < service_time
    event = arrival;
    event_time = inter_arrival;
else
    event = departure;
    event_time = service_time;
end
3-49

Relation between DTMC and CTMC I

Recall the embedded MC: each time a state, say i, is entered, an exponentially distributed state occupancy time is selected. When the time is up, the next state j is selected according to transition probabilities p̃_ij.
• N_i(n): the number of times state i occurs in the first n transitions
• T_i(j): the occupancy time of the jth visit to state i
The proportion of time spent by X(t) in state i after the first n transitions:
(time spent in state i)/(time spent in all states) = Σ_{j=1}^{N_i(n)} T_i(j) / Σ_i Σ_{j=1}^{N_i(n)} T_i(j)
3-50

Relation between DTMC and CTMC II

As n → ∞, using π_i = lim N_i(n)/n, we have
[ (N_i(n)/n) (1/N_i(n)) Σ_{j=1}^{N_i(n)} T_i(j) ] / [ Σ_i (N_i(n)/n) (1/N_i(n)) Σ_{j=1}^{N_i(n)} T_i(j) ] → π_i E[T_i] / Σ_i π_i E[T_i] = φ_i, with E[T_i] = 1/v_i,
where π_i is the unique pmf solution to π_j = Σ_i π_i p̃_ij and Σ_j π_j = 1   (*)
The long-term proportion of time spent in state i approaches
φ_i = (π_i/v_i) / Σ_k (π_k/v_k) = c (π_i/v_i) → π_i = v_i φ_i / c
Substituting π_i = v_i φ_i / c into (*) yields
v_j φ_j / c = (1/c) Σ_i v_i φ_i p̃_ij → v_j φ_j = Σ_i φ_i v_i p̃_ij = Σ_i φ_i γ_ij
3-51

Relation between DTMC and CTMC III

Recall the M/M/1 queue: (a) the CTMC and (b) its embedded MC, a birth-death process in which state 0 moves to state 1 with probability 1, and each state i ≥ 1 moves up with probability p = λ/(λ + µ) and down with probability q = µ/(λ + µ).
In the embedded MC, we have the following global balance equations:
π_0 = q π_1
π_1 = π_0 + q π_2
π_i = p π_{i−1} + q π_{i+1} for i ≥ 2,
which give π_1 = π_0/q and π_{i+1} = (p/q) π_i for i ≥ 1.
3-52
Relation between DTMC and CTMC IV

Using the normalization condition Σ_{i=0}^∞ π_i = 1,
π_i = (p/q)^{i−1} (1/q) π_0 for i ≥ 1, and π_0 = (1 − 2p)/(2(1 − p))
Converting the embedded MC into the CTMC,
φ_0 = (c/v_0) π_0 = (c/λ) π_0 and φ_i = c π_i/v_i = c π_i/(λ + µ) for i ≥ 1
To determine c, use Σ_{i=0}^∞ φ_i = 1:
c [ π_0/λ + (1/(λ + µ)) Σ_{i=1}^∞ π_i ] = 1 → c = 2λ
Finally, we get φ_i = ρ^i (1 − ρ) for i = 0, 1, 2, . . .
3-53

Example of using embedded MC I

A stray dog, in front of the tea shop at the central library of IIT Delhi, spends most of the daytime sleeping around the tea shop. When a person comes to the tea shop, the dog greets him or her and wags her tail for an average time of one minute. At the end of this period, the dog is fed with probability 1/4, patted briefly with probability 5/8, or taken for a walk with probability 1/8. If fed, she spends an average of two minutes eating. The walks take 15 minutes on average. After eating, being patted, or walking, she returns to sleep. Assume that people come to the tea shop on average every hour.
1. Find a Markov chain model with four states, {sleep, greet, eat, walk}: specify the transition probabilities and rates
2. Find the steady-state probabilities of the dog’s state
3-54

Example of using embedded MC II

• State transition probabilities of the embedded chain (order: S, G, E, W):
P =
S: [  0    1    0    0  ]
G: [ 5/8   0   1/4  1/8 ]
E: [  1    0    0    0  ]
W: [  1    0    0    0  ]
• From πP = π and π·1 = 1, we have π_0 = π_1 = 8/19, π_2 = 2/19, π_3 = 1/19
• From φ_i = (π_i/v_i)/(Σ_k π_k/v_k), with mean occupancy times of 60, 1, 2 and 15 minutes, we have, e.g.,
φ_0 = 60π_0/(60π_0 + π_1 + 2π_2 + 15π_3) and φ_2 = 2π_2/(60π_0 + π_1 + 2π_2 + 15π_3)
3-55

Queueing systems I

The arrival times, the size of demand for service, the service capacity, and the size of the waiting room may all be random variables.
Queueing discipline: specifies which customer to pick next for service.
• First come first served (FCFS, or FIFO)
• Last come first served (LCFS, LIFO)
• Random order, processor sharing (PS), round robin (RR)
• Priority (preemptive: resume, non-resume; non-preemptive)
• Shortest job first (SJF) and longest job first (LJF)
3-56
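A minimal MATLAB sketch of the dog example: solve the embedded chain, then weight by the mean occupancy times to get the long-run time fractions (the variable names are ours):

% Embedded-chain pmf and time fractions for the stray-dog example
P = [0 1 0 0; 5/8 0 1/4 1/8; 1 0 0 0; 1 0 0 0];   % sleep, greet, eat, walk
n = size(P,1);
ppi = ones(1,n) / (P + ones(n) - eye(n));   % direct method, page 3-24
ET = [60 1 2 15];                           % mean sojourn times (minutes)
phi = (ppi .* ET) / sum(ppi .* ET)          % long-run proportions of time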
Queueing systems II

Customer behavior: jockeying, reneging, balking, etc.
Kendall’s notation A/B/m/K/N:
• A: arrival time distribution, B: service time distribution, m: # of servers, K: queue size (default ∞), N: population size (default ∞)
For A and B:
• M: Markovian, exponential distribution
• D: deterministic
• GI: general independent
• E_k: Erlang-k
• H_k: mixture of k exponentials
• PH: phase-type distribution
E.g., M/D/2, M/M/c, G/G/1, etc.; the barbershop is an M/M/1 queue.
3-57

Queueing systems III

Performance measures:
• N(t) = N_q(t) + N_S(t): number in system; N_q(t): number in queue; N_S(t): number in service
• W: waiting time in queue
• T: total time (or response time) in the system
• τ: service time
• Throughput γ: mean # of customers served per unit time
1. γ for a non-blocking system = min(λ, mµ)
2. γ for a blocking system = (1 − P_B)λ, where P_B is the blocking probability
• Utilization ρ: fraction of time the server is busy
ρ = load/capacity = lim_{T→∞} λT/(µT) = λ/µ for a single-server queue, and λ/(mµ) for an m-server queue
3-58

Little’s theorem I

For any queueing system in steady state: N = λT
• N: average number of customers in the system
• λ: steady-state arrival rate (need not be Poisson)
• T: average delay per customer
Proof sketch: let α(t) be the number of arrivals in [0, t] and T_i the time customer i spends in the system. For a system with N(0) = 0, whenever N(t) = 0,
N̄_t = (1/t) ∫_0^t N(τ)dτ = (1/t) Σ_{i=1}^{α(t)} T_i = (α(t)/t) · (Σ_{i=1}^{α(t)} T_i / α(t)) = λ_t · T̄_t
If N(t) ≠ 0, with β(t) the number of departures in [0, t],
(β(t)/t) · (Σ_{i=1}^{β(t)} T_i / β(t)) ≤ N̄_t ≤ λ_t T̄_t
3-59

Little’s theorem II

As an alternative, consider the cumulative processes: N(t) = α(t) − β(t), and let γ(t) = ∫_0^t N(τ)dτ be the area between the arrival and departure curves, so that N̄_t = γ(t)/t.
– See the variable ‘num_system’ in the previous Matlab code; α(t) corresponds to ‘num_arrivals’ and t to ‘sim_length’
λ_t = α(t)/t
Response time per customer (from ‘total_delay’):
T̄_t = γ(t)/α(t) = (γ(t)/t) · (t/α(t)) = N̄_t/λ_t
As t → ∞, we have N = λT, and with T = W + x̄ (mean service time x̄),
λT = λ(W + x̄) = N_q + ρ,
valid for any queue (even with any service order) as long as the limits of λ_t and T̄_t exist as t → ∞.
3-60
Little’s theorem III

(Figures: Little’s theorem applied to a finite queue and to a network of queues.)
3-61

Increasing the arrival and transmission rates by the same factor

In a packet transmission system,
• the arrival rate (packets/sec) is increased from λ to Kλ for K > 1,
• the packet length distribution remains the same (exponential, with mean 1/µ bits), and
• the transmission capacity (C bps) is increased by a factor of K.
Performance:
• The average number of packets in the system remains the same:
N = ρ/(1 − ρ) with ρ = λ/(µC)
• Average delay per packet: (Kλ)W = N → W = N/(Kλ)
Aggregation is better: speeding up a transmission line by a factor of K allows K times as many packets/sec with a K-times smaller average delay per packet.
3-62

Statistical multiplexing vs TDMA or FDMA

Multiplexing: m Poisson packet streams, each with rate λ/m (packets/sec), are transmitted over a communication link with exponentially distributed packet transmission times of mean 1/µ.
(a) Statistical multiplexing: T = 1/(µ − λ)
(b) TDMA or FDMA: T = m/(µ − λ),
so statistical multiplexing gives an m-times smaller average delay.
When do we need TDMA or FDMA?
– In a multiplexer, packet generation times overlap, so it must buffer and delay some of the packets.
3-63

Little’s theorem: example I

Estimating throughput in a time-sharing system (Figure 3.4: N terminals connected to a time-sharing computer; each user has an average reflection time R and each job an average processing time P).
Suppose a time-sharing computer system with N terminals. A user logs into the system through a terminal and, after an initial reflection period of average length R, submits a job that requires an average processing time P at the computer. Jobs queue up inside the computer and are served by a single CPU according to some unspecified priority or time-sharing rule. What is the maximum sustainable throughput of the system?
– To estimate the maximum attainable throughput, assume that a departing user immediately reenters the system or, equivalently, is immediately replaced by a new user, so the number of users in the system is always N.
3-64
Little’s theorem: example II

The average time a user spends in the system is
T = R + D → R + P ≤ T ≤ R + NP
– D: the average delay between the time a job is submitted to the computer and the time its execution is completed, with D ∈ [P, NP]
Combining this with λ = N/T,
N/(R + NP) ≤ λ ≤ min{1/P, N/(R + P)}
– The throughput is also bounded by 1/P, the maximum job execution rate: since the execution time of a job is P units on average, the computer cannot process more than 1/P jobs per unit time in the long run.
3-65

Little’s theorem: example III

Using T = N/λ, we can rewrite the bounds for the average user delay when the system is fully loaded:
max{NP, R + P} ≤ T ≤ R + NP
(Figure 3.5: bounds on attainable throughput and on average user delay in a time-sharing system; the delay increases essentially in proportion to the number of terminals N. The bounds are independent of the detailed service discipline, thanks to the generality of Little’s theorem.)
3-66

Poisson Arrivals See Time Averages (PASTA) theorem I

Suppose a random process spends its time in different states E_j. In equilibrium, we can associate with each state E_j two different probabilities:
• The probability of the state as seen by an outside random observer
– π_j: the probability that the system is in state E_j at a random instant
• The probability of the state as seen by an arriving customer
– π*_j: the probability that the system is in state E_j just before a (randomly chosen) arrival
In general, π_j ≠ π*_j. When the arrival process is Poisson, π_j = π*_j.
3-67

PASTA theorem II

For a stochastic process N ≡ {N(t), t ≥ 0} and an arbitrary set of states B:
U(t) = 1 if N(t) ∈ B, and 0 otherwise  ⇒  V(t) = (1/t) ∫_0^t U(τ)dτ
For a Poisson arrival process A(t),
Y(t) = ∫_0^t U(τ)dA(τ)  ⇒  Z(t) = Y(t)/A(t)
Lack of Anticipation Assumption (LAA): for each t ≥ 0, {A(t + u) − A(t), u ≥ 0} and {U(s), 0 ≤ s ≤ t} are independent: future interarrival times and the service times of previously arrived customers are independent.
Under LAA, as t → ∞, PASTA ensures that V(t) → V(∞) w.p. 1 if and only if Z(t) → V(∞) w.p. 1.
3-68
PASTA theorem III

Proof sketch:
• For sufficiently large n, Y(t) is approximated by
Y_n(t) = Σ_{k=0}^{n−1} U(kt/n) [A((k+1)t/n) − A(kt/n)], where E[A((k+1)t/n) − A(kt/n)] = λt/n
• LAA decouples the expectation:
E[Y_n(t)] = λt E[ (1/n) Σ_{k=0}^{n−1} U(kt/n) ]
• As n → ∞, if |Y_n(t)| is bounded,
lim_{n→∞} E[Y_n(t)] = E[Y(t)] = λt E[V(t)] = λ E[ ∫_0^t U(τ)dτ ],
i.e., the expected number of arrivals who find the system in state B equals the arrival rate times the expected length of time the system spends there.
3-69

Systems where PASTA does not hold

Ex1) D/D/1 queue
• Deterministic arrivals every 10 msec
• Deterministic service times of 9 msec
• Arrivals always find the system empty, yet the system is occupied on average 90% of the time.
Ex2) LAA violated: the service time of a current customer depends on the interarrival time of a future customer
• Your own PC (one customer, one server)
• Your own PC is always free when you need it: π*_0 = 1
• π_0 = proportion of time the PC is free (< 1)
3-70

M/M/1/K I

M/M/1/K: the system can accommodate at most K customers.
• State balance equations:
λπ_0 = µπ_1
(λ + µ)π_i = λπ_{i−1} + µπ_{i+1} for 1 ≤ i ≤ K − 1, and λπ_{K−1} = µπ_K
After rearranging, we have λπ_{i−1} = µπ_i for 1 ≤ i ≤ K
• For i ∈ {0, 1, . . . , K}, the steady-state probabilities are
π_n = ρⁿ π_0 and Σ_{n=0}^K π_n = 1 ⇒ π_0 = (1 − ρ)/(1 − ρ^{K+1})
3-71

M/M/1/K II

• π_K: the probability that an arriving customer finds the system full. By PASTA, this is the blocking probability:
π_K = ρ^K (1 − ρ)/(1 − ρ^{K+1})
• Blocking probability in simulation:
P_B = (total # of arrivals blocked upon arrival instants) / (total # of arrivals at the system)
(Figure: P_B vs. ρ for K = 5 and K = 10; analysis and simulation agree.)
3-72
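The closed form for π_K makes the analysis curves in the figure easy to reproduce; a minimal MATLAB sketch (the plotted range of ρ is assumed):

% M/M/1/K blocking probability pi_K vs rho (page 3-72)
rho = 0.1:0.05:0.9;
for K = [5 10]
    PB = (1 - rho) .* rho.^K ./ (1 - rho.^(K+1));
    plot(rho, PB); hold on
end
xlabel('\rho'); ylabel('P_B'); legend('K = 5', 'K = 10')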
M/M/1/K Simulation I

clear
% Define variables
global arrival departure mservice_time
arrival = 1; departure = -1; mservice_time = 1;
% Define simulation parameters
sim_length = 30000; K = 10; max_iter = 5;
system_queue = zeros(1,K);   % to get delay statistics
k = 0;
for arrival_rate = 0.1:0.025:0.97
    k = k + 1;
    x(k) = arrival_rate*mservice_time;
    for iter = 1:max_iter
        % initialize each replication
        sim_time = 0; num_arrivals = 0; num_system = 0;
        upon_arrival = 0; total_delay = 0; num_served = 0; dropped = 0;
        % Assuming that the queue is empty
        event = arrival;
        event_time = exprnd(1/arrival_rate);
        sim_time = sim_time + event_time;
        while (sim_time < sim_length)
            % If an arrival occurs,
            if event == arrival
                num_arrivals = num_arrivals + 1;
                if num_system == K
                    dropped = dropped + 1;   % blocked: system is full
                else
                    num_system = num_system + 1;
                    system_queue(num_system) = sim_time;
                    upon_arrival = upon_arrival + num_system;
                end
                % See whether a new arrival or a departure occurs next
                [event, event_time] = schedule_next_event(arrival_rate);
3-73

M/M/1/K Simulation II

            % If a departure occurs,
            elseif event == departure
                delay_per_arrival = sim_time - system_queue(1);
                system_queue(1:K-1) = system_queue(2:K);
                total_delay = total_delay + delay_per_arrival;
                num_system = num_system - 1;
                num_served = num_served + 1;
                if num_system == 0   % nothing to serve, schedule an arrival
                    event = arrival;
                    event_time = exprnd(1/arrival_rate);
                else                 % the system still has customers to serve
                    [event, event_time] = schedule_next_event(arrival_rate);
                end
            end
            sim_time = sim_time + event_time;
        end
        Pd_iter(iter) = dropped/num_arrivals;
    end
    piK(k) = x(k)^K*(1-x(k))/(1-x(k)^(K+1));
    Pd(k) = mean(Pd_iter);
end
% uses the previous schedule_next_event function
3-74

M/M/m queue I

M/M/m: there are m parallel servers, whose service times are exponentially distributed with mean 1/µ.
When all m servers are busy, the time until the next departure, X, is
X = min(τ_1, τ_2, . . . , τ_m) ⇒ Pr[X > t] = Pr[min(τ_1, . . . , τ_m) > t] = Π_{i=1}^m Pr[τ_i > t] = e^{−mµt} (i.i.d.)
Global balance equations:
λπ_0 = µπ_1
(λ + min(n, m)µ)π_n = λπ_{n−1} + min(n + 1, m)µπ_{n+1} for n ≥ 1
3-75

M/M/m queue II

The previous global balance equations can be rewritten as
λπ_{n−1} = min(n, m)µπ_n for n ≥ 1
Using a = λ/µ and ρ = λ/(mµ),
π_n = (aⁿ/n!)π_0 for n ≤ m, and π_n = ρ^{n−m}(a^m/m!)π_0 for n ≥ m
From the normalization condition, π_0 is obtained:
1 = Σ_{i=0}^∞ π_i = π_0 [ Σ_{i=0}^{m−1} aⁱ/i! + (a^m/m!) Σ_{i=m}^∞ ρ^{i−m} ]
Erlang C formula, C(m, a):
C(m, a) = Pr[W > 0] = Pr[N ≥ m] = Σ_{i=m}^∞ π_i = ((mρ)^m/m!) · π_0/(1 − ρ)
3-76
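Erlang C is a closed-form expression in m and a; a minimal MATLAB function sketch (the function name erlang_c is ours, not from the lecture):

function C = erlang_c(m, a)
% Probability that an arriving customer must wait in M/M/m (page 3-76),
% with offered load a = lambda/mu and rho = a/m < 1.
rho = a/m;
i = 0:m-1;
p0 = 1 / (sum(a.^i ./ factorial(i)) + a^m/(factorial(m)*(1-rho)));
C = (a^m/factorial(m)) * p0/(1-rho);
end
% e.g., erlang_c(5, 4) gives the waiting probability for m = 5, a = 4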
M/M/c/c I

c servers, and only c customers can be accommodated (no waiting room).
Balance equations are (a = λ/µ, the offered load in Erlangs):
λπ_{n−1} = nµπ_n ⇒ π_n = (a/n)π_{n−1} = (aⁿ/n!)π_0
Using Σ_{n=0}^c π_n = 1, we have
π_n = (aⁿ/n!) (Σ_{i=0}^c aⁱ/i!)^{−1}
Erlang B formula: B(c, a) = π_c
– Also valid for the M/G/c/c system: note that it depends only on the mean of the service time distribution.
3-77

M/M/c/c II

Erlang capacity: telephone systems with c channels.
(Figures: B(c, a) vs. offered traffic intensity a for c = 1, . . . , 10 and c = 10, . . . , 100; blocking probability analysis vs. simulation for c = 3 and c = 5.)
3-78

M/M/c/c Simulation I

clear
global arrival departure mservice_time
arrival = 1; departure = -1; mservice_time = 1;
sim_length = 50000; n_iter = 5;
K = 5;   % number of servers
k = 0;
for arrival_rate = 0.05:0.025:0.95
    k = k + 1;
    for iter = 1:n_iter
        sim_time = 0; num_busy_servers = 0; block = 0; num_arrival = 0;
        event = arrival;
        event_time = exprnd(1/arrival_rate);
        while (sim_time < sim_length)
            if event == arrival
                num_arrival = num_arrival + 1;
                %% All servers are working,
                if num_busy_servers == K
                    block = block + 1;
                end
                %% increase the number of busy servers by 1 (up to K)
                num_busy_servers = min(K, num_busy_servers + 1);
                [event, event_time] = schedule_next_event_multi(arrival_rate, num_busy_servers);
3-79

M/M/c/c Simulation II

            elseif event == departure
                num_busy_servers = num_busy_servers - 1;
                if num_busy_servers == 0
                    event = arrival;
                    event_time = exprnd(1/arrival_rate);
                else
                    [event, event_time] = schedule_next_event_multi(arrival_rate, num_busy_servers);
                end
            end
            sim_time = sim_time + event_time;
        end
        simb(iter) = block/num_arrival;
    end
    rho = arrival_rate/mservice_time;
    x(k) = rho;
    anab(k) = (rho^K/factorial(K)) / sum((rho.^(0:K))./factorial(0:K));
    Pb(k) = mean(simb);
end

%%%%%%%%%%%% A new function starts here %%%%%%%%%%%%
function [event, event_time] = schedule_next_event_multi(arrival_rate, num_busy_servers)
global arrival departure mservice_time
inter_arrival = exprnd(1/arrival_rate);
multi_service = exprnd(mservice_time, [1 num_busy_servers]);
service_time = min(multi_service);   % earliest completion among busy servers
if inter_arrival < service_time
    event = arrival; event_time = inter_arrival;
else
    event = departure; event_time = service_time;
end
3-80
Example: a system with blocking I

At the Select-city shopping mall, customers arrive at the underground parking lot according to a Poisson process with a rate of 60 cars per hour. Parking time follows a Weibull distribution with mean 2.5 hours, and the parking lot can accommodate 150 cars. When the parking lot is full, an arriving customer has to park elsewhere. Find the fraction of customers finding all places occupied upon arrival.
Two different distributions with the same mean:
Weibull (α = 2.7228, k = 5): f(x) = (k/α)(x/α)^{k−1} e^{−(x/α)^k}
Exponential: f(x) = (1/α) e^{−x/α}
– Mean of the Weibull distribution: αΓ(1 + 1/k), where Γ(x) = ∫_0^∞ t^{x−1} e^{−t} dt is the gamma function
3-81

Example: a system with blocking II

• c = 150 and a = λ/µ = 60 × 2.5 = 150:
B(c, a)|_{c=150, a=150} = (a^c/c!) / Σ_{i=0}^c aⁱ/i!
• Dividing the numerator and denominator by Σ_{n=0}^{c−1} aⁿ/n!,
B(c, a) = (a^c/c!) / (Σ_{i=0}^{c−1} aⁱ/i! + a^c/c!)
= [(a^c/c!)/Σ_{n=0}^{c−1} aⁿ/n!] / [1 + (a^c/c!)/Σ_{n=0}^{c−1} aⁿ/n!]
= (a/c)B(c − 1, a) / (1 + (a/c)B(c − 1, a))
= aB(c − 1, a) / (c + aB(c − 1, a)), with B(0, a) = 1
3-82

Finite source population: M/M/C/C/K system I

Consider the loss system (no waiting places) in the case where the arrivals originate from a finite population of sources: the total number of customers is K.
• The time to the next call attempt by a customer, the so-called thinking (idle) time of the customer, obeys an exponential distribution with mean 1/λ (sec)
• Blocked calls are lost and do not lead to reattempts; the customer starts a new thinking time, again with the same exponential distribution of mean 1/λ
• The call holding time is exponentially distributed with mean 1/µ
3-83

M/M/C/C/K system II

If C ≥ K, each customer has its own server, i.e., there is no blocking.
• Each user alternates between two states: active with mean 1/µ and idle with mean 1/λ
• The probabilities that a user is idle or active are
π_0 = (1/λ)/(1/λ + 1/µ) and π_1 = (1/µ)/(1/λ + 1/µ)
• Call arrival rate: π_0 λ; offered load per user: π_1 = a/(1 + a), with a = λ/µ
If C < K, the system is described by the balance equations
((K − i)λ + iµ)π_i = (K − i + 1)λπ_{i−1} + (i + 1)µπ_{i+1}
3-84
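The recursion above avoids evaluating c! directly, which would overflow for c = 150; a minimal MATLAB sketch for the parking-lot parameters:

% Erlang B via the recursion B(c,a) = aB(c-1,a)/(c + aB(c-1,a)) (page 3-82)
a = 150; c = 150;
B = 1;                       % B(0,a) = 1
for n = 1:c
    B = a*B / (n + a*B);     % B(n,a) from B(n-1,a)
end
B                            % fraction of arrivals finding the lot full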
M/M/C/C/K system III

• For j = 1, 2, . . . , C, we have
(K − j + 1)λπ_{j−1} = jµπ_j ⇒ π_j = (K choose j) a^j π_0
• Applying Σ_{j=0}^C π_j = 1,
π_j = (K choose j) a^j / Σ_{k=0}^C (K choose k) a^k
Time blocking (or congestion): the proportion of time the system spends in state C; the equilibrium probability of state C is P_B = π_C
– The probability of all resources being busy in a given observation period
– Insensitivity: like the Erlang B formula, this result is insensitive to the form of the holding time distribution (though the derivation above was explicitly based on the assumption of an exponential holding time distribution)
3-85

M/M/C/C/K system IV

Call blocking: the probability that an arriving call is blocked, P_L
• The arrival rate is state-dependent, i.e., (K − N(t))λ: not Poisson
• PASTA does not hold: the time blocking P_B cannot represent P_L
• λ_T: total call arrival rate on average, λ_T ∝ Σ_{i=0}^C (K − i)λπ_i
– P_L: the probability that a call finds the system blocked
– If λ_T = 10000 and P_L = 0.01, then λ_T P_L = 100 calls are lost
• λ_C: call arrival rate when the system is blocked, λ_C ∝ (K − C)λ
– P_B λ_C: blocked calls upon the arrival instant, so P_L λ_T = P_B λ_C
– Among all arrivals, those that find the system blocked must equal the call arrivals that occur while the system is busy
3-86

M/M/C/C/K system V

• Call blocking P_L can be obtained by
P_L λ_T = P_B λ_C → P_L = (λ_C/λ_T) P_B ≤ P_B
• Engset formula:
P_L(K) = (K − C)λπ_C / Σ_{i=0}^C (K − i)λπ_i
= (K − C) [K!/(C!(K−C)!)] a^C / Σ_{i=0}^C (K − i) [K!/(i!(K−i)!)] aⁱ
= [(K−1)!/(C!(K−1−C)!)] a^C / Σ_{i=0}^C [(K−1)!/(i!(K−1−i)!)] aⁱ
= (K−1 choose C) a^C / Σ_{i=0}^C (K−1 choose i) aⁱ
– The state distribution seen by an arriving customer is the same as the equilibrium distribution in a system with one less customer: it is as if the arriving customer were an “outside observer”
– P_L(K) = P_B(K − 1); as K → ∞, P_L → P_B
3-87

Probability generating function

For a discrete random variable X with g_k = Pr[X = k], the PGF is defined as
G(z) = E[z^X] = Σ_k z^k g_k,
where z is a complex variable, and g_k = (1/k!) d^k G(z)/dz^k |_{z=0}.
For |z| ≤ 1, G(z) is convergent:
|G(z)| ≤ Σ_k |z^k||g_k| ≤ Σ_k g_k = 1
G(z) is analytic for |z| < 1, i.e.,
• differentiable infinitely often in that domain, or
• expressible as a power series Σ_k z^k g_k
3-88
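A minimal MATLAB sketch of the Engset formula; the parameter values K = 20 sources, C = 5 channels and a = 0.1 are assumed for illustration:

% Engset call blocking P_L(K) = P_B with K-1 sources (page 3-87)
K = 20; C = 5; a = 0.1;
i = 0:C;
w = arrayfun(@(n) nchoosek(K-1, n), i) .* a.^i;   % (K-1 choose i) a^i
PL = w(end) / sum(w)                              % Engset blocking probability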
Bulk queues: Bulk arrival I

Arrivals occur in bulks of i customers with g_i = Pr[bulk size = i].
Global balance equations:
λπ_0 = µπ_1
(λ + µ)π_k = µπ_{k+1} + Σ_{i=0}^{k−1} π_i λ g_{k−i} for k ≥ 1
Using the definition of the PGF, Π(z) = Σ_{k=0}^∞ z^k π_k,
(λ + µ) Σ_{k=1}^∞ π_k z^k = (µ/z) Σ_{k=1}^∞ π_{k+1} z^{k+1} + λ Σ_{k=1}^∞ Σ_{i=0}^{k−1} π_i g_{k−i} z^k
3-89

Bulk queues: Bulk arrival II

The term Σ_{k=1}^∞ Σ_{i=0}^{k−1} π_i g_{k−i} z^k can be written out as
k = 1: π_0 g_1 z
k = 2: (π_0 g_2 + π_1 g_1) z²
k = 3: (π_0 g_3 + π_1 g_2 + π_2 g_1) z³
. . .
k = i: (π_0 g_i + π_1 g_{i−1} + · · · + π_{i−1} g_1) z^i
which yields π_0 G(z) + π_1 z G(z) + π_2 z² G(z) + · · · = Π(z)G(z), with G(z) = Σ_{k=1}^∞ g_k z^k.
Substituting this into the previous equation (and using π_1 = (λ/µ)π_0),
(λ + µ)(Π(z) − π_0) = (µ/z)(Π(z) − π_0 − zπ_1) + λΠ(z)G(z)
3-90

Bulk queues: Bulk arrival III

After some manipulation, we have
Π(z) = µπ_0(1 − z) / (µ(1 − z) − λz[1 − G(z)]) = N(z)/D(z)
To determine π_0, we use Π(1) = 1:
Π(1) = N(1)/D(1) = 0/0 → L’Hôpital’s rule → N′(1)/D′(1) = 1,
which yields π_0 = 1 − λG′(1)/µ.
The mean number of customers in the system is
N̄ = Π′(1) = lim_{z→1} (N′(z)D(z) − N(z)D′(z))/(D(z))²,
where L’Hôpital’s rule must be applied again.
3-91

Bulk queues: Bulk service I

Serve a group of size r (some variations are possible):
λπ_0 = µ(π_1 + π_2 + · · · + π_r)
(λ + µ)π_k = µπ_{k+r} + λπ_{k−1} for k ≥ 1
Using the definition of the PGF,
(λ + µ)[Π(z) − π_0] = (µ/z^r)[Π(z) − Σ_{k=0}^r π_k z^k] + λzΠ(z)
Solving for Π(z), we have
Π(z) = (µ Σ_{k=0}^r π_k z^k − (λ + µ)π_0 z^r) / (λz^{r+1} − (λ + µ)z^r + µ)
3-92
Bulk queues: Bulk service II

The boundary relation λπ_0 = µ(π_1 + π_2 + · · · + π_r) can be rewritten as
−z^r(λπ_0 + µπ_0) = −z^r µ(π_0 + π_1 + π_2 + · · · + π_r),
so Π(z) becomes (with ρ = λ/(µr))
Π(z) = Σ_{k=0}^{r−1} π_k(z^k − z^r) / (rρz^{r+1} − (1 + rρ)z^r + 1)
– Can we determine π_k for k ∈ {0, 1, . . . , r − 1} using Π(1) = 1 alone? We need to know more about the roots of the denominator.
Rouché’s theorem: if f(z) and g(z) are analytic functions of z inside and on a closed contour C, and |g(z)| < |f(z)| on C, then f(z) and f(z) + g(z) have the same number of zeros inside C.
• Let f(z) = −(1 + rρ)z^r and g(z) = rρz^{r+1} + 1
• On the closed contour C centered at the origin with radius 1 + δ (i.e., |z| = 1 + δ),
|f(z)| = |−(1 + rρ)z^r| = (1 + rρ)(1 + δ)^r
|g(z)| = |rρz^{r+1} + 1| ≤ rρ(1 + δ)^{r+1} + 1
3-93

Bulk queues: Bulk service III

• On the contour C,
|f(z)| − |g(z)| ≥ (1 + rρ)(1 + δ)^r − rρ(1 + δ)^{r+1} − 1
= (1 + δ)^r(1 − rρδ) − 1
≥ (1 + rδ)(1 − rρδ) − 1   (using (1 + δ)^r ≥ 1 + rδ)
= rδ(1 − ρ − rρδ) > 0 for 0 < δ < (1 − ρ)/(rρ)
• Letting δ → 0, the denominator has r roots with |z| ≤ 1
• The denominator, of degree r + 1, has one additional root z_0 with |z| > 1
Since Π(z) is analytic for |z| ≤ 1, the r roots of the denominator must be canceled by roots of the numerator (otherwise Π(z) would not be analytic).
– We can write the numerator as
Σ_{k=0}^{r−1} π_k(z^k − z^r) = K(z − 1) Π_{k=1}^{r−1}(z − z*_k),
where K is a proportionality constant and the z*_k are the roots inside the unit disk.
3-94

Bulk queues: Bulk service IV

Canceling the roots inside and on the unit disk between the numerator and denominator, we have
Π(z) = K(1 − z) Π_{k=1}^{r−1}(z − z*_k) / (rρz^{r+1} − (1 + rρ)z^r + 1) = K̃/(1 − z/z_0)
– Using Π(1) = 1, we have K̃ = 1 − 1/z_0
– π_k = (1 − 1/z_0)(1/z_0)^k for k = 0, 1, 2, . . .
Our bulk-service system reduces to the M/M/1 queue if r = 1:
• Comparison with the M/M/1 queue:
π_i = (1 − ρ)ρ^i for i = 0, 1, 2, . . . → Π(z) = (1 − ρ)/(1 − ρz), where we find z_0 = 1/ρ > 1
3-95

Bulk queues: Bulk service V

Real roots outside the unit disk:
(Figure: D(z) = rρz^{r+1} − (1 + rρ)z^r + 1 plotted for ρ̂ = λ/µ = 0.75 with r = 1, 2, 3 and for ρ̂ = 0.95 with r = 3, where ρ = ρ̂/r; for ρ̂ = 0.75, r = 1, the root is at 1/0.75 = 1.333.)
• For r = 1, we have the M/M/1 queue: z_0 = 1/ρ̂
• Mean number of customers in the system: L = Π′(1) = 1/(z_0 − 1)
– As r increases for a fixed ρ̂, z_0 increases → L decreases
– As ρ̂ increases for a fixed r, z_0 gets closer to 1 → L increases
3-96
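The root z_0 can be found numerically; a minimal MATLAB sketch using the built-in polynomial root finder (the tolerance for the unit-disk test is an assumed value):

% Roots of D(z) = r*rho*z^(r+1) - (1+r*rho)*z^r + 1 (page 3-93); r roots
% lie in |z| <= 1 and one root z0 lies outside, giving pi_k and L
rhat = 0.75; r = 3; rho = rhat/r;
coef = [r*rho, -(1 + r*rho), zeros(1, r-1), 1];   % degree r+1 polynomial
z = roots(coef);
z0 = z(abs(z) > 1 + 1e-9)     % the single real root outside the unit disk
L = 1/(z0 - 1)                % mean number of customers in the system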
Where are we?

Elementary queueing models
– M/M/1, M/M/C, M/M/C/C/K, . . . and bulk queues
– either product-form solutions or use of PGFs
Intermediate queueing models (product-form solutions)
– Time reversibility of Markov processes
– Detailed balance equations of time-reversible MCs
– Multidimensional birth-death processes
– Networks of queues: open and closed networks
Advanced queueing models
– M/G/1-type queues: embedded MC and mean-value analysis
– M/G/1 with vacations and priority queues
– G/M/m queue
More advanced queueing models (omitted)
– Algorithmic approaches to obtain steady-state solutions
3-97

Time Reversibility of discrete-time MC I

For an irreducible, aperiodic, discrete-time MC (X_n, X_{n+1}, . . .) having transition probabilities p_ij and stationary distribution π_i for all i, the time-reversed MC is defined as X*_n = X_{τ−n} for an arbitrary τ > 0.
1) Transition probabilities of X*_n:
p*_ij = π_j p_ji / π_i
2) X_n and X*_n have the same stationary distribution π_i, by 1) and Σ_{j=0}^∞ p_ij = Σ_{j=0}^∞ p*_ij = 1
3-98

Time Reversibility of discrete-time MC II

• Proof of 1), p*_ij = π_j p_ji/π_i:
p*_ij = Pr[X_m = j | X_{m+1} = i, X_{m+2} = i_2, . . . , X_{m+k} = i_k]
= Pr[X_m = j, X_{m+1} = i, X_{m+2} = i_2, . . . , X_{m+k} = i_k] / Pr[X_{m+1} = i, X_{m+2} = i_2, . . . , X_{m+k} = i_k]
= Pr[X_m = j, X_{m+1} = i] Pr[X_{m+2} = i_2, . . . , X_{m+k} = i_k | X_m = j, X_{m+1} = i] / (Pr[X_{m+1} = i] Pr[X_{m+2} = i_2, . . . , X_{m+k} = i_k | X_{m+1} = i])
= Pr[X_m = j, X_{m+1} = i] / Pr[X_{m+1} = i]
= Pr[X_{m+1} = i | X_m = j] Pr[X_m = j] / Pr[X_{m+1} = i] = p_ji π_j / π_i
• Proof of 2): using the above result,
Σ_{i∈S} π_i p*_ij = Σ_{i∈S} π_i (π_j p_ji/π_i) = π_j Σ_{i∈S} p_ji = π_j
3-99

Time Reversibility of discrete-time MC III

A Markov process X_n is said to be reversible if the transition probabilities of the forward and reversed chains are the same:
p*_ij = Pr[X_m = j | X_{m+1} = i] = p_ij = Pr[X_{m+1} = j | X_m = i]
• Time reversibility ⇔ the detailed balance equations (DBEs) hold:
π_i p*_ij = π_j p_ji → π_i p_ij = π_j p_ji (detailed balance equations)
What types of Markov processes satisfy the detailed balance equations? Discrete-time birth-death (BD) processes:
• Transitions occur only between neighboring states: p_ij = 0 for |i − j| > 1
3-100
Time Reversibility of discrete-time MC IV
A transmitter's queue with stop-and-wait ARQ ($\theta = qr$) from Mid-term I
• Is this process reversible?
[Figure: state transition diagram over states 0, 1, 2, ...]
• Global balance equations (GBEs):
$\pi_0 = (1-p)\pi_0 + (1-p)\theta\pi_1$
$\pi_1 = p\pi_0 + (p\theta + (1-p)(1-\theta))\pi_1 + (1-p)\theta\pi_2$
For $i = 2, 3, \ldots$, we have
$\pi_i = p(1-\theta)\pi_{i-1} + (p\theta + (1-p)(1-\theta))\pi_i + (1-p)\theta\pi_{i+1}$
• Instead, we can use DBEs, or simplify the GBEs using DBEs, e.g.,
$p(1-\theta)\pi_i = (1-p)\theta\pi_{i+1} \leftrightarrow \sum_{j=0}^{n}\sum_{i=n+1}^{\infty}\pi_j p_{ji} = \sum_{j=0}^{n}\sum_{i=n+1}^{\infty}\pi_i p_{ij}$
3-101
Time Reversibility of discrete-time MC V
Kolmogorov criterion
• A discrete-time Markov chain is reversible if and only if
$p_{i_1 i_2}p_{i_2 i_3}\cdots p_{i_{n-1}i_n}p_{i_n i_1} = p_{i_1 i_n}p_{i_n i_{n-1}}\cdots p_{i_3 i_2}p_{i_2 i_1}$
for any finite sequence of states $i_1, i_2, \ldots, i_n$ and any $n$
Proof:
• For a reversible chain, the detailed balance equations hold along any cycle of states, e.g., $0 \to 1 \to 2 \to 3 \to 0$ [Figure: four-state cycle]
• Fixing two states, $i_1 = i$ and $i_n = j$, and multiplying the DBEs over all states on the cycle (the stationary probabilities cancel):
$p_{i i_2}p_{i_2 i_3}\cdots p_{i_{n-1}j}p_{ji} = p_{ij}p_{j i_{n-1}}\cdots p_{i_3 i_2}p_{i_2 i}$
3-102
Time Reversibility of discrete-time MC VI
• From the Kolmogorov criterion, summing over all intermediate states $i_2, \ldots, i_{n-1}$ gives
$p_{ij}^{(n-1)}p_{ji} = p_{ij}p_{ji}^{(n-1)}$
As $n \to \infty$, we have
$\lim_{n\to\infty}p_{ij}^{(n-1)}p_{ji} = \lim_{n\to\infty}p_{ij}p_{ji}^{(n-1)} \rightarrow \pi_j p_{ji} = \pi_i p_{ij}$
Inspect whether the following two-state MC is reversible:
$P = \begin{pmatrix} 0 & 1 \\ 0.5 & 0.5 \end{pmatrix}$
– It is a small BD process
– Using the state probabilities $\pi_0 = 1/3$ and $\pi_1 = 2/3$:
$\pi_0 p_{01} = \frac{1}{3}\cdot 1 = \pi_1 p_{10} = \frac{2}{3}\cdot\frac{1}{2}$
3-103
Time Reversibility of discrete-time MC VII
Inspect whether the following three-state MC is reversible:
$P = \begin{pmatrix} 0 & 0.6 & 0.4 \\ 0.1 & 0.8 & 0.1 \\ 0.5 & 0 & 0.5 \end{pmatrix}$
• By the Kolmogorov criterion,
$p_{12}p_{23}p_{31} = 0.6 \times 0.1 \times 0.5 = 0.03 \neq p_{13}p_{32}p_{21} = 0.4 \times 0 \times 0.1 = 0$,
so the chain is not reversible
• Inspecting the state transition diagram, it is not a BD process
If the state transition diagram of a Markov process is a tree, then the process is time reversible
– A generalization of BD processes: at each cut boundary, the DBE is satisfied
3-104
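A sketch of the Kolmogorov check for the three-state chain above (NumPy assumed); it multiplies the transition probabilities around the cycle $1 \to 2 \to 3 \to 1$ in both directions (0-based indices in code):

```python
import numpy as np

P = np.array([[0.0, 0.6, 0.4],
              [0.1, 0.8, 0.1],
              [0.5, 0.0, 0.5]])

cycle = [0, 1, 2]
fwd = np.prod([P[cycle[k], cycle[(k + 1) % 3]] for k in range(3)])
bwd = np.prod([P[cycle[(k + 1) % 3], cycle[k]] for k in range(3)])
print(fwd, bwd, "-> reversible on this cycle:", np.isclose(fwd, bwd))
# 0.03 vs 0.0: the criterion fails, so the chain is not reversible.
```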
Continuous-time reversible MC I
For a continuous-time MC $X(t)$, there is a discrete-time embedded Markov chain with stationary pmf $\pi_i$ and state transition probabilities $\tilde p_{ij}$.
[Figure: forward process, reverse process, and embedded Markov process along the time axis]
There is a reversed embedded MC with $\pi_i \tilde p_{ij} = \pi_j \tilde p_{ji}^*$ for all $i \neq j$.
[Figure: CTMC over states 0, 1, 2, 3, 4 and its embedded MC (a BD process)]
3-105
Continuous-time reversible MC II
Recall the state occupancy time of the forward process:
$\Pr[T_i > t + s \mid T_i > t] = \Pr[T_i > s] = e^{-v_i s}$
If $X(t) = i$, the probability that the reversed process remains in state $i$ for an additional $s$ seconds is
$\Pr[X(t') = i,\ t - s \le t' \le t \mid X(t) = i] = e^{-v_i s}$
[Figure: forward process, reverse process, and embedded Markov process along the time axis]
3-106
Continuous-time reversible MC III
A continuous-time MC with stationary probability $\theta_i$ of state $i$ and state transition rate $\gamma_{ji}$ from $j$ to $i$ has a reversed MC with state transition rates $\gamma_{ij}^*$, if we can find $\gamma_{ij}^*$ satisfying
$\gamma_{ij}^* = v_i\tilde p_{ij}^* = v_i\frac{\pi_j\tilde p_{ji}}{\pi_i} = v_i\frac{\pi_j\gamma_{ji}}{\pi_i v_j}$ (using $\tilde p_{ji} = \gamma_{ji}/v_j$ from the embedded MC) $= \theta_j\gamma_{ji}/\theta_i$
– $\tilde p_{ij}^*$: state transition probability of the reversed embedded MC (equal to $\tilde p_{ij}$ when the embedded MC is reversible)
– A continuous-time MC with exponentially distributed state occupancy times is reversible if its embedded MC is reversible
Additionally, we have $v_j = v_j^*$:
$\sum_{i\neq j}\theta_i\gamma_{ij}^* \overset{\gamma_{ij}^* = \theta_j\gamma_{ji}/\theta_i}{=} \theta_j\sum_{i\neq j}\gamma_{ji} = \theta_j v_j = \theta_j v_j^* \Rightarrow \sum_{j\neq i}\gamma_{ij} = \sum_{j\neq i}\gamma_{ij}^*$
3-107
Continuous-time reversible MC IV
The detailed balance equation holds for continuous-time reversible MCs:
$\theta_j\gamma_{ji}$ (input rate to $i$) $= \theta_i\gamma_{ij}$ (output rate from $i$) for $j = i+1$
– Birth-death systems, where $\gamma_{ij} = 0$ for $|i-j| > 1$
– Since the embedded MC is reversible, $\pi_i\tilde p_{ij} = \pi_j\tilde p_{ji} \rightarrow (v_i\theta_i/c)\tilde p_{ij} = (v_j\theta_j/c)\tilde p_{ji} \rightarrow \theta_i\gamma_{ij} = \theta_j\gamma_{ji}$
If there exists a set of positive numbers $\theta_i$ that sum to 1 and satisfy
$\theta_i\gamma_{ij} = \theta_j\gamma_{ji}$ for $i \neq j$,
then the MC is reversible and $\theta_i$ is the unique stationary distribution
– Birth and death processes, e.g., M/M/1, M/M/c, M/M/∞
Kolmogorov criterion for continuous-time MCs
– A continuous-time Markov chain is reversible if and only if
$\gamma_{i_1 i_2}\gamma_{i_2 i_3}\cdots\gamma_{i_n i_1} = \gamma_{i_1 i_n}\gamma_{i_n i_{n-1}}\cdots\gamma_{i_3 i_2}\gamma_{i_2 i_1}$
– The proof is the same as in the discrete-time reversible MC
3-108
M/M/2 queue with heterogeneous servers I
Servers A and B with service rates $\mu_A$ and $\mu_B$. When the system is empty, arrivals go to A with probability $p$ and to B with probability $1-p$. Otherwise, the customer at the head of the queue takes the first free server.
[Figure: state transition diagram with states 0, 1A, 1B, 2, 3, ...]
Under what condition is this system time-reversible?
• For $n = 2, 3, \ldots$: $\pi_n = \pi_2\left(\lambda/(\mu_A + \mu_B)\right)^{n-2}$
• Global balance equations along the cuts:
$\lambda\pi_0 = \mu_A\pi_{1,A} + \mu_B\pi_{1,B}$
$(\mu_A + \mu_B)\pi_2 = \lambda(\pi_{1,A} + \pi_{1,B})$
$(\mu_A + \lambda)\pi_{1,A} = p\lambda\pi_0 + \mu_B\pi_2$
3-109
M/M/2 queue with heterogeneous servers II
After some manipulation,
$\pi_{1,A} = \pi_0\frac{\lambda}{\mu_A}\cdot\frac{\lambda + p(\mu_A+\mu_B)}{2\lambda + \mu_A + \mu_B}$
$\pi_{1,B} = \pi_0\frac{\lambda}{\mu_B}\cdot\frac{\lambda + (1-p)(\mu_A+\mu_B)}{2\lambda + \mu_A + \mu_B}$
$\pi_2 = \pi_0\frac{\lambda^2}{\mu_A\mu_B}\cdot\frac{\lambda + (1-p)\mu_A + p\mu_B}{2\lambda + \mu_A + \mu_B}$
$\pi_0$ can be determined from $\pi_0 + \pi_{1,A} + \pi_{1,B} + \sum_{n=2}^{\infty}\pi_n = 1$
• If the system is reversible, detailed balance equations can be used instead; with $p = 1/2$:
$(1/2)\lambda\pi_0 = \mu_A\pi_{1,A} \rightarrow \pi_{1,A} = 0.5(\lambda/\mu_A)\pi_0$
$(1/2)\lambda\pi_0 = \mu_B\pi_{1,B} \rightarrow \pi_{1,B} = 0.5(\lambda/\mu_B)\pi_0$
$\pi_2 = 0.5\frac{\lambda^2}{\mu_A\mu_B}\pi_0$
3-110
Multidimensional Markov chains I
Suppose that $X_1(t)$ and $X_2(t)$ are independent reversible MCs
• Then $X(t) = (X_1(t), X_2(t))$ is a reversible MC
• Two independent M/M/1 queues, where the arrival and service rates at queue $i$ are $\lambda_i$ and $\mu_i$
– $(N_1(t), N_2(t))$ forms an MC
[Figure: two-dimensional state transition diagram over states $(n_1, n_2)$, with horizontal rates $\lambda_1, \mu_1$ and vertical rates $\lambda_2, \mu_2$]
– Is this a reversible MC? Verify via the Kolmogorov criterion
3-111
Multidimensional Markov chains II
– Owing to time-reversibility, the detailed balance equations hold:
$\mu_1\pi(n_1+1, n_2) = \lambda_1\pi(n_1, n_2)$
$\mu_2\pi(n_1, n_2+1) = \lambda_2\pi(n_1, n_2)$
– Stationary state distribution:
$\pi(n_1, n_2) = \left(1 - \frac{\lambda_1}{\mu_1}\right)\left(\frac{\lambda_1}{\mu_1}\right)^{n_1}\left(1 - \frac{\lambda_2}{\mu_2}\right)\left(\frac{\lambda_2}{\mu_2}\right)^{n_2}$
• This generalizes to any number of independent queues, e.g., M/M/1, M/M/c or M/M/∞:
$\pi(n_1, n_2, \ldots, n_K) = \pi_1(n_1)\pi_2(n_2)\cdots\pi_K(n_K)$
– a 'product form' distribution
3-112
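A small sketch (hypothetical rates, NumPy assumed) verifying that the product-form guess satisfies the detailed balance equations of two independent M/M/1 queues on a finite grid of states:

```python
import numpy as np

lam1, mu1, lam2, mu2 = 0.5, 1.0, 0.3, 1.0   # hypothetical rates
rho1, rho2 = lam1 / mu1, lam2 / mu2

def pi(n1, n2):
    """Product-form stationary probability of state (n1, n2)."""
    return (1 - rho1) * rho1**n1 * (1 - rho2) * rho2**n2

ok = True
for n1 in range(10):
    for n2 in range(10):
        ok &= np.isclose(mu1 * pi(n1 + 1, n2), lam1 * pi(n1, n2))
        ok &= np.isclose(mu2 * pi(n1, n2 + 1), lam2 * pi(n1, n2))
print("detailed balance holds on the grid:", ok)
```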
Truncation of a Reversible Markov chain I
$X(t)$ is a reversible Markov process with state space $S$ and stationary distribution $\pi_j$ for $j \in S$.
– Truncate it to a set $E \subset S$ such that the resulting chain $Y(t)$ is irreducible. Then $Y(t)$ is reversible and has the stationary distribution
$\hat\pi_j = \frac{\pi_j}{\sum_{k\in E}\pi_k}$ for $j \in E$
– This is the conditional probability that, in steady state, the original process is at state $j$, given that it is somewhere in $E$
Proof: $\hat\pi_j q_{ji} = \hat\pi_i q_{ij} \Rightarrow \frac{\pi_j}{\sum_{k\in E}\pi_k}q_{ji} = \frac{\pi_i}{\sum_{k\in E}\pi_k}q_{ij} \Rightarrow \pi_j q_{ji} = \pi_i q_{ij}$, and $\sum_{k\in E}\hat\pi_k = \frac{\sum_{j\in E}\pi_j}{\sum_{k\in E}\pi_k} = 1$
3-113
Truncation of a Reversible Markov chain II
The Markov processes for M/M/1 and M/M/c are reversible
• State probabilities of the M/M/1/K queue:
$\pi_i = \frac{(1-\rho)\rho^i}{\sum_{i=0}^{K}(1-\rho)\rho^i} = \frac{(1-\rho)\rho^i}{1-\rho^{K+1}}$ for $\rho = \frac{\lambda}{\mu}$
– Truncated version of the M/M/1/∞ queue
• State probabilities of the M/M/c/c queue
– For the M/M/c/∞ queue, with $\rho = \lambda/(c\mu)$ and $a = \lambda/\mu$:
$\pi_n = \pi_0\,\frac{a^{\min(n,c)}}{\min(n,c)!}\,\rho^{\max(0,n-c)}$
– Truncating to at most $c$ customers gives the M/M/c/c queue:
$\hat\pi_n = \pi_n\Big/\sum_{n=0}^{c}\pi_n = \frac{a^n}{n!}\Big/\sum_{i=0}^{c}\frac{a^i}{i!}$
3-114
Truncation of a Reversible Markov chain III
The two independent M/M/1 queues of the previous example now share a common buffer of size $B$ (= 2)
• An arriving customer who finds $B$ customers waiting is blocked
[Figure: truncated two-dimensional state transition diagram for $B = 2$]
• State space: $E = \{(n_1, n_2) : (n_1 - 1)^+ + (n_2 - 1)^+ \le B\}$
• Stationary state distribution of the truncated MC:
$\pi(n_1, n_2) = \pi(0,0)\rho_1^{n_1}\rho_2^{n_2}$ for $(n_1, n_2) \in E$
• $\pi(0,0)$ is obtained from $\pi(0,0) = 1\Big/\sum_{(n_1,n_2)\in E}\rho_1^{n_1}\rho_2^{n_2}$
– The truncation theorem specifies the joint distribution up to the normalization constant; calculating the constant is often tedious
3-115
Truncation of a Reversible Markov chain IV
Two session classes in a circuit-switching system with preferential treatment for one class, for a total of $C$ channels:
• Type 1: Poisson arrivals with rate $\lambda_1$ require exponentially distributed service with rate $\mu_1$ – admissible only up to $K$ sessions
• Type 2: Poisson arrivals with rate $\lambda_2$ require exponentially distributed service with rate $\mu_2$ – can be accepted until all $C$ channels are used up
$S = \{(n_1, n_2) \mid 0 \le n_1 \le K,\ n_1 + n_2 \le C\}$
[Figure: transition diagram for the new-call bounding scheme, cf. IEEE Trans. Veh. Technol., vol. 51, no. 2, Mar. 2002, Fig. 2]
3-116
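The "tedious" normalization step for the shared-buffer example is easy to automate. A sketch with hypothetical loads $\rho_1, \rho_2$ (the per-queue cap $B+1$ follows from the state-space constraint, since at most one customer is in service while $B$ wait):

```python
from itertools import product

rho1, rho2, B = 0.5, 0.3, 2          # hypothetical loads and buffer size
N_MAX = B + 1                        # n_i can be at most B + 1

E = [(n1, n2) for n1, n2 in product(range(N_MAX + 1), repeat=2)
     if max(n1 - 1, 0) + max(n2 - 1, 0) <= B]
G = sum(rho1**n1 * rho2**n2 for n1, n2 in E)
pi00 = 1.0 / G
print(f"{len(E)} states in E, pi(0,0) = {pi00:.4f}")
```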
Truncation of a Reversible Markov chain V
• The state probabilities can be obtained as
$P(n_1, n_2) = \frac{\rho_1^{n_1}}{n_1!}\frac{\rho_2^{n_2}}{n_2!}P(0,0)$ for $0 \le n_1 \le K$, $n_1 + n_2 \le C$, $n_2 \ge 0$
– $P(0,0)$ can be determined from $\sum_{n_1,n_2}P(n_1,n_2) = 1$
• Blocking probability of type 1 (blocked when $n_1 = K$, or when all $C$ channels are busy):
$P_{b1} = \frac{\sum_{n_2=0}^{C-K}\frac{\rho_1^K}{K!}\cdot\frac{\rho_2^{n_2}}{n_2!} + \sum_{n_1=0}^{K-1}\frac{\rho_1^{n_1}}{n_1!}\cdot\frac{\rho_2^{C-n_1}}{(C-n_1)!}}{\sum_{n_1=0}^{K}\frac{\rho_1^{n_1}}{n_1!}\sum_{n_2=0}^{C-n_1}\frac{\rho_2^{n_2}}{n_2!}}$
• Blocking probability of type 2 (blocked only when all $C$ channels are busy):
$P_{b2} = \frac{\sum_{n_1=0}^{K}\frac{\rho_1^{n_1}}{n_1!}\cdot\frac{\rho_2^{C-n_1}}{(C-n_1)!}}{\sum_{n_1=0}^{K}\frac{\rho_1^{n_1}}{n_1!}\sum_{n_2=0}^{C-n_1}\frac{\rho_2^{n_2}}{n_2!}}$
For systems of this kind, the blocking probabilities remain valid for a broad class of holding-time distributions
3-117
Networks of queues
Two queues in tandem (BG, p. 210)
• Assume that the service time is proportional to the packet length
[Figure: two queues in tandem; queue 2 is empty when a long packet arrives at queue 1, then receives a burst of closely spaced arrivals]
Arrivals at queue 2 become bursty
– Interarrival times at the second queue are strongly correlated with the packet length at the first queue, i.e., with its service time!
• The first queue is an M/M/1, but the second queue cannot be considered an M/M/1
3-118
Kleinrock's Independence Approximation I
In real networks, many queues interact with each other: a traffic stream departing from one or more queues enters one or more other queues, possibly after merging with other streams departing from yet other queues
• Packet interarrival times are correlated with packet lengths
• Service times at the various queues are not independent, e.g., under state-dependent flow control
Kleinrock's independence approximation:
• The M/M/1 queueing model works for each link: merging several packet streams on a transmission line makes interarrival times and packet lengths approximately independent
• A good approximation when:
* Poisson arrivals at the entry points of the network
* Packet transmission times 'nearly' exponential
* Several packet streams merged on each link
* Densely connected network and moderate-to-heavy traffic load
3-119
Kleinrock's Independence Approximation II
Suppose several packet streams, each following a unique path through the network (appropriate for virtual-circuit networks, e.g., ATM)
• $x_s$: arrival rate of packet stream $s$
• $f_{ij}(s)$: fraction of the packets of stream $s$ crossing link $(i,j)$
• Total arrival rate at link $(i,j)$:
$\lambda_{ij} = \sum_{\text{all packet streams } s \text{ crossing link } (i,j)} f_{ij}(s)\,x_s$
3-120
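The blocking probabilities of the two-class system above are straightforward to evaluate numerically; the sketch below uses hypothetical values of $C$, $K$, $\rho_1$, $\rho_2$:

```python
from math import factorial

C, K = 10, 4                 # hypothetical channels and type-1 cap
rho1, rho2 = 2.0, 5.0        # hypothetical offered loads (Erlangs)

def term(n1, n2):
    return rho1**n1 / factorial(n1) * rho2**n2 / factorial(n2)

G = sum(term(n1, n2) for n1 in range(K + 1) for n2 in range(C - n1 + 1))

# Type 1 is blocked when n1 = K, or when all C channels are busy.
pb1 = (sum(term(K, n2) for n2 in range(C - K + 1))
       + sum(term(n1, C - n1) for n1 in range(K))) / G
# Type 2 is blocked only when all C channels are busy.
pb2 = sum(term(n1, C - n1) for n1 in range(K + 1)) / G
print(f"Pb1 = {pb1:.4f}, Pb2 = {pb2:.4f}")
```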
Kleinrock's Independence Approximation III
Based on the M/M/1 model (with Kleinrock's independence approximation), the average number of packets in queue or in service at link $(i,j)$ is
$N_{ij} = \frac{\lambda_{ij}}{\mu_{ij} - \lambda_{ij}}$
– $1/\mu_{ij}$ is the average packet transmission time on link $(i,j)$
• The average number of packets over all queues and the average delay per packet are
$N = \sum_{(i,j)}N_{ij}$ and $T = \frac{1}{\gamma}\sum_{(i,j)}N_{ij}$
– $\gamma = \sum_s x_s$: total arrival rate into the system
• If the average processing and propagation delay $d_{ij}$ at link $(i,j)$ is not negligible, the average delay of a stream traversing a path $p$ generalizes to
$T_p = \sum_{\text{all }(i,j)\text{ on path }p}\Big(\underbrace{\frac{\lambda_{ij}}{\mu_{ij}(\mu_{ij}-\lambda_{ij})}}_{\text{queueing delay}} + \frac{1}{\mu_{ij}} + d_{ij}\Big)$
where the three terms are the average waiting time in queue, the average transmission time, and the processing and propagation delay (BG, Sec. 3.6)
3-121
Kleinrock's Independence Approximation IV
In datagram networks involving multiple-path routing for some origin-destination pairs, the accuracy of the M/M/1 approximation deteriorates for another reason (BG, Example 3.17):
• Node A sends traffic to node B along two links, each with service rate $\mu$ [Figure: BG, Fig. 3.29]. Packets arrive at A according to a Poisson process with rate $\lambda$ packets/sec; packet transmission times are exponentially distributed and independent of interarrival times, as in the M/M/1 system. Assume that the arriving traffic is to be divided equally between the two links. How should this division be implemented? Consider the following possibilities.
– Random splitting: the queue at each link behaves like an M/M/1 queue:
$T_R = \frac{1}{\mu - \lambda/2}$
– Metering: each arriving packet is assigned to the queue with the smallest backlog, so the whole system is approximately an M/M/2 queue with a common buffer (with $\rho = \lambda/(2\mu)$):
$T_M = \frac{2}{(2\mu-\lambda)(1+\rho)} < T_R$
* Metering reduces delay, but it destroys the per-link M/M/1 approximation
3-122
Burke's theorem I
For M/M/1, M/M/c, and M/M/∞ systems with arrival rate $\lambda$ (without bulk arrivals or bulk service):
B1. The departure process is Poisson with rate $\lambda$.
[Figure: forward and reverse processes; the arrivals of the forward process are the departures of the reverse process]
• The arrival process in the forward process corresponds to the departure process in the reverse process
• Since the arrivals in forward time form a Poisson process, the departures in backward time form a Poisson process
• Since the backward process is statistically identical to the forward process, the (forward) departure process is Poisson
3-123
Burke's theorem II
B2. The state (number of packets in the system) left behind by a departure in the forward process is independent of the past departures.
[Figure: a departure prior to time $t$ in the forward process corresponds to an arrival after $t$ in the reverse process]
– In the reverse process, the state is independent of future arrivals: future arrivals do not depend on the current number in the system. Hence, in the forward process, the state left by a departure is independent of the earlier departures.
3-124
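Returning to the splitting-vs-metering comparison above, a numerical sketch with hypothetical $\lambda$ and $\mu$:

```python
lam, mu = 1.5, 1.0          # hypothetical total arrival and per-link rates
rho = lam / (2 * mu)
assert rho < 1              # stability

T_R = 1.0 / (mu - lam / 2)                # random splitting: M/M/1 per link
T_M = 2.0 / ((2 * mu - lam) * (1 + rho))  # metering: M/M/2 approximation
print(f"T_R = {T_R:.3f}, T_M = {T_M:.3f}")   # 4.000 vs 2.286: metering wins
```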
Two M/M/1 Queues in Tandem
The service times of a customer at the first and the second queues are mutually independent, as well as independent of the arrival process.
[Figure: queue 1 feeding queue 2]
• Based on Burke's theorem B1, queue 2 in isolation is an M/M/1 queue:
$\Pr[m \text{ at queue 2}] = \rho_2^m(1-\rho_2)$
• B2: the number of customers presently in queue 1 is independent of the sequence of departure times prior to $t$ (the earlier arrivals at queue 2), and hence independent of the number of customers presently in queue 2:
$\Pr[n \text{ at queue 1 and } m \text{ at queue 2}] = \Pr[n \text{ at queue 1}]\Pr[m \text{ at queue 2}] = \rho_1^n(1-\rho_1)\rho_2^m(1-\rho_2)$
3-125
Open queueing networks
Consider a network of $K$ first-come first-served, single-server queues, each of which has unlimited queue size and exponentially distributed service times with rate $\mu_k$.
[Figure: network of queues with external arrivals and routing paths]
• Traffic equations with routing probabilities $p_{ij}$, or matrix $P = [p_{ij}]$:
$\lambda_i = \alpha_i + \sum_{j=1}^{K}\lambda_j p_{ji}$, with $\sum_{i=0}^{K}p_{ji} = 1$
– $p_{i0}$: probability of the flow leaving the network from queue $i$
– the $\lambda_i$ can be uniquely determined by solving
$\lambda = \alpha + \lambda P \Rightarrow \lambda = \alpha(I-P)^{-1}$
3-126
Open queueing networks II
Let $n = (n_1, \ldots, n_K)$ denote a state (row) vector of the network, and let
$\pi(n) = \lim_{t\to\infty}\Pr[X_1(t) = n_1, \ldots, X_K(t) = n_K]$
be the limiting queue length distribution.
Global balance equation (GBE): total rate out of $n$ = total rate into $n$:
$\Big(\alpha + \sum_{i=1}^{K}\mu_i\Big)\pi(n) = \underbrace{\sum_{i=1}^{K}\alpha_i\pi(n-e_i)}_{\text{external arrivals}} + \underbrace{\sum_{i=1}^{K}p_{i0}\mu_i\pi(n+e_i)}_{\text{go outside from }i} + \underbrace{\sum_{i=1}^{K}\sum_{j=1}^{K}p_{ji}\mu_j\pi(n+e_j-e_i)}_{\text{from }j\text{ to }i}$
– $e_i = (0, \ldots, 1, \ldots, 0)$, i.e., the 1 is in the $i$th position
– $\pi(n-e_i)$ denotes $\pi(n_1, n_2, \ldots, n_i-1, \ldots, n_K)$
3-127
Jackson's theorem I
Using time-reversibility, guess detailed balance equations (DBEs), by analogy with birth-death processes:
$\lambda_i\pi(n-e_i) = \mu_i\pi(n)$, $\lambda_i\pi(n) = \mu_i\pi(n+e_i)$, and $\lambda_j\pi(n-e_i) = \mu_j\pi(n+e_j-e_i)$
Substituting the DBEs into the GBE gives
$\text{RHS} = \pi(n)\Big(\sum_{i=1}^{K}\frac{\alpha_i\mu_i}{\lambda_i} + \sum_{i=1}^{K}p_{i0}\lambda_i + \sum_{i=1}^{K}\sum_{j=1}^{K}\frac{p_{ji}\lambda_j}{\lambda_i}\mu_i\Big) = \pi(n)\Big(\underbrace{\sum_{i=1}^{K}p_{i0}\lambda_i}_{=\alpha} + \sum_{i=1}^{K}\underbrace{\frac{\alpha_i + \sum_{j=1}^{K}p_{ji}\lambda_j}{\lambda_i}}_{=1}\mu_i\Big)$
– in the numerator, the traffic equation $\lambda_i = \alpha_i + \sum_{j=1}^{K}p_{ji}\lambda_j$ applies, so the RHS equals the LHS
3-128
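A sketch (NumPy assumed) of solving the traffic equations for a hypothetical 3-queue open network, together with the resulting M/M/1 performance measures:

```python
import numpy as np

alpha = np.array([1.0, 0.5, 0.0])     # hypothetical external arrival rates
P = np.array([[0.0, 0.7, 0.2],        # hypothetical routing matrix; each
              [0.0, 0.0, 0.8],        # row sums to < 1, the remainder
              [0.1, 0.0, 0.0]])       # leaves the network (p_i0)

lam = alpha @ np.linalg.inv(np.eye(3) - P)   # lambda = alpha (I - P)^(-1)
mu = np.array([3.0, 3.0, 3.0])               # hypothetical service rates
rho = lam / mu
assert (rho < 1).all()                       # stability

N = rho / (1 - rho)                   # mean number in each M/M/1 queue
T = N.sum() / alpha.sum()             # mean system response time
print("lambda =", lam, "\nrho =", rho, "\nT =", T)
```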
Jackson's theorem II
From the DBEs, we have
$\pi(n_1, \ldots, n_i, \ldots, n_K) = \frac{\lambda_i}{\mu_i}\pi(n_1, \ldots, n_i-1, \ldots, n_K)$
and
$\pi(n_1, \ldots, n_i-1, \ldots, n_K) = \frac{\lambda_i}{\mu_i}\pi(n_1, \ldots, n_i-2, \ldots, n_K)$,
which telescopes to
$\pi(n_1, \ldots, n_i, \ldots, n_K) = \Big(\frac{\lambda_i}{\mu_i}\Big)^{n_i}\pi(n_1, \ldots, 0, \ldots, n_K)$
Repeating for $i = 1, 2, \ldots, K$:
$\pi(n) = \pi(0)\prod_{i=1}^{K}\Big(\frac{\lambda_i}{\mu_i}\Big)^{n_i}$
– $\pi(0)^{-1} = \prod_{i=1}^{K}\sum_{n_i=0}^{\infty}\rho_i^{n_i} = \prod_{i=1}^{K}\frac{1}{1-\rho_i}$ with $\rho_i = \lambda_i/\mu_i$, i.e., $\pi(0) = \prod_{i=1}^{K}(1-\rho_i)$
3-129
Jackson's theorem: proof of DBEs I
Proving the DBEs based on time-reversibility:
• Construct a routing matrix $P^* = [p_{ij}^*]$ for the reversed process
• The rate from node $i$ to $j$ must be the same in the forward and reverse directions:
$\lambda_i p_{ij}$ (forward process) $= \lambda_j p_{ji}^*$ (reverse process)
– $\lambda_j p_{ji}^*$: the output rate of server $j$ is $\lambda_j$, and $p_{ji}^*$ is the probability of moving from $j$ to $i$; also $\alpha_i^* = \lambda_i p_{i0}$ and $p_{i0}^* = \alpha_i/\lambda_i$
We need to show (recall $\theta_i\gamma_{ij} = \theta_j\gamma_{ji}^*$):
$\pi(n)v_{n,m} = \pi(m)v_{m,n}^*$ and $\sum_m v_{n,m} = \sum_m v_{n,m}^*$
– $v_{n,m}$ and $v_{n,m}^*$ denote the state transition rates of the forward and reversed processes
3-130
Jackson's theorem: proof of DBEs II
We need to consider the following three cases:
• An arrival to server $i$ from outside the network in the forward process corresponds to a departure out of the network from server $i$ in the reversed process:
$\pi(n)v_{n,n+e_i} = \pi(n+e_i)v_{n+e_i,n}^*$
• A departure to the outside in the forward process corresponds to an arrival from the outside in the reversed process:
$\pi(n)v_{n,n-e_i} = \pi(n-e_i)v_{n-e_i,n}^*$
• Leaving queue $i$ and joining queue $j$ in the forward process ($v_{n,n-e_i+e_j} = \mu_i p_{ij}$) corresponds to leaving queue $j$ and joining queue $i$ in the reversed process ($v_{n-e_i+e_j,n}^* = \mu_j p_{ji}^* = \lambda_i p_{ij}\mu_j/\lambda_j$):
$\pi(n)v_{n,n-e_i+e_j} = \pi(n-e_i+e_j)v_{n-e_i+e_j,n}^*$
3-131
Jackson's theorem: proof of DBEs III
1) $\pi(n)v_{n,n+e_i} = \pi(n+e_i)v_{n+e_i,n}^*$: an arrival to server $i$ from outside the network in the forward process corresponds to a departure out of the network from server $i$ in the reversed process, i.e.,
$v_{n+e_i,n}^* = \mu_i\underbrace{\Big(1 - \sum_{j=1}^{K}p_{ij}^*\Big)}_{p_{i0}^*}$ (use $p_{ij}^* = \lambda_j p_{ji}/\lambda_i$)
$= \mu_i\Big(1 - \sum_{j=1}^{K}\frac{\lambda_j p_{ji}}{\lambda_i}\Big) = \frac{\mu_i}{\lambda_i}\underbrace{\Big(\lambda_i - \sum_{j=1}^{K}\lambda_j p_{ji}\Big)}_{=\alpha_i} = \alpha_i/\rho_i \ (= v_{n,n-e_i}^*)$
Substituting this into 1), with $v_{n,n+e_i} = \alpha_i$ (arrival to server $i$ from outside):
$\alpha_i\prod_{k=1}^{K}\pi_k(n_k) = \pi_i(n_i+1)\prod_{j=1,j\neq i}^{K}\pi_j(n_j)\cdot\alpha_i/\rho_i$
3-132
Jackson's theorem: proof of DBEs IV
Rearranging the previous equation yields
$\pi_i(n_i)\alpha_i\prod_{j=1,j\neq i}^{K}\pi_j(n_j) = \pi_i(n_i+1)(\alpha_i/\rho_i)\prod_{j=1,j\neq i}^{K}\pi_j(n_j)$
After canceling, we have
$\pi_i(n_i+1) = \rho_i\pi_i(n_i) \Rightarrow \pi_i(n) = \rho_i^n(1-\rho_i)$
2) $\pi(n)v_{n,n-e_i} = \pi(n-e_i)v_{n-e_i,n}^*$: a departure to the outside in the forward process corresponds to an arrival from the outside in the reversed process:
$v_{n-e_i,n}^* = \alpha_i^* = \lambda_i - \sum_{j=1}^{K}\lambda_j p_{ji}^*$ (traffic equation for the reversed process)
$= \lambda_i - \sum_{j=1}^{K}\lambda_j\frac{\lambda_i p_{ij}}{\lambda_j} = \lambda_i\Big(1 - \sum_{j=1}^{K}p_{ij}\Big) = \lambda_i p_{i0} \ (= v_{n,n+e_i}^*)$
3-133
Jackson's theorem: proof of DBEs V
Substituting this with $v_{n,n-e_i} = \mu_i p_{i0}$ (departure to the outside):
$(1-\rho_i)\rho_i^{n_i}\prod_{k=1,k\neq i}^{K}\pi_k(n_k)\,\mu_i p_{i0} = (1-\rho_i)\rho_i^{n_i-1}\prod_{k=1,k\neq i}^{K}\pi_k(n_k)\,\lambda_i p_{i0}$
3) $\pi(n)v_{n,n-e_i+e_j} = \pi(n-e_i+e_j)v_{n-e_i+e_j,n}^*$: leaving queue $i$ and joining queue $j$ in the forward process ($v_{n,n-e_i+e_j} = \mu_i p_{ij}$) corresponds to leaving queue $j$ and joining queue $i$ in the reversed process, i.e., $v_{n-e_i+e_j,n}^* = \mu_j p_{ji}^* = \lambda_i p_{ij}\mu_j/\lambda_j$:
$(1-\rho_i)\rho_i^{n_i}(1-\rho_j)\rho_j^{n_j}\prod_{k\neq i,j}\pi_k(n_k)\,\mu_i p_{ij} = (1-\rho_i)\rho_i^{n_i-1}(1-\rho_j)\rho_j^{n_j+1}\prod_{k\neq i,j}\pi_k(n_k)\,\mu_j p_{ji}^*$ (use $p_{ji}^* = \lambda_i p_{ij}/\lambda_j$)
3-134
Jackson's theorem: proof of DBEs VI
Summary of the transition rates of the forward and reverse processes:

Transition | Forward $v_{n,m}$ | Reverse $v_{n,m}^*$ | Comment
$n \to n+e_i$ | $\alpha_i$ | $\lambda_i(1-\sum_{j=1}^{K}p_{ij})$ | all $i$
$n \to n-e_i$ | $\mu_i(1-\sum_{j=1}^{K}p_{ij})$ | $\alpha_i\mu_i/\lambda_i$ | all $i$: $n_i > 0$
$n \to n-e_i+e_j$ | $\mu_i p_{ij}$ | $\lambda_j p_{ji}\mu_i/\lambda_i$ | all $i$: $n_i > 0$, all $j$

4) Finally, we verify the total rate equation $\sum_m v_{n,m} = \sum_m v_{n,m}^*$:
$\sum_m v_{n,m}^* = \sum_i\lambda_i\Big(1-\sum_j p_{ij}\Big) + \sum_{i:n_i>0}\Big(\frac{\alpha_i\mu_i}{\lambda_i} + \sum_j\frac{\lambda_j p_{ji}\mu_i}{\lambda_i}\Big)$
$= \sum_i\lambda_i - \sum_j\underbrace{(\lambda_j - \alpha_j)}_{\lambda_j = \alpha_j + \sum_i\lambda_i p_{ij}} + \sum_{i:n_i>0}\Big(\frac{\alpha_i\mu_i}{\lambda_i} + \frac{\mu_i}{\lambda_i}(\lambda_i - \alpha_i)\Big) = \sum_i\alpha_i + \sum_{i:n_i>0}\mu_i = \sum_m v_{n,m}$
3-135
Open queueing networks: Extension I
The product-form solution of Jackson's theorem is valid for the following generalizations of the network of queues:
• State-dependent service rates
– $1/\mu_i(n_i)$: the mean of queue $i$'s exponentially distributed service time when $n_i$ is the number of customers in the $i$th queue
$\rho_i(n_i) = \frac{\lambda_i}{\mu_i(n_i)}$, $i = 1, \ldots, K$, $n_i = 1, 2, \ldots$
– $\lambda_i$: total arrival rate at queue $i$, determined by the traffic equations
– Define $\hat P_j(n_j)$ as
$\hat P_j(n_j) = \begin{cases}1, & \text{if } n_j = 0\\ \rho_j(1)\rho_j(2)\cdots\rho_j(n_j), & \text{if } n_j > 0\end{cases}$
3-136
Open queueing networks: Extension II
– For every state $n = (n_1, \ldots, n_K)$:
$P(n) = \frac{\hat P_1(n_1)\hat P_2(n_2)\cdots\hat P_K(n_K)}{G}$, where $G = \sum_{n_1=0}^{\infty}\cdots\sum_{n_K=0}^{\infty}\hat P_1(n_1)\cdots\hat P_K(n_K)$
• Multiple classes of customers
– Provided that the service time distribution at each queue is the same for all customer classes, the product-form solution is valid for the system with different classes of customers, with per-class traffic equations:
$\lambda_j(c) = \alpha_j(c) + \sum_{i=1}^{K}\lambda_i(c)p_{ij}(c)$
– $\alpha_j(c)$: rate of the external arrivals of class $c$ at queue $j$; $p_{ij}(c)$: routing probabilities of class $c$
– See pp. 230-231 in the textbook for more details
3-137
Open queueing networks: Performance measures
• The state probability distribution has been derived
• Mean number of hops traversed: $\bar h = \frac{\lambda}{\alpha} = \frac{\sum_{i=1}^{K}\lambda_i}{\sum_{i=1}^{K}\alpha_i}$
• Throughput of queue $i$: $\lambda_i$
• Total throughput of the queueing network: $\alpha$
• Mean number of customers at queue $i$ ($\rho_i = \lambda_i/\mu_i$): $N_i = \rho_i/(1-\rho_i)$
• System response time $T$:
$T = \frac{N}{\alpha} = \frac{1}{\alpha}\sum_{i=1}^{K}N_i = \frac{1}{\alpha}\sum_{i=1}^{K}\lambda_i T_i = \frac{1}{\alpha}\sum_{i=1}^{K}\frac{\lambda_i}{\mu_i - \lambda_i}$
3-138
Open queueing networks: example A-I
New programs arrive at a CPU according to a Poisson process of rate $\alpha$. A program spends an exponentially distributed execution time of mean $1/\mu_1$ in the CPU. At the end of this service time, the program execution is complete with probability $p$, or it requires retrieving additional information from secondary storage with probability $1-p$. Suppose that the retrieval of information from secondary storage requires an exponentially distributed amount of time with mean $1/\mu_2$. Find the mean time that each program spends in the system.
3-139
Open queueing networks: example A-II
Find the mean arrival rates:
• Arrival rate into each queue: $\lambda_1 = \alpha + \lambda_2$ and $\lambda_2 = (1-p)\lambda_1$, giving
$\lambda_1 = \alpha/p$ and $\lambda_2 = (1-p)\alpha/p$
• Each queue behaves like an M/M/1 system, so
$E[N_1] = \frac{\rho_1}{1-\rho_1}$ and $E[N_2] = \frac{\rho_2}{1-\rho_2}$, where $\rho_1 = \lambda_1/\mu_1$ and $\rho_2 = \lambda_2/\mu_2$
Using Little's result, the total time spent in the system is
$E[T] = \frac{E[N_1 + N_2]}{\alpha} = \frac{1}{\alpha}\Big(\frac{\rho_1}{1-\rho_1} + \frac{\rho_2}{1-\rho_2}\Big)$
3-140
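A sketch of example A with hypothetical parameter values:

```python
alpha = 2.0            # hypothetical program arrival rate (per sec)
mu1, mu2 = 10.0, 5.0   # hypothetical CPU and storage service rates
p = 0.5                # hypothetical completion probability

lam1 = alpha / p                 # traffic equations solved in closed form
lam2 = (1 - p) * alpha / p
rho1, rho2 = lam1 / mu1, lam2 / mu2
assert rho1 < 1 and rho2 < 1     # stability

ET = (rho1 / (1 - rho1) + rho2 / (1 - rho2)) / alpha
print(f"mean time in system E[T] = {ET:.3f} sec")
```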
Open queueing networks: example B-I
Recall $N = \sum_{i=1}^{M}N_i = \sum_{i=1}^{M}\frac{\rho_i}{1-\rho_i}$ and $T = \frac{N}{\gamma} = \sum_{i=1}^{M}\big(\frac{\lambda_i}{\gamma}\big)T_i$
Consider the following network with three routers:
[Figure: routers A, B, C with links L1 (A→B), L2 (A→C), L3 (B→C), L4 (C→A), and external arrivals γA, γB, γC]
• External packet arrivals: Poisson processes with γA = 350 (packets/sec), γB = 150, γC = 150
• Packet length: exponentially distributed with mean 50 (kbits/packet)
Assumptions:
(a) Packets moving along a path from source to destination have their lengths selected independently at each outgoing link → Kleinrock's independence assumption
(b) Channel capacity of link $i$: $C_i$ = 17.5 Mbps for $i = 1, 2, 3, 4$ → service at link $i$ is exponentially distributed with rate $\mu_i = C_i/50000 = 350$ packets/sec
3-141
Open queueing networks: example B-II
• Traffic matrix (packets per second):

from → to | A | B | C
A | – | 150 | 200 (50% through B, 50% directly to C)
B | 50 | – | 100
C | 100 | 50 | –

• Find the mean delay from A to C
• First, we need the link traffic:

traffic type | L1 | L2 | L3 | L4
A → B | 150 | | |
A → C | 100 | 100 | 100 |
B → A (via C) | | | 50 | 50
B → C | | | 100 |
C → A | | | | 100
C → B (via A) | 50 | | | 50
total | λ1 = 300 | λ2 = 100 | λ3 = 250 | λ4 = 200

3-142
Open queueing networks: example B-III
• Since $\alpha = 650$ and $\lambda = 850$, the mean number of hops is $\bar h = 850/650 = 1.31$
• Link utilization, mean number of packets, and response time per link, using $N_i = \rho_i/(1-\rho_i) = \lambda_i/(\mu_i - \lambda_i)$ and $T_i = N_i/\lambda_i = 1/(\mu_i - \lambda_i)$:

| L1 | L2 | L3 | L4
ρi | 300/350 = 0.857 | 100/350 = 0.286 | 250/350 = 0.714 | 200/350 = 0.571
Ni | 300/50 = 6 | 100/250 = 0.4 | 250/100 = 2.5 | 200/150 = 1.33
Ti | 1/50 = 20 ms | 1/250 = 4 ms | 1/100 = 10 ms | 1/150 = 6.7 ms

• Mean delay from A to C (half the traffic goes via B over L1 and L3, half directly over L2):
$T_{AC} = 0.5\times(T_1 + T_3) + 0.5\times T_2 = 0.5\times 30\text{ ms} + 0.5\times 4\text{ ms} = 17$ ms
– propagation delay is ignored
3-143
Closed queueing networks I
Consider a network of $K$ first-come first-served, single-server queues, each of which has unlimited queue size and exponentially distributed service times with rate $\mu_k$. A fixed number of customers, say $M$, circulate endlessly in this closed network of queues.
• Traffic equations (no external arrivals!):
$\lambda_i = \sum_{j=1}^{K}\lambda_j p_{ji}$, with $\sum_{i=1}^{K}p_{ji} = 1$
3-144
Closed queueing networks II
• Using $\pi = \pi P$ and $\pi\cdot\mathbf{1} = 1$, we have $\lambda_i = \lambda(M)\pi_i$
– $\lambda(M)$: a constant of proportionality, the sum of the arrival rates over all the queues in the network
– e.g., choosing $\lambda(M) = 1$ normalizes $\sum_{i=1}^{K}\lambda_i = 1$
With $\rho_i = \lambda_i/\mu_i$ for $i = 1, \ldots, K$, we have for all $n_i \ge 0$:
$\pi(n) = \frac{1}{G(M)}\prod_{i=1}^{K}\rho_i^{n_i}$, where $G(M) = \sum_{n_1+\cdots+n_K=M}\prod_{i=1}^{K}\rho_i^{n_i}$
• $\rho_i$ is no longer the actual utilization, since $\lambda(M)$ is only a relative constant
• Setting $\lambda(M)$ to a particular value does not change the results
• Since there are $M$ customers, the maximum queue size of each queue is $M$
3-145
Closed queueing networks III
Proof: as in Jackson's theorem for open queueing networks
• Use time-reversibility: construct the routing matrix of the reversed process
• For a state transition between $n$ and $n' = n - e_i + e_j$:
$\pi(n')v_{n',n}^* = \pi(n)v_{n,n'}$ (*)
• As in open queueing networks, we have
$v_{n-e_i+e_j,n}^* = \mu_j p_{ji}^* = \mu_j(\lambda_i p_{ij}/\lambda_j)$ and $v_{n,n-e_i+e_j} = \mu_i p_{ij}$ for $n_i > 0$
• Substituting these into (*), we have
$\rho_i\pi(n_1, \ldots, n_i-1, \ldots, n_j+1, \ldots, n_K) = \rho_j\pi(n_1, \ldots, n_K)$
• The proof of $\sum_m v_{n,m} = \sum_m v_{n,m}^*$ is given on page 235 of the textbook
3-146
Closed queueing networks IV
Computing $G(M)$ for $M$ customers and $K$ queues iteratively:
$G(m,k) = G(m,k-1) + \rho_k G(m-1,k)$
with boundary conditions $G(m,1) = \rho_1^m$ for $m = 0, 1, \ldots, M$, and $G(0,k) = 1$ for $k = 1, 2, \ldots, K$
• For $m > 0$ and $k > 1$, split the sum into two disjoint sums:
$G(m,k) = \sum_{n_1+\cdots+n_k=m}\rho_1^{n_1}\rho_2^{n_2}\cdots\rho_k^{n_k}$
$= \underbrace{\sum_{n_1+\cdots+n_k=m,\ n_k=0}\rho_1^{n_1}\cdots\rho_{k-1}^{n_{k-1}}}_{G(m,k-1)} + \sum_{n_1+\cdots+n_k=m,\ n_k>0}\rho_1^{n_1}\cdots\rho_k^{n_k}$
3-147
Closed queueing networks V
• Since $n_k > 0$, substitute $n_k = n_k' + 1$ with $n_k' \ge 0$:
$\sum_{n_1+\cdots+n_k=m,\ n_k>0}\rho_1^{n_1}\cdots\rho_k^{n_k} = \rho_k\sum_{n_1+\cdots+n_k'=m-1,\ n_k'\ge 0}\rho_1^{n_1}\cdots\rho_k^{n_k'} = \rho_k G(m-1,k)$
In a closed Jackson network with $M$ customers, the steady-state probability that the number of customers at station $j$ is greater than or equal to $m$ is
$\Pr[x_j \ge m] = \rho_j^m\frac{G(M-m)}{G(M)}$ for $0 \le m \le M$
3-148
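A sketch of the $G(m,k)$ recursion (often called Buzen's convolution algorithm) for hypothetical relative loads; it also evaluates $\Pr[x_j \ge 1]$ and the mean queue lengths from the formulas above (NumPy assumed):

```python
import numpy as np

rho = np.array([1.0, 0.5, 0.8])   # hypothetical relative loads lambda_i/mu_i
M, K = 4, len(rho)                # customers, queues

G = np.zeros((M + 1, K))          # G[m, k-1] = G(m, k)
G[0, :] = 1.0                     # G(0, k) = 1
G[:, 0] = rho[0] ** np.arange(M + 1)   # G(m, 1) = rho_1^m
for k in range(1, K):
    for m in range(1, M + 1):
        G[m, k] = G[m, k - 1] + rho[k] * G[m - 1, k]

GM, GM1 = G[M, K - 1], G[M - 1, K - 1]
busy = rho * GM1 / GM                       # Pr[x_j >= 1]
N = [sum(r**m * G[M - m, K - 1] for m in range(1, M + 1)) / GM for r in rho]
print("G(M) =", GM, "\nPr[x_j >= 1] =", busy, "\nN_j(M) =", N)
```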
Closed queueing networks VI
• Proof: substitute $n_j = n_j' + m$ with $n_j' \ge 0$:
$\Pr[x_j \ge m] = \frac{1}{G(M)}\sum_{n_1+\cdots+n_j+\cdots+n_K=M,\ n_j\ge m}\rho_1^{n_1}\cdots\rho_j^{n_j}\cdots\rho_K^{n_K}$
$= \frac{1}{G(M)}\sum_{n_1+\cdots+(n_j'+m)+\cdots+n_K=M,\ n_j'\ge 0}\rho_1^{n_1}\cdots\rho_j^{n_j'+m}\cdots\rho_K^{n_K}$
$= \frac{\rho_j^m}{G(M)}\sum_{n_1+\cdots+n_j'+\cdots+n_K=M-m,\ n_j'\ge 0}\rho_1^{n_1}\cdots\rho_j^{n_j'}\cdots\rho_K^{n_K} = \rho_j^m\frac{G(M-m)}{G(M)}$
• $\Pr[x_j = m] = \Pr[x_j \ge m] - \Pr[x_j \ge m+1] = \rho_j^m\big(G(M-m) - \rho_j G(M-m-1)\big)/G(M)$
3-149
Closed queueing networks VII
In a closed Jackson network with $M$ customers, the average number of customers at queue $j$ is
$N_j(M) = \sum_{m=1}^{M}\Pr[x_j \ge m] = \sum_{m=1}^{M}\rho_j^m\frac{G(M-m)}{G(M)}$
In a closed Jackson network with $M$ customers, the average throughput of queue $j$ is
$\gamma_j(M) = \mu_j\Pr[x_j \ge 1] = \mu_j\rho_j\frac{G(M-1)}{G(M)} = \lambda_j\frac{G(M-1)}{G(M)}$
– Average throughput is the average rate at which customers are serviced in the queue. For a single-server queue the service rate is $\mu_j$ when there are one or more customers in the queue, and 0 when the queue is empty
3-150
Closed queueing networks: example I
Suppose that the computer system given in the open queueing network is now operated so that there are always $I$ programs in the system. Note that the feedback loop around the CPU signifies the completion of one job and its instantaneous replacement by another one. Find the steady-state pmf of the system. Find the rate at which programs are completed.
• Using $\lambda_i = \lambda(I)\pi_i$ with $\pi = \pi P$:
$\pi_1 = p\pi_1 + \pi_2$, $\pi_2 = (1-p)\pi_1$ and $\pi_1 + \pi_2 = 1$,
we have
$\lambda_1 = \lambda(I)\pi_1 = \frac{\lambda(I)}{2-p}$ and $\lambda_2 = \lambda(I)\pi_2 = \frac{\lambda(I)(1-p)}{2-p}$
3-151
Closed queueing networks: example II
• For $0 \le i \le I$, with $\rho_1 = \lambda_1/\mu_1$ and $\rho_2 = \lambda_2/\mu_2$:
$\Pr[N_1 = i, N_2 = I-i] = \frac{(1-\rho_1)\rho_1^i(1-\rho_2)\rho_2^{I-i}}{S(I)}$
• The normalization constant $S(I)$ is obtained from
$S(I) = (1-\rho_1)(1-\rho_2)\sum_{i=0}^{I}\rho_1^i\rho_2^{I-i} = (1-\rho_1)(1-\rho_2)\rho_2^I\frac{1-(\rho_1/\rho_2)^{I+1}}{1-(\rho_1/\rho_2)}$
• We then have, for $0 \le i \le I$,
$\Pr[N_1 = i, N_2 = I-i] = \frac{1-\beta}{1-\beta^{I+1}}\beta^i$, where $\beta = \rho_1/\rho_2 = \mu_2/((1-p)\mu_1)$
• The program completion rate is $p\lambda_1$, where the CPU utilization gives $\lambda_1/\mu_1 = \Pr[N_1 > 0] = 1 - \Pr[N_1 = 0]$
3-152
Arrival theorem for closed networks I
Theorem: In a closed Jackson network with $M$ customers, the occupancy distribution seen by a customer upon arrival at queue $j$ is the same as the occupancy distribution in a closed network with the arriving customer removed
• In a closed network with $M$ customers, the expected number of customers found upon arrival by a customer at queue $j$ is equal to the average number of customers at queue $j$ when the total number of customers in the closed network is $M-1$
• An arriving customer sees the system at a state that does not include itself
Proof:
• $X(t) = [X_1(t), X_2(t), \ldots, X_K(t)]$: state of the network at time $t$
• $T_{ij}(t)$: the event that a customer moves from queue $i$ to $j$ at time $t^+$
3-153
Arrival theorem for closed networks II
• For any state $n$ with $n_i > 0$, the conditional probability that a customer moving from node $i$ to $j$ finds the network at state $n$ is
$\alpha_{ij}(n) = \Pr[X(t) = n \mid T_{ij}(t)] = \frac{\Pr[X(t) = n, T_{ij}(t)]}{\Pr[T_{ij}(t)]} = \frac{\Pr[T_{ij}(t) \mid X(t) = n]\Pr[X(t) = n]}{\sum_{m:m_i>0}\Pr[T_{ij}(t) \mid X(t) = m]\Pr[X(t) = m]}$
$= \frac{\pi(n)\mu_i p_{ij}}{\sum_{m:m_i>0}\pi(m)\mu_i p_{ij}} = \frac{\rho_1^{n_1}\cdots\rho_i^{n_i}\cdots\rho_K^{n_K}}{\sum_{m:m_i>0}\rho_1^{m_1}\cdots\rho_i^{m_i}\cdots\rho_K^{m_K}}$
– Substituting $m_i = m_i' + 1$ with $m_i' \ge 0$ in the denominator:
$\alpha_{ij}(n) = \frac{\rho_1^{n_1}\cdots\rho_i^{n_i}\cdots\rho_K^{n_K}}{\rho_i\sum_{m_1+\cdots+m_i'+\cdots+m_K=M-1,\ m_i'\ge 0}\rho_1^{m_1}\cdots\rho_i^{m_i'}\cdots\rho_K^{m_K}} = \frac{\rho_1^{n_1}\cdots\rho_i^{n_i-1}\cdots\rho_K^{n_K}}{G(M-1)}$
3-154
Mean Value Analysis I
Performance measures for closed networks with $M$ customers:
• $N_j(M)$: average number of customers in queue $j$
• $T_j(M)$: average time a customer spends (per visit) in queue $j$
• $\gamma_j(M)$: average throughput of queue $j$
Mean Value Analysis calculates $N_j(M)$ and $T_j(M)$ directly, without first computing $G(M)$ or deriving the stationary distribution of the network:
a) The queue length observed by an arriving customer is the same as the queue length in a closed network with one less customer (the arrival theorem)
b) Little's result is applicable throughout the network
1. Based on a):
$T_j(s) = \frac{1}{\mu_j}\big(1 + N_j(s-1)\big)$ for $j = 1, \ldots, K$, $s = 1, \ldots, M$
– with $T_j(0) = N_j(0) = 0$ for $j = 1, \ldots, K$
3-155
Mean Value Analysis II
2. Based on b), when there are $s$ customers in the network:
$E[N_j(s)] = \lambda_j(s)E[T_j(s)] = \lambda(s)\pi_j E[T_j(s)]$ (step 2-b)
and
$s = \sum_{j=1}^{K}E[N_j(s)] = \lambda(s)\sum_{j=1}^{K}\pi_j E[T_j(s)] \rightarrow \lambda(s) = \frac{s}{\sum_{j=1}^{K}\pi_j E[T_j(s)]}$ (step 2-a)
This iteration is carried out for $s = 1, 2, \ldots, M$, starting from the $s = 0$ boundary conditions
3-156
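A sketch of the MVA iteration for the two-queue CPU/storage example (visit ratios from the closed-network example with $p = 1/2$; the service rates are hypothetical; NumPy assumed):

```python
import numpy as np

pi = np.array([2 / 3, 1 / 3])   # visit ratios pi = pi P, for p = 0.5
mu = np.array([10.0, 5.0])      # hypothetical service rates
M = 5                           # circulating customers

N = np.zeros(len(mu))           # N_j(0) = 0
for s in range(1, M + 1):
    T = (1.0 + N) / mu          # arrival theorem: T_j(s) = (1 + N_j(s-1))/mu_j
    lam = s / np.dot(pi, T)     # Little's result over the whole network
    N = lam * pi * T            # N_j(s) = lambda(s) pi_j T_j(s)

print("N_j(M) =", N, "\nper-queue throughputs =", lam * pi)
```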
Where are we?
Elementary queueing models
– M/M/1, M/M/C, M/M/C/C/K, ... and bulk queues
– either product-form solutions or use PGF
Intermediate queueing models (product-form solution)
– Time-reversibility of Markov process
– Detailed balance equations of time-reversible MCs
– Multidimensional Birth-death processes
– Network of queues: open- and closed networks
Advanced queueing models
– M/G/1 type queue: Embedded MC and Mean-value analysis
– M/G/1 with vacations and Priority queues
– G/M/m queue
More advanced queueing models (omitted)
– Algorithmic approaches to get steady-state solutions
3-157
Residual life time* I
Hitchhiker's paradox: cars pass a point on a road according to a Poisson process with rate λ = 1/10 (cars/min), i.e., one car every 10 minutes on average. A hitchhiker arrives at the roadside point at a random instant of time.
[Figure: previous car, hitchhiker's arrival, and next car on the time axis]
What is his mean waiting time for the next car?
1. Since he arrives randomly within an interarrival interval, one might argue it is 5 min.
2. By the memoryless property of the exponential distribution, it is in fact another 10 min.
* L. Kleinrock, Queueing Systems, vol. 1: Theory
3-158
Residual life time II
The distribution of the interval that the hitchhiker lands in depends on both its length and $f_X(x)$ (longer intervals are more likely to be hit). Writing $\tilde X$ for the selected interval:
$f_{\tilde X}(x) = Cxf_X(x)$, with $C$ a proportionality constant
Since $\int_0^{\infty}f_{\tilde X}(x)dx = 1$, we have $C = 1/E[X] = 1/\bar X$:
$f_{\tilde X}(x) = \frac{xf_X(x)}{\bar X}$
Since $\Pr[R < y \mid \tilde X = x] = y/x$ for $0 \le y \le x$ (the arrival point is uniform within the interval), the joint pdf of $\tilde X$ and the residual time $R$ is
$\Pr[y < R < y+dy,\ x < \tilde X < x+dx] = \frac{dy}{x}\cdot\frac{xf_X(x)dx}{\bar X} = \frac{f_X(x)\,dy\,dx}{\bar X}$
Unconditioning over $\tilde X$:
$f_R(y)\,dy = \frac{dy}{\bar X}\int_y^{\infty}f_X(x)dx = \frac{1-F_X(y)}{\bar X}dy \Rightarrow f_R(y) = \frac{1-F_X(y)}{\bar X}$
3-159
Residual life time III
Taking the Laplace transform of the pdf of $R$ conditioned on $\tilde X = x$, for $0 \le R \le x$:
$E[e^{-sR} \mid \tilde X = x] = \int_0^x\frac{e^{-sy}}{x}dy = \frac{1-e^{-sx}}{sx}$
Unconditioning over $\tilde X$, we obtain $R^*(s)$ and its moments:
$R^*(s) = \frac{1-F_X^*(s)}{s\bar X} \Rightarrow E[R^n] = \frac{\overline{X^{n+1}}}{(n+1)\bar X}$
where $F_X^*(s) = \int_0^{\infty}e^{-sx}f_X(x)dx$.
Surprisingly, the distribution of the elapsed waiting time, $\tilde X - R$, is identical to that of the remaining waiting time $R$.
3-160
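A simulation sketch of the hitchhiker's paradox (NumPy assumed): random observers on a Poisson car stream wait, on average, a full mean interarrival time, not half of it:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 1 / 10.0                                   # cars per minute
arrivals = np.cumsum(rng.exponential(1 / lam, size=200_000))

t_obs = rng.uniform(0, arrivals[-1], size=100_000)   # random observers
next_idx = np.searchsorted(arrivals, t_obs)          # first car after t_obs
wait = arrivals[next_idx] - t_obs
print(f"mean wait = {wait.mean():.2f} min (theory: {1/lam:.0f} min)")
```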
M/G/1 queue: Embedded MC I
Recall that a continuous-time description of the M/G/1 queue is $(n, x)$:
• $n$: number of customers in the system
• $x$: attained or remaining service time of the customer in service
Because of $x$, $(n, x)$ is not a countable state space. How can we get rid of $x$? What if we observe the system at the end of each service?
$X_{n+1} = \max(X_n - 1, 0) + Y_{n+1}$
– $X_n$: number of customers in the system left behind by the $n$th departure
– $Y_n$: number of arrivals that occur during the service time of the $n$th departing customer
Question: is $X_n$ equal in distribution to the queue length seen by an arriving customer (the queue length just before an arrival)? Recall PASTA.
3-161
Distribution Upon Arrival or Departure
$\alpha(t), \beta(t)$: number of arrivals and departures (respectively) in $(0, t)$
$U_n(t)$: number of times the system goes from $n$ to $n+1$ in $(0, t)$; the number of arriving customers that find $n$ customers in the system
$V_n(t)$: number of times the system goes from $n+1$ to $n$; the number of departing customers that leave $n$ behind
The transition $n \to n+1$ cannot reoccur until the number in the system drops to $n$ once more (i.e., until the transition $n+1 \to n$ reoccurs), so $U_n(t)$ and $V_n(t)$ differ by at most one: $|U_n(t) - V_n(t)| \le 1$. Hence
$\lim_{t\to\infty}\frac{U_n(t)}{t} = \lim_{t\to\infty}\frac{V_n(t)}{t} \Rightarrow \lim_{t\to\infty}\frac{U_n(t)}{\alpha(t)}\frac{\alpha(t)}{t} = \lim_{t\to\infty}\frac{V_n(t)}{\beta(t)}\frac{\beta(t)}{t}$
3-162
M/G/1 queue: Embedded MC II
Defining the probability generating function of $X_{n+1}$, $Q_{n+1}(z) = E[z^{X_{n+1}}]$:
$E[z^{X_{n+1}}] = E[z^{\max(X_n-1,0)+Y_{n+1}}] = E[z^{\max(X_n-1,0)}]E[z^{Y_{n+1}}]$
Let $U_{n+1}(z) = E[z^{Y_{n+1}}]$; as $n \to \infty$, $U_{n+1}(z) = U(z)$ (independent of $n$). Then
$Q_{n+1}(z) = U(z)\sum_{k=0}^{\infty}z^k\Pr[\max(X_n-1,0) = k]$
$= U(z)\Big(z^0\Pr[X_n = 0] + \sum_{k=1}^{\infty}z^{k-1}\Pr[X_n = k]\Big)$
$= U(z)\Big(\Pr[X_n = 0] + z^{-1}(Q_n(z) - \Pr[X_n = 0])\Big)$
As $n \to \infty$, we have $Q_{n+1}(z) = Q_n(z) = Q(z)$ and $\Pr[X_n = 0] = q_0$, so
$Q(z) = \frac{U(z)(z-1)}{z - U(z)}q_0$
3-163
M/G/1 queue: Embedded MC III
We need to find $U(z)$ and $q_0$. Using $U(z \mid x_i = x) = e^{\lambda x(z-1)}$,
$U(z) = \int_0^{\infty}U(z \mid x_i = x)b(x)dx = B^*(\lambda(1-z))$
Since $Q(1) = 1$, we have $q_0 = 1 - U'(1) = 1 - \lambda\bar X = 1 - \rho$.
The transform version of the P-K formula is
$Q(z) = \frac{B^*(\lambda(1-z))(z-1)}{z - B^*(\lambda(1-z))}(1-\rho)$
Letting $\bar q = Q'(1)$, one gets $W = \bar q/\lambda - \bar X$.
Sojourn time distribution of an M/G/1 system with FIFO service: if a customer spends $T_j$ seconds in the system, the number of customers it leaves behind is the number of customers that arrive during these $T_j$ seconds, due to FIFO.
3-164
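A symbolic sanity check (assuming SymPy is available) that the transform P-K formula reduces to the familiar M/M/1 PGF $(1-\rho)/(1-\rho z)$ when $B^*(s) = \mu/(s+\mu)$:

```python
import sympy as sp

z, lam, mu = sp.symbols('z lambda mu', positive=True)
rho = lam / mu

Bstar = mu / (lam * (1 - z) + mu)              # B*(lambda(1-z)) for M/M/1
Q = Bstar * (z - 1) / (z - Bstar) * (1 - rho)  # transform P-K formula
print(sp.simplify(Q - (1 - rho) / (1 - rho * z)))   # prints 0
```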
M/G/1 Queue: Embedded MC IV
Let $f_T(t)$ be the probability density function of $T$, the total delay. Then
$Q(z) = \sum_{k=0}^{\infty}z^k\int_0^{\infty}\frac{(\lambda t)^k}{k!}e^{-\lambda t}f_T(t)dt = T^*(\lambda(1-z))$
where $T^*(s)$ is the Laplace transform of $f_T(t)$. We have
$T^*(\lambda(1-z)) = \frac{B^*(\lambda(1-z))(z-1)}{z - B^*(\lambda(1-z))}(1-\rho)$
Letting $s = \lambda(1-z)$, one gets
$T^*(s) = \frac{(1-\rho)sB^*(s)}{s - \lambda + \lambda B^*(s)} = W^*(s)B^*(s) \Rightarrow W^*(s) = \frac{(1-\rho)s}{s - \lambda + \lambda B^*(s)}$
In an M/M/1 system, we have $B^*(s) = \mu/(s+\mu)$:
$W^*(s) = (1-\rho)\Big(1 + \frac{\lambda}{s + \mu - \lambda}\Big)$
3-165
Delay analysis of an ARQ system
Consider a Go-Back-N ARQ system, where a packet is successfully transmitted with probability $1-p$
• Packet arrivals at the transmitter's queue follow a Poisson process with mean $\lambda$ (packets/slot)
[Figure: effective service times of packets in the ARQ system (BG, Fig. 3.17); after each transmission error, a full window of $W$ slots elapses before the next attempt]
• We need the first two moments of the effective service time to use the P-K formula. With $k$ failed attempts the service takes $1 + kW$ slots, so using $\sum_{k\ge 0}p^k = \frac{1}{1-p}$ and $\sum_{k\ge 0}kp^k = \frac{p}{(1-p)^2}$:
$\bar X = \sum_{k=0}^{\infty}(1 + kW)(1-p)p^k = 1 + \frac{Wp}{1-p}$
$\overline{X^2} = \sum_{k=0}^{\infty}(1 + kW)^2(1-p)p^k = 1 + \frac{2Wp}{1-p} + \frac{W^2(p+p^2)}{(1-p)^2}$
3-166
M/G/1 queue: Mean value analysis I
For queueing systems with a general and independent service time distribution, G, a continuous-time description is $(n, x)$:
• $n$: number of customers in the system
• $x$: attained or remaining service time of the customer in service
$W_i = R_i + \sum_{j=i-N_i}^{i-1}X_j$
where $W_i$, $R_i$ and $X_j$ are the waiting time in queue of customer $i$, the residual service time seen by customer $i$, and the service time of customer $j$.
Taking expectations and using the independence among the $X_j$:
$E[W_i] = W = E[R_i] + E\Big[\sum_{j=i-N_i}^{i-1}E[X_j \mid N_i]\Big] = \bar R + \frac{1}{\mu}\bar N_q$
Since $\bar N_q = \lambda W$ and $\bar R_i = \bar R$ for all $i$, we have $W = \bar R/(1-\rho)$.
3-167
M/G/1 queue: Mean value analysis II
The time-averaged residual service time $r(\tau)$ over the interval $[0, t]$ is
$\bar R(t) = \frac{1}{t}\int_0^t r(\tau)d\tau = \frac{1}{t}\sum_{i=1}^{M(t)}\frac{1}{2}X_i^2 = \frac{1}{2}\frac{M(t)}{t}\frac{\sum_{i=1}^{M(t)}X_i^2}{M(t)}$
– $M(t)$ is the number of service completions within $[0, t]$
[Figure: sawtooth plot of the residual service time; upon a new service of duration $X_i$, $r(\tau)$ jumps to $X_i$ and decays linearly for $X_i$ time units]
As $t \to \infty$, $\bar R(\infty) = \bar R = \lambda\overline{X^2}/2$.
Alternatively, from the hitchhiker's paradox, $E[R'] = E[X^2]/(2E[X])$:
$\bar R = 0\cdot\Pr[N(t) = 0] + E[R']\Pr[N(t) > 0] = \frac{E[X^2]}{2E[X]}\,\lambda E[X] = \frac{\lambda\overline{X^2}}{2}$
3-168
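Combining $W = \bar R/(1-\rho)$ with $\bar R = \lambda\overline{X^2}/2$ gives the P-K mean wait. A sketch applying it to the ARQ service-time moments above, with hypothetical $\lambda$, $p$, and window size:

```python
lam, p, Wnd = 0.02, 0.1, 8          # hypothetical arrivals/slot, error
                                    # probability, go-back window (slots)

X1 = 1 + Wnd * p / (1 - p)                        # E[X]
X2 = (1 + 2 * Wnd * p / (1 - p)
      + Wnd**2 * (p + p**2) / (1 - p)**2)         # E[X^2]
rho = lam * X1
assert rho < 1

W = lam * X2 / (2 * (1 - rho))      # P-K mean waiting time (slots)
print(f"rho = {rho:.3f}, mean wait W = {W:.2f} slots")
```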
M/G/1 queue: Mean value analysis III
Pollaczek-Khinchin (P-K) formula for the mean waiting time in queue:
$W = \frac{\lambda\overline{X^2}}{2(1-\rho)} = \frac{\lambda(\sigma_X^2 + \bar X^2)}{2(1-\rho)} = \frac{1+C_X^2}{2}\cdot\frac{\rho}{1-\rho}\bar X = \frac{1+C_X^2}{2}W_{M/M/1}$
(using $\overline{X^2} = \sigma_X^2 + \bar X^2$)
– $C_X^2 = \sigma_X^2/E[X]^2$ is the squared coefficient of variation of the service time
E.g., since $C_X = 1$ in an M/M/1 and $C_X = 0$ in an M/D/1:
$W_{M/M/1} = \frac{\rho}{1-\rho}\bar X > W_{M/D/1} = \frac{\rho}{2(1-\rho)}\bar X$
3-169
M/G/1 Queue with vacations I
The server takes a vacation at the end of each busy period
• It takes an additional vacation if no customers are found at the end of each vacation: $V_1, V_2, \ldots$ are the durations of the successive vacations
• A customer that finds the system idle (on vacation) waits for the end of the vacation period to get service
[Figure: an M/G/1 system with vacations (BG, Figs. 3.12-3.13); busy periods alternate with vacation periods, and the residual-time plot contains both service and vacation triangles]
• Residual service time including vacation periods:
$\frac{1}{t}\int_0^t r(\tau)d\tau = \frac{1}{t}\sum_{i=1}^{M(t)}\frac{1}{2}X_i^2 + \frac{1}{t}\sum_{i=1}^{L(t)}\frac{1}{2}V_i^2$
– $M(t)$: number of services completed by time $t$
– $L(t)$: number of vacations completed by time $t$
3-170
M/G/1 Queue with vacations II
• The residual service time including vacation periods can be rewritten as
$\underbrace{\frac{1}{t}\int_0^t r(\tau)d\tau}_{\to\bar R} = \underbrace{\frac{M(t)}{t}}_{\to\lambda}\cdot\frac{\sum_{i=1}^{M(t)}\frac{1}{2}X_i^2}{M(t)} + \underbrace{\frac{L(t)}{t}}_{\to(1-\rho)/\bar V}\cdot\frac{\sum_{i=1}^{L(t)}\frac{1}{2}V_i^2}{L(t)} = \frac{\lambda\overline{X^2}}{2} + \frac{(1-\rho)\overline{V^2}}{2\bar V} = \bar R$
• Using $W = \bar R/(1-\rho)$, we have
$W = \frac{\lambda\overline{X^2}}{2(1-\rho)} + \frac{\overline{V^2}}{2\bar V}$
– the sum of the waiting time in an M/G/1 queue and the mean residual vacation time
3-171
M/G/1 Queue with vacations: Embedded Markov chain approach
Observe the queue either at the end of a vacation or of a service period
• $\alpha_k$: probability of $k$ customers found at the end of a vacation period:
$\alpha_k = \int_0^{\infty}\frac{(\lambda x)^k}{k!}e^{-\lambda x}v(x)dx$
– $v(x)$ is the pdf of the length of a vacation period
• $a_k$: probability of $k$ customers arriving during a service time:
$a_k = \int_0^{\infty}\frac{(\lambda x)^k}{k!}e^{-\lambda x}b(x)dx$
• Combining the above, we have the GBEs:
$\pi_k = \pi_0\sum_{m=1}^{k+1}\alpha_m a_{k-m+1} + \sum_{j=1}^{k+1}\pi_j a_{k-j+1}$
3-172
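A sketch of the vacation formula with a hypothetical deterministic service time and uniform vacation lengths:

```python
lam = 0.5                     # hypothetical arrival rate
EX, EX2 = 1.0, 1.0            # deterministic service time X = 1
EV, EV2 = 1.0, 4.0 / 3.0      # V ~ Uniform(0, 2): E[V] = 1, E[V^2] = 4/3
rho = lam * EX
assert rho < 1

W = lam * EX2 / (2 * (1 - rho)) + EV2 / (2 * EV)
print(f"W = {W:.3f}  (M/D/1 part = {lam*EX2/(2*(1-rho)):.3f}, "
      f"residual vacation part = {EV2/(2*EV):.3f})")
```

The two printed terms separate the ordinary M/G/1 queueing delay from the mean residual vacation time, mirroring the decomposition above.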