CS457 - System Performance Evaluation - Winter 2010
Public Service Announcements
- The weeks ahead
  - class on assignment 3
  - possible tutorial
- Issues on which exam questions could be based
  - anything in the notes with a question mark
  - exercises for the reader
Lecture 25 - Queuing Theory
Analytic Queueing Theory
Concepts We Know
Parameters
With some change in notation.
- \lambda - arrival rate
  1/\lambda - mean interarrival time
- 1/\mu - mean service time
  \mu - service rate
Carefully distinguish between
- Throughput - X - metric - jobs done per second
- Service rate - \mu - parameter - jobs that can be done per second when
the server is processing
Stable system
Utilization
Little's Law
X E(r) = E(n), where r is the response time and n is the number of jobs in
the system
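A quick way to make Little's law concrete is to simulate a single FIFO server
and compute X, E(r), and E(n) separately. The sketch below is illustrative
only; the rates 0.7 and 1.0 and the job count are arbitrary example values.

    import random

    random.seed(1)
    lam, mu, N = 0.7, 1.0, 200_000      # example rates (jobs/sec) and job count

    # FIFO single server: departure_i = max(arrival_i, departure_{i-1}) + service_i
    arrivals, departures = [], []
    t, prev_dep = 0.0, 0.0
    for _ in range(N):
        t += random.expovariate(lam)            # next arrival time
        dep = max(t, prev_dep) + random.expovariate(mu)
        arrivals.append(t)
        departures.append(dep)
        prev_dep = dep

    T = departures[-1]                           # observation period
    X = N / T                                    # throughput (jobs per second)
    Er = sum(d - a for a, d in zip(arrivals, departures)) / N   # mean response time

    # Compute E(n) independently: time-average number of jobs in the system
    events = sorted([(a, +1) for a in arrivals] + [(d, -1) for d in departures])
    area, n, last = 0.0, 0, 0.0
    for time, delta in events:
        area += n * (time - last)
        n += delta
        last = time
    En = area / T

    print(f"X * E(r) = {X * Er:.3f}")
    print(f"E(n)     = {En:.3f}")                # Little's law says these agree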
New Concepts
Stochastic process
A sequence of random variables indexed by time: S(0), S(1), S(2), ...
- E.g. state of a system at time t. Why?
Markov process
A stochastic process in which S(n+1) is independent of S(0), ..., S(n-1),
but may depend on S(n).
- We could say something like, "The state of the system incorporates the
past completely."
- Discrete event simulation implicitly assumes that the system is
Markovian.
In performance evaluation, when we talk of a Markov process we usually
also mean that the next transition occurs at a time distributed by an
exponential distribution. Why? (You should already know this.)
- In a Markov process with exponentially distributed transitions the mean
transition rate is the only parameter that can vary with state.
- That is, if in state S(j) then the transition rate is \lambda_j
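The reason exponential transition times fit the Markov assumption is
memorylessness: for an exponential random variable, P(X > s + t | X > s) =
P(X > t), so the time remaining until the next transition does not depend on
how long we have already been in the current state. A minimal numerical check
of this property (the rate 1.5 and the offsets 0.8 and 1.2 are arbitrary
example values):

    import random

    random.seed(2)
    lam, s, t = 1.5, 0.8, 1.2            # example rate and offsets (arbitrary)
    samples = [random.expovariate(lam) for _ in range(1_000_000)]

    # Memorylessness: P(X > s + t | X > s) should equal P(X > t)
    past_s = [x for x in samples if x > s]
    lhs = sum(x > s + t for x in past_s) / len(past_s)
    rhs = sum(x > t for x in samples) / len(samples)
    print(f"P(X > s+t | X > s) = {lhs:.4f}")
    print(f"P(X > t)           = {rhs:.4f}")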
Another useful property of the exponential distribution
- Y1 - time until the next arrival - exponential with rate \lambda
- Y2 - time until the current service ends - exponential with rate \mu
- Ym - time until the first of these (an arrival or an end of service)
  occurs - Ym = min(Y1, Y2)
- P(Ym > y) = P(Y1 > y AND Y2 > y) = exp(-\lambda y) * exp(-\mu y) =
  exp( -(\lambda + \mu) * y )
- Ym is exponentially distributed with rate \lambda + \mu, i.e. mean
  1 / (\lambda + \mu)
- Why does this matter?
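One reason it matters: when the server is busy and an arrival is pending, the
time until the next state change (an arrival or a service completion) is
again exponential, with rate \lambda + \mu, which is exactly what the
birth-death analysis below uses. A quick simulation sketch of the claim
above; the rates 2.0 and 3.0 are arbitrary example values.

    import math
    import random

    random.seed(3)
    lam, mu, n = 2.0, 3.0, 1_000_000     # example rates (arbitrary)

    ym = [min(random.expovariate(lam), random.expovariate(mu)) for _ in range(n)]

    print(f"sample mean of Ym = {sum(ym) / n:.4f}")
    print(f"1 / (lambda + mu) = {1 / (lam + mu):.4f}")

    # Check the distribution, not just the mean: P(Ym > y) vs exp(-(lam + mu) y)
    y = 0.3
    print(f"P(Ym > {y}) = {sum(v > y for v in ym) / n:.4f}")
    print(f"exp(-(lam + mu) * {y}) = {math.exp(-(lam + mu) * y):.4f}")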
Birth-death processes
A special process in which only transitions to neighbouring states are
possible. That is, if we are in S(j) then the next state can be
- S(j-1), a death occurs.
Death rate \mu_j.
- S(j+1), a birth occurs
Birth rate \lambda_j.
- S(j), neither a birth nor a death occurs, or both a birth and death
occur.
We now want to examine what happens, in a birth-death process, in the
short time between t and t + \Delta t. Use the Poisson distribution.
- P(exactly one birth | n(t) = j ) = (\lambda_j \Delta t) *
exp(-\lambda_j \Delta t) = \lambda_j \Delta t + terms that are quadratic
or higher order in \Delta t
- Write `terms that are quadratic or higher order in \Delta t' as
  o( \Delta t )
  - lim_x->0 o( x ) / x = 0
- P(exactly one death | n(t) = j ) = \mu_j \Delta t + o( \Delta t )
- P(exactly zero births | n(t) = j ) = exp(-\lambda_j \Delta t) = 1 -
  \lambda_j \Delta t + o( \Delta t )
- P(exactly zero deaths | n(t) = j ) = 1 - \mu_j \Delta t + o( \Delta t )
- P(more than one birth and/or death) = o( \Delta t )
Exercise for the reader. Explain how it is that this is equivalent to only
Arrival and Departure events affecting the system state.
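A small numerical check of these expansions; it shows that P(exactly one
birth) behaves like \lambda_j \Delta t while everything else vanishes faster
than \Delta t. The birth rate 4.0 is an arbitrary example value.

    import math

    lam = 4.0                                    # example birth rate (arbitrary)
    for dt in (0.1, 0.01, 0.001):
        p_zero = math.exp(-lam * dt)             # P(no birth in dt)
        p_one = lam * dt * math.exp(-lam * dt)   # P(exactly one birth in dt)
        p_many = 1.0 - p_zero - p_one            # P(two or more births in dt)
        print(f"dt={dt:6.3f}  p_one/(lam*dt)={p_one / (lam * dt):.4f}  "
              f"p_many/dt={p_many / dt:.6f}")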
What happens between t and t + \Delta t? Four possibilities:
- Exactly one birth
- j-1 -> j
- P(n(t+\Delta t) = j AND n(t) = j-1) = P(n(t+\Delta t) = j | n(t) =
j-1)P(n(t) = j-1) = (\lambda_(j-1) \Delta t + o(\Delta t))P(n(t) =
j-1)
- Abbreviate the notation P(n(t) = j) = p_j(t). (This is the notation
  used in the text.)
- P(n(t+\Delta t) = j AND n(t) = j-1) = (\lambda_(j-1) \Delta t)
  p_(j-1)(t)
- Exactly one death
- j+1 -> j
- P(n(t+\Delta t) = j AND n(t) = j+1) = (\mu_(j+1) \Delta t)
p_(j+1)(t)
- Neither a birth nor a death
- j -> j
- P(n(t+\Delta t) = j AND n(t) = j) = (1 - (\lambda_j + \mu_j) \Delta t)
  p_j(t)
- Something else
- Nothing, to first order in \Delta t. Why? Add P(1) + P(2) + P(3)
The book's notation: P(n(t) = j) = p_j(t)
Then
- p_j(t+dt) = p_j(t) + ( \lambda_(j-1) p_(j-1)(t) + \mu_(j+1) p_(j+1)(t) -
  (\lambda_j + \mu_j) p_j(t) ) dt + o(dt)
- ( p_j(t+dt) - p_j(t) ) / dt = \lambda_(j-1) p_(j-1)(t) + \mu_(j+1) p_(j+1)(t)
  - (\lambda_j + \mu_j) p_j(t) + o(dt) / dt
- In the limit dt -> 0: dp_j(t) / dt = \lambda_(j-1) p_(j-1)(t) + \mu_(j+1)
  p_(j+1)(t) - (\lambda_j + \mu_j) p_j(t)
This equation looks as though you could solve it!
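It can, at least numerically, once you truncate the state space and fix
specific rates. A minimal forward-Euler sketch; the constant rates 0.8 and
1.0, the truncation level K = 30, the step size, and the time horizon are all
arbitrary choices for illustration.

    # Forward-Euler integration of dp_j/dt for a truncated birth-death chain.
    # Assumed for illustration: constant rates, states 0..K, system starts empty.
    K, lam, mu = 30, 0.8, 1.0
    dt, T = 0.001, 50.0

    p = [1.0] + [0.0] * K                        # p_j(0): all mass in state 0

    def birth(j):                                # lambda_j; no births out of state K
        return lam if 0 <= j < K else 0.0

    def death(j):                                # mu_j; no deaths out of state 0
        return mu if 0 < j <= K else 0.0

    for _ in range(int(T / dt)):
        new = [0.0] * (K + 1)
        for j in range(K + 1):
            inflow = (birth(j - 1) * p[j - 1] if j > 0 else 0.0) \
                   + (death(j + 1) * p[j + 1] if j < K else 0.0)
            outflow = (birth(j) + death(j)) * p[j]
            new[j] = p[j] + dt * (inflow - outflow)
        p = new

    # By T = 50 the transient has largely died out; compare with 1 - rho for M/M/1.
    print(f"p_0(T) = {p[0]:.4f}   (steady-state prediction 1 - rho = {1 - lam / mu:.4f})")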
Steady State Solutions
Reminder about transient versus steady state.
Definition of steady state: dp_j(t) / dt = 0.
Then
- \lambda_(j-1) p_(j-1) + \mu_(j+1) p_(j+1) - (\lambda_j + \mu_j) p_j = 0
Solve this iteratively
- \lambda_0 p_0 = \mu_1 p_1 => p_1 = (\lambda_0 / \mu_1) p_0
- (\lambda_1 + \mu_1) p_1 = \lambda_0 p_0 + \mu_2 p_2
  => (\lambda_1 + \mu_1) (\lambda_0 / \mu_1) p_0 = \lambda_0 p_0 + \mu_2 p_2
  => p_2 = p_0 ( \lambda_1 \lambda_0 / \mu_2 \mu_1 )
- ...
- p_n = p_0 ( \lambda_(n-1) ... \lambda_1 \lambda_0 / \mu_n ... \mu_2 \mu_1 )
- Set p_0 by the condition sum_n p_n = 1.
This looks as though you could simplify it further, but in general you
cannot. How, then, could you possibly solve the differential equation above?
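For specific rate sequences, though, the product form above is easy to
evaluate numerically. A sketch, assuming for illustration a chain truncated
at K = 50 states with constant rates (the M/M/1 case of the next section):

    # Steady-state probabilities of a finite birth-death chain via the product form
    #   p_n = p_0 * (lambda_0 ... lambda_{n-1}) / (mu_1 ... mu_n),
    # with p_0 fixed by the normalization sum_n p_n = 1.
    K = 50
    lam = [0.8] * K          # lam[j] = lambda_j, birth rate in state j (j = 0..K-1)
    mu = [1.0] * K           # mu[j]  = mu_{j+1}, death rate in state j+1

    p = [1.0]                # unnormalized, with p_0 = 1 for now
    for n in range(K):
        p.append(p[-1] * lam[n] / mu[n])

    total = sum(p)
    p = [x / total for x in p]

    En = sum(n * pn for n, pn in enumerate(p))
    print(f"p_0 = {p[0]:.4f}   E(n) = {En:.4f}")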
What do we do? Try simplified examples.
The Simplest Example: M/M/1
First M: Markovian (Exponential) birth (interarrival) times
Second M: Markovian (Exponential) life/death (service) times
1: one server
Assumptions:
- \lambda_j = \lambda
- \mu_j = \mu
- Define \rho = \lambda / \mu. (Careful: this is the utilization, not the
  response time r.)
Then
- p_n = \rho^n p_0
- p_0 = 1 / (1 + \rho + \rho^2 + ...) = 1 - \rho.
- p_n = (1 - \rho) * \rho^n.
The mean number of jobs in the system is E(n) = sum_n n * (1 - \rho) * \rho^n =
\rho / (1 - \rho)
- Goes to infinity as \rho -> 1. Why?
Little's law
- Mean response time: E(r) = (1/\lambda) * E(n) = 1 / (\mu - \lambda)
- Goes to infinity as \lambda -> \mu from below. Why?
- What happens when \lambda > \mu?
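A tiny tabulation of these two formulas makes the blow-up near saturation
concrete (mu = 1.0 and the lambda values below are arbitrary example
choices):

    # M/M/1 closed-form results, holding mu fixed and letting lambda approach mu.
    mu = 1.0
    print(" rho     E(n)      E(r)")
    for lam in (0.5, 0.8, 0.9, 0.99):
        rho = lam / mu
        En = rho / (1 - rho)         # mean number in system
        Er = 1 / (mu - lam)          # mean response time, via Little's law
        print(f"{rho:5.2f}  {En:7.2f}  {Er:8.2f}")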