CS457 - System Performance Evaluation - Winter 2010

Public Service Announcements

• class on assignment 3
• possible tutorial

Lecture 26 - Statistical Mechanics

Two State System

• S0 <--> S1
• Once every N seconds, flip a coin
• Heads: move to the other state; Tails: stay in the same state
• Start in state 0
• Get out my trusty table of random numbers
• Run 35 sequential trials
• Create 50 sequences
• For P(Heads) = 0.50, 0.25, 0.10
• What do I get?
• 5250 random 0s and 1s
• Separated into 150 sequences of 35
• Sequences are in 3 groups of 50
• I see some regularities
• Runs get longer as p decreases: makes sense.
• Averages start low and increase, ending up around 0.50:

    time      0    1    2    3    4    5    6    7    8    9    avg beyond 10
    p=0.50    0  0.43 0.36 0.38 0.43 0.36 0.49 0.47 0.45 0.53       0.49
    p=0.25    0  0.34 0.53 0.55 0.55 0.62 0.55 0.49 0.51 0.53       0.49
    p=0.10    0  0.09 0.19 0.32 0.34 0.42 0.45 0.47 0.51 0.47       0.53
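The coin-flip experiment is easy to replicate in code instead of with a table of random numbers. A sketch (my own code, with an arbitrary seed, not the instructor's random-number table):

```python
import random

def simulate(p_heads, n_steps=35, n_seqs=50, seed=1):
    """Run n_seqs independent two-state chains for n_steps coin flips each.

    Heads (probability p_heads) moves to the other state; tails stays put.
    Every chain starts in state 0.  Returns the average state at each step.
    """
    rng = random.Random(seed)
    states = [0] * n_seqs
    averages = []
    for _ in range(n_steps):
        states = [s ^ 1 if rng.random() < p_heads else s for s in states]
        averages.append(sum(states) / n_seqs)
    return averages

for p in (0.50, 0.25, 0.10):
    print(f"p={p:4.2f}: step averages end near {simulate(p)[-1]:.2f}")
```

Runs get longer as p decreases, and the per-step averages drift toward 0.5, just as in the table.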

A Birth/Death Process

• [Transition diagrams: the two-state chain, with birth rate \lambda_0 = p taking state 0 to state 1 and death rate \mu_1 = p taking state 1 back to state 0; then the same chain drawn with its neighbouring states and per-state rates \lambda_j, \mu_j]
• p_j(t + dt) = p_j(t) + ( \lambda_(j-1) p_(j-1)(t) + \mu_(j+1) p_(j+1)(t) - (\lambda_j + \mu_j) p_j(t) ) dt
• Steady state: \lambda_(j-1) p_(j-1) + \mu_(j+1) p_(j+1) = (\lambda_j + \mu_j) p_j
• j = 0: \mu_1 p_1 = \lambda_0 p_0
• j = 1: \lambda_0 p_0 = \mu_1 p_1
• p_1 = (\lambda_0 / \mu_1) p_0 = p_0, since \lambda_0 = \mu_1 = p in the coin example
• p_0 + p_1 = 1, therefore p_0 = p_1 = 1/2
• Exercise for the reader: What do you do to move the probabilities away from 1/2?
• Exercise for the reader: Replicate the calculation for 3 and 4 states.
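The first exercise comes down to making the rates unequal: the balance equation \mu_1 p_1 = \lambda_0 p_0 together with p_0 + p_1 = 1 gives p_0 = \mu_1 / (\lambda_0 + \mu_1). A quick sketch (the rate values are illustrative):

```python
def two_state_steady(lam0, mu1):
    """Steady state of the two-state chain: mu_1 p_1 = lam_0 p_0, p_0 + p_1 = 1."""
    p0 = mu1 / (lam0 + mu1)
    return p0, 1.0 - p0

print(two_state_steady(1.0, 1.0))  # equal rates: (0.5, 0.5)
print(two_state_steady(1.0, 3.0))  # faster deaths: more time in state 0
```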

Analytic Queueing Theory

New Concepts

Stochastic process

The example above is a stochastic process. Why?

Markov process

The example above is a Markov process. Why?

Birth-death processes

The example above is a Birth-death process. Why?

A special process in which only transitions to neighbouring states are possible. That is, if we are in S(j) then the next state can be

1. S(j-1), a death occurs.

Death rate \mu_j.

2. S(j+1), a birth occurs.

Birth rate \lambda_j.

3. S(j), neither a birth nor a death occurs, or both a birth and death occur.

We now want to examine what happens, in a birth-death process, in the short time between t and t + \Delta t. Use the Poisson distribution.

1. P(exactly one birth | n(t) = j ) = (\lambda_j \Delta t) * exp(-\lambda_j \Delta t) = \lambda_j \Delta t + terms that are quadratic or higher order in \Delta t
• Write "terms that are quadratic or higher order in \Delta t" as o( (\Delta t)^2 )
• lim_(x->0) o( x^2 ) / x = 0
2. P(exactly one death | n(t) = j ) = \mu_j \Delta t + o( (\Delta t)^2 )
3. P(exactly zero births | n(t) = j ) = exp(-\lambda_j \Delta t) = 1 - \lambda_j \Delta t + o( (\Delta t)^2 )
4. P(exactly zero deaths | n(t) = j ) = 1 - \mu_j \Delta t + o( (\Delta t)^2 )
5. P(more than one birth and/or death) = o( (\Delta t)^2 )
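The first-order approximations above can be checked numerically: the error of \lambda_j \Delta t against the exact Poisson probability shrinks faster than \Delta t. A sketch with an illustrative rate:

```python
import math

lam = 2.0  # illustrative birth rate

def one_birth_exact(dt):
    """Exact Poisson probability of exactly one birth in an interval dt."""
    return (lam * dt) * math.exp(-lam * dt)

# error/dt -> 0 as dt -> 0, i.e. the discarded terms really are o(dt).
for dt in (0.1, 0.01, 0.001):
    err = lam * dt - one_birth_exact(dt)
    print(f"dt={dt:5}: error/dt = {err / dt:.6f}")
```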

Exercise for the reader. Explain how it is that this is equivalent to only Arrival and Departure events affecting the system state.

What happens between t and t + \Delta t? Four possibilities:

1. Exactly one birth
• j-1 -> j
• P(n(t+\Delta t) = j AND n(t) = j-1) = P(n(t+\Delta t) = j | n(t) = j-1)P(n(t) = j-1) = (\lambda_(j-1) \Delta t + o(\Delta t))P(n(t) = j-1)
• Abbreviate the notation: P(n(t) = j) = p_j(t). (This is the notation used in the text.)
• P(n(t+\Delta t) = j AND n(t) = j-1) = (\lambda_(j-1) \Delta t) p_(j-1)(t)
2. Exactly one death
• j+1 -> j
• P(n(t+\Delta t) = j AND n(t) = j+1) = (\mu_(j+1) \Delta t) p_(j+1)(t)
3. Neither a birth nor a death
• j -> j
• P(n(t+\Delta t) = j AND n(t) = j) = (1 - (\lambda_j + \mu_j) \Delta t) p_j(t)
4. Something else
• Only o(\Delta t) is left over. Why? Add up the probabilities of cases 1, 2, and 3.

The book's notation: P(n(t) = j) = p_j(t)

Then

• p_j(t+dt) = p_j(t) + ( \lambda_(j-1) p_(j-1)(t) + \mu_(j+1) p_(j+1)(t) - (\lambda_j + \mu_j) p_j(t) ) dt + o(dt)
• ( p_j(t+dt) - p_j(t) ) / dt = \lambda_(j-1) p_(j-1)(t) + \mu_(j+1) p_(j+1)(t) - (\lambda_j + \mu_j) p_j(t) + o(dt)/dt
• In the limit dt -> 0: dp_j(t)/dt = \lambda_(j-1) p_(j-1)(t) + \mu_(j+1) p_(j+1)(t) - (\lambda_j + \mu_j) p_j(t)

This equation looks as though you could solve it!

Definition of steady state: dp_j(t) / dt = 0.

Then

• \lambda_(j-1) p_(j-1) + \mu_(j+1) p_(j+1) - (\lambda_j + \mu_j) p_j = 0

Solve this iteratively

1. \lambda_0 p_0 = \mu_1 p_1 => p_1 = (\lambda_0 / \mu_1) p_0
2. (\lambda_1 + \mu_1) p_1 = \lambda_0 p_0 + \mu_2 p_2 => (\lambda_1 + \mu_1) (\lambda_0 / \mu_1) p_0 = \lambda_0 p_0 + \mu_2 p_2 => p_2 = (\lambda_1 \lambda_0 / (\mu_2 \mu_1)) p_0
3. ...
4. p_n = (\lambda_(n-1) * ... * \lambda_1 \lambda_0 / (\mu_n * ... * \mu_2 \mu_1)) p_0
5. Set p0 by the condition sum_n pn = 1.
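The product-form solution in step 4 is easy to evaluate for a finite chain. A sketch (the rates and chain length are illustrative assumptions):

```python
def birth_death_steady(lams, mus):
    """Steady-state probabilities of a finite birth-death chain.

    lams[i] is the birth rate out of state i; mus[i] is the death rate out
    of state i+1.  Applies p_n = p_0 * prod_{i<n} lams[i]/mus[i], then
    normalises so the probabilities sum to 1.
    """
    ratios = [1.0]
    for lam, mu in zip(lams, mus):
        ratios.append(ratios[-1] * lam / mu)
    total = sum(ratios)
    return [r / total for r in ratios]

# Five states with constant rates lambda = 1, mu = 2:
print([round(p, 4) for p in birth_death_steady([1.0] * 4, [2.0] * 4)])
```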

This looks like you could solve it further, but you can't. How could you possibly solve the differential equation above, in that case?
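For a small truncated chain you can, in fact, integrate the differential equation numerically and watch it relax toward the steady state. A forward-Euler sketch (the rates, truncation level, and step size are illustrative assumptions):

```python
# Truncate the chain at N states; constant rates lam, mu (illustrative).
N, lam, mu, dt = 5, 1.0, 2.0, 0.001
p = [1.0] + [0.0] * (N - 1)  # all probability in state 0 at t = 0
for _ in range(50000):       # integrate out to t = 50
    dp = []
    for j in range(N):
        births = lam * p[j - 1] if j > 0 else 0.0      # flow in from j-1
        deaths = mu * p[j + 1] if j < N - 1 else 0.0   # flow in from j+1
        out = (lam if j < N - 1 else 0.0) + (mu if j > 0 else 0.0)
        dp.append(births + deaths - out * p[j])
    p = [pj + dpj * dt for pj, dpj in zip(p, dp)]
print([round(pj, 4) for pj in p])  # approaches the steady-state solution
```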

What do we do? Try simplified examples.

The Simplest Example M/M/1

First M: Markovian (Exponential) birth (interarrival) times

Second M: Markovian (Exponential) life/death (service) times

1: one server

Assumptions:

1. \lambda_j = \lambda
2. \mu_j = \mu
3. Define r = \lambda / \mu

Then

1. pn = r^n p0
2. p0 = 1 / (1 + r + r^2 + ...) = 1 - r (geometric series; requires r < 1).
3. pn = (1 - r) * r^n.

The mean number of jobs in the system is E(n) = sum_n n * (1 - r) * r^n = r / (1 - r)

• Goes to infinity as r -> 1. Why?

Little's law

• Mean response time: E(r) = (1/\lambda) * E(n) = 1 / (\mu - \lambda)
• Goes to infinity as \lambda -> \mu from below. Why?
• What happens when \lambda > \mu?
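The M/M/1 formulas above fit in a few lines, and make the blow-up as \lambda approaches \mu easy to see. A sketch (the rate values are illustrative):

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 metrics; requires lam < mu for stability."""
    if lam >= mu:
        raise ValueError("unstable: need lambda < mu")
    r = lam / mu               # utilisation, rho
    e_n = r / (1.0 - r)        # mean number in system
    e_r = 1.0 / (mu - lam)     # mean response time, by Little's law
    return r, e_n, e_r

for lam in (0.5, 0.9, 0.99):
    rho, e_n, e_r = mm1_metrics(lam, 1.0)
    print(f"lambda={lam:4}: rho={rho:.2f}  E(n)={e_n:6.2f}  E(r)={e_r:7.2f}")
```

When \lambda >= \mu no steady state exists, which is why the sketch raises an error rather than returning a number.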