CS457 - System Performance Evaluation - Winter 2008


Lecture 3 - Performance Metrics

Topics in Performance Evaluation

  1. Abstract: based on objectives
  2. Design: what to measure

    Design also applies to simulation and to modelling.

  3. Measure: good measurements produce robust data
  4. Analyse: data requires objective analysis to produce results
  5. Simulate: results require surface explanations
  6. Model: surface explanations sometimes demand deep explanations

Creating an Abstraction

Section 2.2: 1-5 & 7

Two example systems

  1. RPC versus Remote Pipes
  2. E-commerce server
Each stage applied to the two example systems:

  1. Define Goals, System
    • goals first
    • minimize system size

    RPC: write a paper comparing RPC to Remote Pipes; the system should
    include overhead but should not include computation.

    E-commerce: ensure that the system architecture is scalable, and find
    the right size for an expected workload.

  2. List services
    • possible outcomes for each service
    • including failures

    RPC: data transfer (why?), split into large transfers and small
    transfers; failures omitted (why?).

    E-commerce: catalogue updates, client requests, etc.

  3. List metrics
    • speed
    • accuracy
    • availability
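
To make stages 2 and 3 concrete for the RPC example, here is a minimal measurement sketch, not code from the course: transfer() is a hypothetical stand-in for the RPC or remote-pipe primitive, and the sketch times small and large transfers separately while counting failures, so that a speed and an accuracy figure can be reported for each kind of transfer.

    import time

    def transfer(payload: bytes) -> bytes:
        # Hypothetical stand-in for the data-transfer service; a real
        # experiment would invoke the RPC or remote-pipe primitive here.
        return bytes(payload)  # copy only: overhead, no application computation

    def measure(payload: bytes, repetitions: int = 100):
        times, failures = [], 0
        for _ in range(repetitions):
            start = time.perf_counter()
            try:
                transfer(payload)
            except OSError:        # a failed transfer is a distinct outcome
                failures += 1
                continue
            times.append(time.perf_counter() - start)
        return times, failures

    for label, size in [("small", 64), ("large", 1_000_000)]:
        times, failures = measure(bytes(size))
        mean_us = 1e6 * sum(times) / len(times)
        print(f"{label} transfers: mean {mean_us:.1f} us over {len(times)} runs, "
              f"{failures} failures")

Keeping the transfer stub free of application computation mirrors the stated goal: the system under study includes the overhead but not the computation.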

Metrics

Sequence of events for a single request

  1. System receives the request at time a_i
  2. System starts executing the request - reaction time (measured from arrival, a_i)
  3. System starts responding to the request - response time, definition 1
  4. System completes the response to the request at time d_i; response time, definition 2, is r_i = d_i - a_i (computed in the sketch below)
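
A minimal sketch of the four event times and the intervals derived from them; the timestamps and the intermediate names b_i and c_i are hypothetical, chosen only to make the arithmetic explicit:

    # Hypothetical timestamps (in seconds) for a single request i
    a_i = 0.000   # 1. system receives the request
    b_i = 0.012   # 2. system starts executing the request
    c_i = 0.150   # 3. system starts responding to the request
    d_i = 0.230   # 4. system completes the response

    reaction_time   = b_i - a_i   # arrival until execution starts
    response_time_1 = c_i - a_i   # definition 1: arrival until response starts
    r_i             = d_i - a_i   # definition 2: arrival until response completes

    print(f"reaction time   = {reaction_time:.3f} s")
    print(f"response time 1 = {response_time_1:.3f} s")
    print(f"response time 2 = r_i = {r_i:.3f} s")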

Types of metrics

  1. Speed
  2. Accuracy
  3. Availability
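
One way to see the three types together is to compute one metric of each kind from a per-request log; the log, its field layout, and the outcome labels below are hypothetical:

    # Hypothetical per-request records: (response time in seconds, outcome),
    # where the outcome is "correct", "error" (wrong answer), or
    # "refused" (the system was unavailable when the request arrived).
    log = [(0.21, "correct"), (0.19, "correct"), (0.35, "error"),
           (None, "refused"), (0.27, "correct")]

    correct = [t for t, outcome in log if outcome == "correct"]
    mean_response = sum(correct) / len(correct)                        # speed
    error_rate = sum(o == "error" for _, o in log) / len(log)          # accuracy
    availability = 1 - sum(o == "refused" for _, o in log) / len(log)  # availability (proxy)

    print(f"mean response time = {mean_response:.2f} s")
    print(f"error rate         = {error_rate:.0%}")
    print(f"availability       = {availability:.0%}")

A more careful availability figure would be based on measured uptime and downtime rather than on refused requests.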
