CS457 - System Performance Evaluation - Winter 2010
Public Service Announcements
- Textbook should be both useful for learning and reference
- Assignment deadlines
Lecture 2 - Importance of Performance Evaluation
Performance Evaluation
Where it Occurs
- CPU design
    - Patterson & Hennessy: remember CS251
- Pipelining
- Cache size and configuration
- number of lines versus line length
- System configuration
- Databases for banks and insurance companies
- Your own computer purchasing: memory versus CPU versus GPU
- Network performance
- Protocols
- Mobile networks
- World of Warcraft
Why it Matters
Ultimately for one reason, and one reason only,
- the service provided to human users, who have hard limits
- time
- money
Relationship between performance evaluation and quality of service
(QoS)
- Getting the metrics right is everything.
Its Goals
- Response time
- Average case
- Worst case
- Maybe even best case
- Throughput
- Financial cost
- Power consumption
- Memory footprint
- Small scale systems like mobile telephones
- Large scale systems like gaming-optimized computers
- Clock speed
- related to power consumption
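Several of the goals above (average, worst, and best case response time) reduce to simple summary statistics over a sample of observed times. A minimal sketch, using made-up response times in milliseconds:

```python
# Summarize a sample of response times. The samples_ms values are
# invented for illustration only.
import statistics

samples_ms = [12.1, 9.8, 15.3, 11.0, 48.7, 10.4, 12.9, 11.6]

avg = statistics.mean(samples_ms)   # average case
worst = max(samples_ms)             # worst case observed
best = min(samples_ms)              # best case observed

print(f"avg={avg:.1f} ms, worst={worst:.1f} ms, best={best:.1f} ms")
```

Note that the observed worst case (here, one 48.7 ms outlier) can dominate the user's perception of the system even when the average looks healthy.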
It's possible to take this too far. We restrict ourselves to aspects of
performance that are easily quantified, measured, and modelled.
Topics in Performance Evaluation
- Abstract: based on objectives
- Abstractions are good or bad
    - But the goodness and badness is not measurable, at least not
      ex ante
- A good abstraction just keeps on giving
- Design: what to measure
- More measurements are only sometimes better
- measuring bricks
- measuring atoms
- What to measure comes back to what the system is designed to
accomplish
Design also applies to simulation and to modelling.
- Measure: good measurements produce robust data
- `robust' means `If you did it again holding only the abstraction
constant you would get the same results.'
- types of measurement
- in the wild
      - in an abstracted environment
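Robustness can be checked directly: repeat the same measurement with everything held constant and see whether the runs agree. A hedged sketch, where the workload is a hypothetical stand-in chosen only for illustration:

```python
# Repeat a measurement several times under identical conditions;
# a small spread across runs suggests the data is robust.
import statistics
import time

def measure(workload, repeats=5):
    """Time a workload `repeats` times; return mean and stdev in seconds."""
    runs = []
    for _ in range(repeats):
        start = time.perf_counter()
        workload()
        runs.append(time.perf_counter() - start)
    return statistics.mean(runs), statistics.stdev(runs)

# Hypothetical workload, for illustration only.
mean_s, stdev_s = measure(lambda: sum(range(100_000)))
print(f"mean={mean_s:.6f} s, stdev={stdev_s:.6f} s")
```

If the standard deviation is large relative to the mean, something other than the abstraction is varying between runs, and the measurement design needs another look.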
- Analyse: data requires objective analysis to produce
results. Because
- You are human, and
- Humans are half-blind. (All too often we see what we want to
see.)
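One way to guard against seeing what we want to see is to report an interval rather than a single number. A small sketch, under a normal-approximation assumption (the data values are invented):

```python
# Compute a 95% confidence interval for a mean, so that the data,
# not the analyst, determines how much the result can be trusted.
import statistics

data = [10.2, 11.1, 9.7, 10.8, 10.5, 11.3, 10.0, 10.9]  # illustrative values
n = len(data)
mean = statistics.mean(data)
sem = statistics.stdev(data) / n ** 0.5  # standard error of the mean

# 1.96 is the normal-approximation critical value; for a sample this
# small, a t critical value would be more defensible.
lo, hi = mean - 1.96 * sem, mean + 1.96 * sem
print(f"mean={mean:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```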
- Simulate: results require surface explanations
- Surface explanations solve immediate problems
- Model: surface explanations sometimes demand deep explanations
- Deep explanations actually make you, and those you work with,
smarter.
- Remember that models must be tested.
- There is no `waterfall model' in performance evaluation.