Global Illumination

Calculating illumination

The Light Field

Plenoptic Function

Think about what the viewer can do.

1. The seriously handicapped viewer can
• not move in position
• not move the direction of gaze

Ray tracing is perfect.

2. The mildly handicapped viewer can
• not move in position
• gaze in any direction

Ray trace onto a sphere surrounding the viewer and reproject from the sphere to a view plane whenever the direction of gaze changes.
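A minimal sketch of that reprojection, assuming the ray-traced sphere is stored as an equirectangular map; the map resolution, field of view, and function names are invented for illustration:

```python
import math

def direction_to_latlong(d, width, height):
    """Map a unit direction to pixel coordinates in an equirectangular map."""
    x, y, z = d
    theta = math.acos(max(-1.0, min(1.0, y)))   # polar angle from +y
    phi = math.atan2(z, x) % (2 * math.pi)      # azimuth in [0, 2*pi)
    return (int(phi / (2 * math.pi) * (width - 1)),
            int(theta / math.pi * (height - 1)))

def reproject(env, width, height, gaze, fov_deg=60.0, res=32):
    """Resample the ray-traced sphere onto a view plane for one gaze direction.

    env maps (u, v) -> RGB; the viewer's position is fixed at the sphere's
    centre, so only the gaze direction ever changes.
    """
    gx, gy, gz = gaze
    up = (0.0, 1.0, 0.0) if abs(gy) < 0.99 else (1.0, 0.0, 0.0)
    # camera basis: right = up x gaze, true up = gaze x right
    rx, ry, rz = (up[1]*gz - up[2]*gy, up[2]*gx - up[0]*gz, up[0]*gy - up[1]*gx)
    n = math.sqrt(rx*rx + ry*ry + rz*rz)
    rx, ry, rz = rx/n, ry/n, rz/n
    ux, uy, uz = (gy*rz - gz*ry, gz*rx - gx*rz, gx*ry - gy*rx)
    half = math.tan(math.radians(fov_deg) / 2)
    image = []
    for j in range(res):
        row = []
        for i in range(res):
            sx = (2 * (i + 0.5) / res - 1) * half
            sy = (1 - 2 * (j + 0.5) / res) * half
            d = (gx + sx*rx + sy*ux, gy + sx*ry + sy*uy, gz + sx*rz + sy*uz)
            m = math.sqrt(d[0]**2 + d[1]**2 + d[2]**2)
            d = (d[0]/m, d[1]/m, d[2]/m)
            row.append(env.get(direction_to_latlong(d, width, height), (0, 0, 0)))
        image.append(row)
    return image
```

Changing the gaze only changes the camera basis; the expensive ray trace onto the sphere is never redone.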

3. The unhandicapped viewer can
• move around
• gaze in any direction

Ray trace onto a sphere at each accessible point.

The third is the light field, also called the plenoptic function, and it has to be recalculated every time something in the scene moves.

Filling Space with Light

Let's turn our attention away from the surfaces of objects and onto the volume between objects.

At every point in this volume there is a light density

• for every possible direction
• for every visible wavelength

This quantity LF(P, \omega, \lambda), with \omega the direction, is the light field. If we knew it we could

• evaluate it at the eye position
• at the angle heading for each pixel
• to get RGB for that pixel

The evaluation is, in fact, just a projective transformation of the light field.
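In code, rendering from a known light field is just one lookup per pixel; the interface LF(P, d, lam) below is a hypothetical stand-in for however the field is actually stored:

```python
def render_from_light_field(LF, eye, pixel_dirs, wavelengths=("R", "G", "B")):
    """Evaluate the light field at the eye, once per pixel direction.

    LF(P, d, lam) is assumed to return radiance at point P, in unit
    direction d, at wavelength (or colour channel) lam.
    """
    return [tuple(LF(eye, d, lam) for lam in wavelengths)
            for d in pixel_dirs]

# a trivially constant light field gives a flat image:
flat = render_from_light_field(lambda P, d, lam: 1.0,
                               eye=(0, 0, 0),
                               pixel_dirs=[(0, 0, 1), (0, 1, 0)])
```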

How do we get the light field?

1. by measurement
2. by calculation
• Radiosity is the obvious method

How is the light field used in 2009?

• routine applications for backdrops
• Think about a window in a dark room
• Light passes only one direction
• What's wrong with treating a window like a 2D scene on the wall?
• Easy to do by texture mapping
• How would we get the necessary data?
• calculation
• measurement
• remote controlled digital camera
• still the problems of storage and reconstruction
• yesterday's excitement

But tomorrow!!

`Backdrop' Applications

Imagine making a game or a movie

• There is an area accessible to the players (actors, camera), and
• there is an area inaccessible to the players (actors, camera).

An easy backdrop

• Surround the accessible volume with a sphere (actually a hemisphere)
• Ray trace the scene outside the accessible volume onto the sphere
• Put the re-projected portion of the sphere into the frame buffer, depth buffer set to infinity
• Where is the eye point?
• The centre of the sphere works for the mildly handicapped viewer.
• What is missing for the unhandicapped viewer?
• How do you make certain that artifacts are not visible?
• For a normal backdrop, three volumes
1. The smallest one for user position
2. A surrounding one that is 3D modelled.
3. The remainder, which is done as a normal backdrop, and moves with the user
• For a plenoptic backdrop, two volumes
1. One for user motion
2. The remainder, which is a plenoptic backdrop, which doesn't move with the user
• Sizes determined perceptually
• threshold of perceptibility of motion parallax
• threshold of perceptibility for object rotation
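Those perceptual sizes follow from simple geometry: moving sideways by b shifts a backdrop feature at distance D by roughly b/D radians, so the backdrop must sit where that stays below threshold. A sketch, with an assumed one-arc-minute threshold:

```python
import math

def min_backdrop_distance(max_user_motion_m, threshold_arcmin=1.0):
    """Smallest backdrop distance at which motion parallax stays invisible.

    Moving sideways by b shifts a feature at distance D by roughly
    b / D radians; keep that below a perceptual threshold (assumed value).
    """
    threshold_rad = math.radians(threshold_arcmin / 60.0)
    return max_user_motion_m / threshold_rad

# e.g. 1 m of head motion with a 1 arc-minute threshold:
# distance = min_backdrop_distance(1.0)  # about 3438 m
```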

Other Phenomena at Surfaces

How does reflection actually work?

The key concept is the index of refraction

• measure of the speed of light in a substance
• speed of light determines refraction and reflection angles (drawing)
• note that it's the angle to the normal that gives Snell's Law.

Reflected and refracted rays

• How much goes into each of the reflected and refracted rays?
• depends on indices of refraction
• How?
• Reflected/Transmitted = (...) / (... (1 - (\alpha sin \theta)^2 ) )
• \alpha = n(in) / n(out)
• \theta = angle of incidence
• Note: at | \alpha sin \theta | = 1 the reflected/transmitted goes to infinity.
• sin \theta = 1 / \alpha
• Only occurs when \alpha > 1.
• index of refraction of incoming > index of refraction of outgoing
• This is the critical angle, beyond which all light is reflected
• Brewster's angle is something different
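The critical-angle condition above is easy to check numerically; the indices below are illustrative:

```python
import math

def refract_angle(theta_in, n_in, n_out):
    """Snell's law: n_in * sin(theta_in) = n_out * sin(theta_out).

    Angles measured from the surface normal. Returns None past the
    critical angle (total internal reflection).
    """
    s = (n_in / n_out) * math.sin(theta_in)
    if abs(s) > 1.0:
        return None               # all light is reflected
    return math.asin(s)

def critical_angle(n_in, n_out):
    """Exists only when n_in > n_out (alpha > 1): sin(theta_c) = 1/alpha."""
    alpha = n_in / n_out
    if alpha <= 1.0:
        return None
    return math.asin(1.0 / alpha)

# glass (n = 1.5) to air (n = 1.0): theta_c = asin(1/1.5), about 41.8 degrees
```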

Subsurface Scattering

• Why did the light come out of the surface at the location where it entered?
• It didn't.
• Why doesn't it matter?
• Try translating the surface

Partitioned rendering reminder.

• When translational invariance is missing
• structure in surface
• structure in light

then you have to think about how light moves inside the surface.

A general formulation

• If light of wavelength \lambda enters at x, it emerges at x' with probability R(x', x, \lambda)
• Therefore, light emerging at x' is \sum_x R(x', x, \lambda) L(x, \lambda)
• Critical question
• How `wide' is R(x', x, \lambda)?
• This tells you when subsurface scattering will make a difference.
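A one-dimensional sketch of that sum, with R modelled as a Gaussian in |x' - x|; the kernel shape and width are assumptions, and only the width matters for the argument:

```python
import math

def emitted_light(L_in, points, sigma=0.5):
    """Light emerging at each x' as a weighted sum over entry points x.

    R(x', x) is modelled as a Gaussian falloff in |x' - x| (an assumed
    kernel): narrow sigma behaves like ordinary surface reflection,
    wide sigma is when subsurface scattering makes a visible difference.
    """
    def R(xp, x):
        return math.exp(-((xp - x) ** 2) / (2 * sigma ** 2))
    out = []
    for xp in points:
        total = sum(R(xp, x) * L_in[i] for i, x in enumerate(points))
        norm = sum(R(xp, x) for x in points)
        out.append(total / norm)
    return out
```

A point of light entering at one sample bleeds into its neighbours, which is exactly the effect a pure surface reflectance cannot produce.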

Bidirectional Reflectance Function

BRDF as an example of partitioned rendering

Examples:

1. Surface of CD
2. Some fabrics
3. Desert sand
4. Recently cut grass

Where do BRDFs come from?

1. Extensive measurement
2. Micromodelling
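For contrast with measured 4D tables, here is the shape of the interface: a BRDF takes both directions. Phong below is just an analytic stand-in, not what a CD or cut grass needs:

```python
import math

def phong_brdf(w_in, w_out, normal, kd=0.7, ks=0.3, shininess=32):
    """A tiny analytic BRDF(w_in, w_out): diffuse term plus specular lobe.

    Real BRDFs depend on both directions; the surfaces listed above need
    the full 4D function, supplied by measurement or micromodelling.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    # mirror w_in about the normal to get the specular direction
    d = dot(w_in, normal)
    refl = tuple(2 * d * n - wi for wi, n in zip(w_in, normal))
    spec = max(0.0, dot(refl, w_out)) ** shininess
    return kd / math.pi + ks * spec
```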

Modelling

Examples of Micromodelling

Human skin

• skin structure
• keratin, melanin, blood in different proportions
• place to place on the body
• person to person
• time to time
• We could let the rays go through the skin and interact with the pigments,

or we could summarize everything in a parametrized reflectance function

• R(x, \lambda) = R(k(x), m(x), b(x), \lambda)

Obviously the second strategy is better,

• but only if it works
• Select a model, work out a reflectance function, check that it agrees with reality

Note two different definitions of `agrees with reality'.
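A sketch of the second strategy; the pigment-absorption coefficients and functional forms below are invented stand-ins, not measured data:

```python
import math

def melanin_absorption(lam_nm):
    # melanin absorbs more at short wavelengths (rough trend, assumed form)
    return 2.0 * (400.0 / lam_nm)

def blood_absorption(lam_nm):
    # haemoglobin absorbs strongly in the green (assumed bump near 550 nm)
    return 1.5 * math.exp(-((lam_nm - 550.0) ** 2) / (2 * 40.0 ** 2))

def skin_reflectance(x, lam_nm, keratin, melanin, blood):
    """R(x, lambda) = R(k(x), m(x), b(x), lambda): the parametrized strategy.

    keratin, melanin, blood map body position x to pigment proportions;
    the absorption models above are illustrative stand-ins.
    """
    k, m, b = keratin(x), melanin(x), blood(x)
    absorb = 0.1 * k + m * melanin_absorption(lam_nm) + b * blood_absorption(lam_nm)
    return max(0.0, 1.0 - absorb)
```

With any blood present, reflectance comes out higher in the red than in the green, which is the kind of agreement-with-reality check the model must pass.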