Ynnerman, Anders

2006 (English). In: The 4th International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia, Kuala Lumpur, Malaysia, 2006, pp. 341-347. Conference paper, published paper (refereed).

Abstract [en]

We present a novel technique for capturing spatially and temporally resolved light probe sequences and using them for rendering. For this purpose we have designed and built a Real Time Light Probe: a catadioptric imaging system that can capture the full dynamic range of the lighting incident at each point in space at video frame rates while being moved through a scene. The Real Time Light Probe uses a digital imaging system that we have programmed to capture high quality, photometrically accurate color images with a dynamic range of 10,000,000:1 at 25 frames per second.
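A dynamic range of 10,000,000:1 is far beyond a single exposure, so it is typically obtained by merging differently exposed measurements of each pixel. A minimal sketch of such a multi-exposure HDR merge, assuming 8-bit exposure values and a hat-shaped weighting function (a standard weighted-average scheme for illustration, not the authors' camera-side algorithm):

```python
import numpy as np

def assemble_hdr(values, times):
    """Merge differently exposed measurements of one pixel into a single
    HDR radiance estimate: weight mid-range values most, down-weight
    values near under- and over-exposure, divide by exposure time."""
    values = np.asarray(values, dtype=float)
    times = np.asarray(times, dtype=float)
    # Hat-shaped weight: 1.0 at mid-gray, approaching 0 at 0 and 255.
    w = np.clip(1.0 - np.abs(values / 255.0 - 0.5) * 2.0, 1e-3, None)
    return np.sum(w * values / times) / np.sum(w)

# Two exposures of the same pixel that agree on a radiance of 100
# (value 100 at time 1.0, value 200 at time 2.0) merge to ~100.
radiance = assemble_hdr([100, 200], [1.0, 2.0])
```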

By tracking the position and orientation of the light probe, it is possible to transform each captured light probe into a common frame of reference in world coordinates and to map each point in space along the path of motion to a particular frame in the light probe sequence. We demonstrate our technique by rendering synthetic objects illuminated by complex real world lighting, using both traditional image based lighting methods with temporally varying light probe illumination and an extension that handles spatially varying lighting conditions across large objects.
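The registration step above can be sketched as follows. The function names and data layout are hypothetical, assuming the tracker provides a position and a 3x3 rotation matrix per frame; directions are unaffected by translation, so only the rotation applies to them, while the positions index spatial locations along the path of motion:

```python
import numpy as np

def world_direction(R_frame, d_local):
    """Rotate a ray direction sampled in the probe's local frame into
    world coordinates using the tracked orientation R_frame (3x3)."""
    return R_frame @ np.asarray(d_local, dtype=float)

def nearest_frame(positions, p):
    """Map a query point p in space to the index of the light probe
    frame captured closest to it along the path of motion."""
    dists = np.linalg.norm(positions - p, axis=1)
    return int(np.argmin(dists))

# Synthetic tracking data: probe moved along the x axis.
positions = np.array([[0.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0],
                      [2.0, 0.0, 0.0]])
print(nearest_frame(positions, np.array([1.2, 0.0, 0.0])))  # -> 1
```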

Abstract [en]

Image based lighting (IBL) is a computer graphics technique for creating photorealistic renderings of synthetic objects such that they can be placed into real world scenes. IBL is widely recognized and is today used in commercial production pipelines. However, the current techniques use only illumination captured at a single point in space, which means that traditional IBL cannot capture or recreate effects such as cast shadows, shafts of light or other important spatial variations in the illumination. Such lighting effects are in many cases artistically created, or are there to emphasize certain features, and are therefore a very important part of the visual appearance of a scene.
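For reference, single-point IBL amounts to integrating the captured radiance over incident directions. A minimal sketch of diffuse shading under one light probe, where all array names and the discretization into per-direction solid angles are illustrative assumptions:

```python
import numpy as np

def diffuse_ibl(normal, dirs, radiance, solid_angles):
    """Diffuse shading from a single light probe:
    E(n) = sum_i L_i * max(n . d_i, 0) * dOmega_i,
    where dirs holds unit directions d_i (N x 3), radiance holds the
    probe's radiance samples L_i, and solid_angles the dOmega_i."""
    cosines = np.clip(dirs @ np.asarray(normal, dtype=float), 0.0, None)
    return float(np.asarray(radiance) @ (cosines * np.asarray(solid_angles)))

# One sample straight above the surface: L = 2.0 over solid angle 0.5.
irradiance = diffuse_ibl([0.0, 0.0, 1.0],
                         np.array([[0.0, 0.0, 1.0]]),
                         [2.0], [0.5])  # -> 1.0
```

Because this integral is taken over directions only, the same probe illuminates every surface point identically, which is exactly the limitation the spatially varying extension addresses.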

This thesis and the included papers present methods that extend IBL to allow capture of, and rendering with, spatially varying illumination. This is accomplished by measuring the light field incident onto a region in space, called an Incident Light Field (ILF), and using it as illumination in renderings. This requires the illumination to be captured at a large number of points in space instead of just one, which significantly increases the complexity of both the capture methods and the rendering algorithms.

The technique for measuring spatially varying illumination in real scenes is based on the capture of High Dynamic Range (HDR) image sequences. For efficient measurement, the image capture is performed at video frame rates. The captured illumination information in the image sequences is processed so that it can be used in computer graphics rendering. By extracting high intensity regions from the captured data and representing them separately, the thesis also describes a technique for increasing rendering efficiency, as well as methods for editing the captured illumination, for example artificially moving or turning on and off individual light sources.
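The separation of high intensity regions can be illustrated with a simple threshold-based split; the threshold criterion here is an assumption for illustration, not necessarily the extraction method used in the thesis. Once the sources are represented separately, editing operations such as turning a source off reduce to dropping its component before rendering:

```python
import numpy as np

def extract_high_intensity(hdr, threshold):
    """Split an HDR radiance map into high-intensity source regions and
    a residual background, so the sources can be sampled, edited, or
    removed independently of the rest of the captured illumination."""
    hdr = np.asarray(hdr, dtype=float)
    mask = hdr > threshold
    sources = np.where(mask, hdr, 0.0)     # light sources only
    background = np.where(mask, 0.0, hdr)  # everything else
    return sources, background

# A bright source pixel (1000.0) is separated from dim background (1.0);
# summing the parts recovers the original map exactly.
src, bg = extract_high_intensity([1.0, 1000.0, 2.0], 100.0)
```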