Spatially Varying Image Based Lighting using HDR-video

Illumination is one of the key components in the creation of realistic renderings of scenes containing virtual objects.
In this paper, we present a set of novel algorithms and data structures for visualization, processing and rendering with
real-world lighting conditions captured using High Dynamic Range (HDR) video. The presented algorithms enable the rapid construction
of general and editable representations of the lighting environment, as well as the extraction and fitting of sampled reflectance
to parametric BRDF models. For efficient representation and rendering of the sampled lighting environment function, we consider
an adaptive (2D/4D) data structure for storing light field data on proxy geometry describing the scene. To demonstrate
their usefulness, we present the algorithms in the context of a fully integrated framework for spatially varying
image based lighting. We show reconstructions of example scenes and the resulting production-quality renderings of virtual furniture
with spatially varying real-world illumination, including occlusions.