Image-Based Rendering for Non-Diffuse Synthetic Scenes
Dani Lischinski, Ari Rappoport
Eurographics Rendering Workshop, June 1998, pp. 301--314.
Abstract: Most current image-based rendering methods operate under the assumption that all of the visible surfaces in the scene are opaque ideal diffuse (Lambertian) reflectors. This paper is concerned with image-based rendering of non-diffuse synthetic scenes. We introduce a new family of image-based scene representations and describe corresponding image-based rendering algorithms that are capable of handling general synthetic scenes containing not only diffuse reflectors, but also specular and glossy objects. Our image-based representation is based on layered depth images. It represents simultaneously and separately both view-independent scene information and view-dependent appearance information. The view-dependent information may be either extracted directly from our data structures, or evaluated procedurally using an image-based analogue of ray tracing. We describe image-based rendering algorithms that recombine the two components together in a manner that produces a good approximation to the correct image from any viewing position. In addition to extending image-based rendering to non-diffuse synthetic scenes, our paper has an important methodological contribution: it places image-based rendering, light field rendering, and volume graphics in a common framework of discrete raster-based scene representations.
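The layered depth image (LDI) representation mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; all names here (`Sample`, `LayeredDepthImage`, the `specular` flag) are hypothetical. The key idea it shows is that each pixel stores a depth-sorted list of samples rather than a single color, so surfaces hidden in the reference view remain available when rendering from a new viewpoint, and each sample can be flagged for view-dependent shading.

```python
# Minimal sketch of a layered depth image, assuming per-pixel lists of
# depth-sorted samples. Illustrative only; not the paper's data structure.
from dataclasses import dataclass


@dataclass
class Sample:
    depth: float            # distance along the reference-view ray
    color: tuple            # view-independent (diffuse) color
    specular: bool = False  # hypothetical flag: shade view-dependently


class LayeredDepthImage:
    def __init__(self, width, height):
        # One list of samples (layers) per pixel.
        self.pixels = [[[] for _ in range(width)] for _ in range(height)]

    def add_sample(self, x, y, sample):
        layers = self.pixels[y][x]
        layers.append(sample)
        layers.sort(key=lambda s: s.depth)  # keep layers front-to-back

    def front(self, x, y):
        # Nearest sample at this pixel, or None if the pixel is empty.
        layers = self.pixels[y][x]
        return layers[0] if layers else None


ldi = LayeredDepthImage(2, 2)
ldi.add_sample(0, 0, Sample(5.0, (0.2, 0.2, 0.8)))                # back layer
ldi.add_sample(0, 0, Sample(2.0, (0.9, 0.1, 0.1), specular=True))  # front layer
print(ldi.front(0, 0).depth)  # → 2.0 (front-most sample)
```

Storing multiple layers per pixel is what lets an image-based renderer fill the disocclusions that appear when the camera moves away from the reference view.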
@inproceedings{Lischinski:1998:IRF,
author = {Dani Lischinski and Ari Rappoport},
title = {Image-Based Rendering for Non-Diffuse Synthetic Scenes},
booktitle = {Eurographics Rendering Workshop},
pages = {301--314},
month = jun,
year = {1998},
}