GPU-based rendering for deformable translucent objects
Yi Gong, Wei Chen, Long Zhang, Yun Zeng, Qunsheng Peng
In The Visual Computer, 24(2), February 2008.
Abstract: In this paper we introduce an approximate image-space approach for real-time rendering of deformable translucent models, which flattens the geometry and lighting information of objects into textures and computes multiple scattering in texture space. We decompose the process into two stages, called gathering and scattering, corresponding to the computation of incident and exitant irradiance, respectively. We derive a simplified illumination model for gathering the incident irradiance, which is amenable to deformable models using two auxiliary textures. In the scattering stage, we adopt two modes to efficiently accomplish the view-dependent scattering. Our approach is implemented by fully exploiting the capabilities of graphics processing units (GPUs). It achieves visually plausible results and real-time frame rates for deformable models on commodity desktop PCs.
Keyword(s): Sub-surface scattering, BSSRDF, Translucency, Real-time rendering
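For orientation only, the sketch below illustrates the texture-space gathering idea summarized in the abstract, using the classic dipole diffusion profile of Jensen et al. as a stand-in for the paper's simplified illumination model. The CUDA kernel, its name, its parameters (sigma_tr, zr, zv, alphaPrime), and the single-channel, unweighted accumulation are illustrative assumptions, not the authors' implementation.

// Hypothetical CUDA sketch of a texture-space gathering pass; not the paper's code.
#include <cuda_runtime.h>
#include <math.h>

#define PI_F 3.14159265358979f

// Dipole diffusion profile R_d(r) (Jensen et al. 2001), used here as a
// stand-in for the paper's simplified illumination model.
__device__ float dipoleRd(float r, float sigma_tr, float zr, float zv, float alphaPrime)
{
    float dr = sqrtf(r * r + zr * zr);   // distance to the real source
    float dv = sqrtf(r * r + zv * zv);   // distance to the mirrored virtual source
    float c1 = zr * (sigma_tr * dr + 1.0f) * expf(-sigma_tr * dr) / (dr * dr * dr);
    float c2 = zv * (sigma_tr * dv + 1.0f) * expf(-sigma_tr * dv) / (dv * dv * dv);
    return alphaPrime * (c1 + c2) / (4.0f * PI_F);
}

// Gathering stage: for every texel of the flattened surface, accumulate the
// incident irradiance of nearby texels, weighted by the diffusion profile of
// their world-space distance.  pos and E_in are textures of size width*height
// flattened into linear buffers; per-texel area weighting is omitted for brevity.
__global__ void gatherIrradiance(const float3* pos, const float* E_in, float* M_out,
                                 int width, int height, int radius,
                                 float sigma_tr, float zr, float zv, float alphaPrime)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    float3 p = pos[y * width + x];
    float sum = 0.0f;

    for (int dy = -radius; dy <= radius; ++dy) {
        for (int dx = -radius; dx <= radius; ++dx) {
            int sx = x + dx, sy = y + dy;
            if (sx < 0 || sx >= width || sy < 0 || sy >= height) continue;
            float3 q = pos[sy * width + sx];
            float r = sqrtf((p.x - q.x) * (p.x - q.x) +
                            (p.y - q.y) * (p.y - q.y) +
                            (p.z - q.z) * (p.z - q.z));
            sum += E_in[sy * width + sx] * dipoleRd(r, sigma_tr, zr, zv, alphaPrime);
        }
    }
    M_out[y * width + x] = sum;   // gathered (multi-scattered) irradiance for this texel
}

A view-dependent scattering pass, as described in the abstract, would then read this gathered irradiance texture when shading the visible surface; its two modes are not sketched here.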
@article{Gong:2008:GRF,
author = {Yi Gong and Wei Chen and Long Zhang and Yun Zeng and Qunsheng Peng},
title = {GPU-based rendering for deformable translucent objects},
journal = {The Visual Computer},
volume = {24},
number = {2},
pages = {95--103},
month = feb,
year = {2008},
}