Image-based spatio-temporal modeling and view interpolation of dynamic events
Sundar Vedula, Simon Baker, Takeo Kanade
In ACM Transactions on Graphics, 24(2), April 2005.
Abstract: We present an approach for modeling and rendering a dynamic, real-world event from an arbitrary viewpoint, and at any time, using images captured from multiple video cameras. The event is modeled as a nonrigidly varying dynamic scene, captured by many images from different viewpoints, at discrete times. First, the spatio-temporal geometric properties (shape and instantaneous motion) are computed. The view synthesis problem is then solved using a reverse mapping algorithm, ray-casting across space and time, to compute a novel image from any viewpoint in the 4D space of position and time. Results are shown on real-world events captured in the CMU 3D Room, by creating synthetic renderings of the event from novel, arbitrary positions in space and time. Multiple such rendered frames can be combined into retimed fly-by movies of the event, yielding a visual experience richer than either a regular video clip or switching between images from multiple cameras.
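As an informal illustration of the two stages described in the abstract (flowing the reconstructed shape to an intermediate time using scene flow, then synthesizing a novel view), the sketch below is a minimal Python/NumPy approximation and not the authors' implementation. It assumes linear per-point motion over one frame interval, hypothetical arrays `points_t0`, `scene_flow`, and `colors`, and a pinhole virtual camera `(K, R, t)`; it uses a simple forward z-buffer splat in place of the paper's reverse-mapping ray-caster.

```python
import numpy as np

def interpolate_shape(points_t0, scene_flow, alpha):
    """Flow the shape sampled at time t0 to an intermediate time
    t = t0 + alpha * (t1 - t0), assuming linear motion over the frame
    interval; scene_flow holds per-point displacements from t0 to t1."""
    return points_t0 + alpha * scene_flow

def render_novel_view(points, colors, K, R, t, width, height):
    """Project the flowed 3D points into a virtual pinhole camera
    (K, R, t) and keep the nearest point per pixel (a z-buffer splat,
    used here purely for illustration)."""
    cam = (R @ points.T + t.reshape(3, 1)).T          # world -> camera coords
    z = cam[:, 2]
    valid = z > 1e-6                                  # keep points in front of camera
    proj = (K @ cam[valid].T).T
    u = np.round(proj[:, 0] / proj[:, 2]).astype(int)
    v = np.round(proj[:, 1] / proj[:, 2]).astype(int)
    image = np.zeros((height, width, 3), dtype=np.uint8)
    zbuf = np.full((height, width), np.inf)
    for ui, vi, zi, ci in zip(u, v, z[valid], colors[valid]):
        if 0 <= ui < width and 0 <= vi < height and zi < zbuf[vi, ui]:
            zbuf[vi, ui] = zi                         # nearest point wins the pixel
            image[vi, ui] = ci
    return image

# Usage sketch: render the scene halfway between two captured frames.
# points_t0, scene_flow: (N, 3) arrays; colors: (N, 3) uint8 array.
# frame = render_novel_view(interpolate_shape(points_t0, scene_flow, 0.5),
#                           colors, K, R, t, width=640, height=480)
```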
Keyword(s): Image-based modeling, rendering, dynamic scenes, non-rigid motion, scene flow, space carving, spatio-temporal view interpolation, voxel models
@article{Vedula:2005:ISM,
author = {Sundar Vedula and Simon Baker and Takeo Kanade},
title = {Image-based spatio-temporal modeling and view interpolation of dynamic events},
journal = {ACM Transactions on Graphics},
volume = {24},
number = {2},
pages = {240--261},
month = apr,
year = {2005},
}