Realtime performance-based facial animation
Thibaut Weise, Sofien Bouaziz, Hao Li, Mark Pauly
In ACM Transactions on Graphics, 30(4), July 2011.
Abstract: This paper presents a system for performance-based character animation that enables any user to control the facial expressions of a digital avatar in realtime. The user is recorded in a natural environment using a non-intrusive, commercially available 3D sensor. The simplicity of this acquisition device comes at the cost of high noise levels in the acquired data. To effectively map low-quality 2D images and 3D depth maps to realistic facial expressions, we introduce a novel face tracking algorithm that combines geometry and texture registration with pre-recorded animation priors in a single optimization. Formulated as a maximum a posteriori estimation in a reduced parameter space, our method implicitly exploits temporal coherence to stabilize the tracking. We demonstrate that compelling 3D facial dynamics can be reconstructed in realtime without the use of face markers, intrusive lighting, or complex scanning hardware. This makes our system easy to deploy and facilitates a range of new applications, e.g. in digital gameplay or social interactions.
Keyword(s): blendshape animation, face animation, markerless performance capture, real-time tracking
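To make the formulation mentioned in the abstract concrete, the following is a minimal sketch of a generic maximum a posteriori tracking objective over blendshape weights, written in LaTeX. It is not the authors' notation: the symbols \mathbf{x}_t (blendshape weights at frame t), D_t and I_t (depth map and image), and the individual energy terms are illustrative assumptions about how geometry registration, texture registration, and a pre-recorded animation prior could be combined in a single optimization.

% Illustrative sketch only; symbols and energy terms are assumed, not taken from the paper.
\[
  \mathbf{x}_t^{\ast}
  \;=\; \arg\max_{\mathbf{x}} \; p(D_t, I_t \mid \mathbf{x})\, p(\mathbf{x})
  \;=\; \arg\min_{\mathbf{x}} \;
        E_{\mathrm{geo}}(\mathbf{x}; D_t)
      + E_{\mathrm{tex}}(\mathbf{x}; I_t)
      + E_{\mathrm{prior}}(\mathbf{x}; \mathbf{x}_{t-1}, \ldots, \mathbf{x}_{t-k})
\]

Here the two data energies would correspond to the negative log-likelihoods of the depth and image observations (geometry and texture registration), while the prior energy, learned from pre-recorded animations and conditioned on recent frames, is one plausible way the optimization could implicitly exploit temporal coherence, as the abstract describes.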
@article{Weise:2011:RPF,
author = {Thibaut Weise and Sofien Bouaziz and Hao Li and Mark Pauly},
title = {Realtime performance-based facial animation},
journal = {ACM Transactions on Graphics},
volume = {30},
number = {4},
pages = {77:1--77:10},
month = jul,
year = {2011},
}