Gesture Controllers
Sergey Levine, Philipp Krähenbühl, Sebastian Thrun, Vladlen Koltun
In ACM Transactions on Graphics, 29(4), July 2010.
Abstract: We introduce gesture controllers, a method for animating the body language of avatars engaged in live spoken conversation. A gesture controller is an optimal-policy controller that schedules gesture animations in real time based on acoustic features in the user's speech. The controller consists of an inference layer, which infers a distribution over a set of hidden states from the speech signal, and a control layer, which selects the optimal motion based on the inferred state distribution. The inference layer, consisting of a specialized conditional random field, learns the hidden structure in body language style and associates it with acoustic features in speech. The control layer uses reinforcement learning to construct an optimal policy for selecting motion clips from a distribution over the learned hidden states. The modularity of the proposed method allows customization of a character's gesture repertoire, animation of non-human characters, and the use of additional inputs such as speech recognition or direct user control.
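The abstract describes a two-layer pipeline: an inference layer maps acoustic features in the speech signal to a distribution over hidden states, and a control layer selects the motion clip that is optimal under that distribution. The following is a minimal illustrative sketch of that structure only; the linear-softmax inference and the tabular value function are hypothetical stand-ins, not the paper's conditional random field or its reinforcement-learned policy, and all names and dimensions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_STATES = 4     # hypothetical number of hidden body-language states
N_CLIPS = 6      # hypothetical number of motion clips in the repertoire
N_FEATURES = 3   # hypothetical acoustic features (e.g. pitch, intensity, rate)

# Stand-in for the learned inference model: a linear map plus softmax
# (the paper uses a specialized conditional random field instead).
W = rng.normal(size=(N_STATES, N_FEATURES))

def infer_state_distribution(features):
    """Inference layer: distribution over hidden states given speech features."""
    scores = W @ features
    exp = np.exp(scores - scores.max())   # stable softmax
    return exp / exp.sum()

# Stand-in for the learned policy's values: value of each clip in each state
# (the paper constructs this via reinforcement learning).
V = rng.normal(size=(N_STATES, N_CLIPS))

def select_clip(state_dist):
    """Control layer: clip maximizing expected value under the state distribution."""
    expected_values = state_dist @ V      # expectation over hidden states
    return int(np.argmax(expected_values))

features = np.array([0.8, 0.2, 0.5])      # one frame of acoustic features
dist = infer_state_distribution(features)
clip = select_clip(dist)
```

The key property this sketch preserves is modularity: the control layer consumes only the state distribution, so the inference model, the gesture repertoire, or additional inputs could be swapped without changing the selection logic.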
Keyword(s): data-driven animation, gesture synthesis, human animation, nonverbal behavior generation, optimal control
Article URL: http://doi.acm.org/10.1145/1778765.1778861
BibTeX format:
@article{Levine:2010:GC,
  author = {Sergey Levine and Philipp Krähenbühl and Sebastian Thrun and Vladlen Koltun},
  title = {Gesture Controllers},
  journal = {ACM Transactions on Graphics},
  volume = {29},
  number = {4},
  pages = {124:1--124:11},
  month = jul,
  year = {2010},
}