3D Position, Attitude and Shape Input Using Video Tracking of Hands and Lips
Andrew Blake, Michael Isard
Proceedings of SIGGRAPH 94, July 1994, pp. 185--192.
Abstract: Recent developments in video-tracking allow the outlines of moving, natural objects in a video-camera input stream to be tracked live, at full video-rate. Previous systems have been able to do this for specially illuminated objects or for naturally illuminated but polyhedral objects. Other systems have been able to track non-polyhedral objects in motion, in some cases from live video, but following only centroids or key-points rather than tracking whole curves. The system described here can accurately track the curved silhouettes of moving non-polyhedral objects (for example hands, lips, legs, vehicles and fruit) at frame-rate, without any special hardware beyond a desktop workstation, a video-camera and a framestore. The new algorithms are a synthesis of methods in deformable models, B-spline curve representation and control theory. This paper shows how such a facility can be used to turn parts of the body - for instance, hands and lips - into input devices. Rigid motion of a hand can be used as a 3D mouse, with non-rigid gestures signalling a button press or the "lifting" of the mouse. Both rigid and non-rigid motions of lips can be tracked independently and used as inputs, for example to animate a computer-generated face.
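The abstract mentions that tracked silhouettes are represented as B-spline curves. As a minimal illustrative sketch (not the paper's implementation), the following evaluates a point on a B-spline curve via the Cox-de Boor recursion; the control points and knot vector are hypothetical values chosen for the example.

```python
# Sketch: evaluating a B-spline curve of the kind used to represent
# object silhouettes. Control polygon and knots are hypothetical.

def bspline_point(t, degree, knots, ctrl):
    """Evaluate a B-spline curve at parameter t (Cox-de Boor recursion)."""
    def basis(i, k, t):
        # Zeroth-degree basis: indicator of the half-open knot span.
        if k == 0:
            return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
        left = 0.0
        if knots[i + k] != knots[i]:
            left = (t - knots[i]) / (knots[i + k] - knots[i]) * basis(i, k - 1, t)
        right = 0.0
        if knots[i + k + 1] != knots[i + 1]:
            right = ((knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1])
                     * basis(i + 1, k - 1, t))
        return left + right

    # Curve point is the basis-weighted sum of the control points.
    x = sum(basis(i, degree, t) * ctrl[i][0] for i in range(len(ctrl)))
    y = sum(basis(i, degree, t) * ctrl[i][1] for i in range(len(ctrl)))
    return (x, y)

# Hypothetical control polygon for one silhouette segment,
# with a clamped quadratic knot vector (len = n + degree + 1).
ctrl = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
knots = [0, 0, 0, 1, 2, 2, 2]
print(bspline_point(1.0, 2, knots, ctrl))  # point at the interior knot
```

In the paper's setting, the control points of such a curve become the state that the tracker estimates over time, which is what makes whole-curve tracking feasible at frame-rate.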
BibTeX format:
@inproceedings{Blake:1994:3PA,
  author = {Andrew Blake and Michael Isard},
  title = {3D Position, Attitude and Shape Input Using Video Tracking of Hands and Lips},
  booktitle = {Proceedings of SIGGRAPH 94},
  pages = {185--192},
  month = jul,
  year = {1994},
}