Real-time and robust hand tracking with a single depth camera
Ziyang Ma, Enhua Wu
In The Visual Computer, 30(10), October 2014.
Abstract: In this paper, we introduce a novel, real-time, and robust hand tracking system capable of tracking articulated hand motion in full degrees of freedom (DOF) using a single depth camera. Unlike most previous systems, ours initializes and recovers from tracking loss automatically. This is achieved through an efficient two-stage k-nearest neighbor search, proposed in the paper, over a pre-rendered database of small hand depth images, designed to provide good initial guesses for model-based tracking. We also propose a robust objective function and improve the Particle Swarm Optimization algorithm with a resampling-based strategy for model-based tracking, yielding continuous solutions over the full-DOF hand motion space more efficiently than previous methods. Our system runs at 40 fps on a GeForce GTX 580 GPU, and experimental results show that it outperforms state-of-the-art model-based hand tracking systems in both speed and accuracy. The results are of significance to various applications in human–computer interaction and virtual reality.
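The abstract only names the optimization strategy; the paper's actual implementation is not reproduced here. As a rough illustration of the general idea of Particle Swarm Optimization with a resampling step for a pose objective, the following Python sketch may help. The objective function, 26-DOF pose size, and all parameters are illustrative assumptions, not the authors' code.

    # Hypothetical sketch (not the authors' implementation): PSO over a pose
    # vector, with a resampling step that reseeds the worst particles around
    # the current best pose. Parameters and DOF count are assumptions.
    import numpy as np

    def pso_with_resampling(objective, init_pose, n_particles=64, n_iters=30,
                            w=0.7, c1=1.5, c2=1.5, resample_frac=0.25, seed=0):
        rng = np.random.default_rng(seed)
        n_dof = init_pose.shape[0]
        x = init_pose + rng.normal(0.0, 0.05, size=(n_particles, n_dof))  # particle poses
        v = np.zeros_like(x)                                              # particle velocities
        pbest = x.copy()
        pbest_f = np.array([objective(p) for p in x])
        g = pbest[np.argmin(pbest_f)]                                     # global best pose

        for _ in range(n_iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = x + v
            f = np.array([objective(p) for p in x])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            g = pbest[np.argmin(pbest_f)]

            # Resampling: replace the worst-scoring particles with perturbed
            # copies of the current best, keeping the swarm from stagnating.
            k = int(resample_frac * n_particles)
            worst = np.argsort(f)[-k:]
            x[worst] = g + rng.normal(0.0, 0.02, size=(k, n_dof))
            v[worst] = 0.0
        return g

In a tracking loop, init_pose would come from the previous frame's estimate or, on initialization and tracking loss, from a nearest-neighbor lookup in a pre-rendered depth-image database, as the abstract describes.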
@article{Ma:2014:RAR,
author = {Ziyang Ma and Enhua Wu},
title = {Real-time and robust hand tracking with a single depth camera},
journal = {The Visual Computer},
volume = {30},
number = {10},
pages = {1133--1144},
month = oct,
year = {2014},
}