Enhanced personal autostereoscopic telepresence system using commodity depth cameras
Andrew Maimone, Jonathan Bidwell, Kun Peng, Henry Fuchs
In Computers & Graphics, 36(7), 2012.
Abstract: This paper describes an enhanced telepresence system that offers fully dynamic, real-time 3D scene capture and continuous-viewpoint, head-tracked stereo 3D display without requiring the user to wear any tracking or viewing apparatus. We present a complete software and hardware framework for implementing the system, which is based on an array of commodity Microsoft Kinect color-plus-depth cameras. Contributions include an algorithm for merging data between multiple depth cameras and techniques for automatic color calibration and preserving stereo quality even with low rendering rates. Also presented is a solution to the problem of interference that occurs between Kinect cameras with overlapping views. Emphasis is placed on a fully GPU-accelerated data processing and rendering pipeline that can apply hole filling, smoothing, data merger, surface generation, and color correction at rates of up to 200 million triangles/s on a single PC and graphics board. Also presented is a Kinect-based markerless tracking system that combines 2D eye recognition with depth information to allow head-tracked stereo views to be rendered for a parallax barrier autostereoscopic display. Enhancements in calibration, filtering, and data merger were made to improve image quality over a previous version of the system.
Keyword(s): Teleconferencing, Sensor fusion, Camera calibration, Color calibration, Filtering, Tracking
@article{Maimone:2012:EPA,
author = {Andrew Maimone and Jonathan Bidwell and Kun Peng and Henry Fuchs},
title = {Enhanced personal autostereoscopic telepresence system using commodity depth cameras},
journal = {Computers \& Graphics},
volume = {36},
number = {7},
pages = {791--807},
year = {2012},
}