Deep signatures for indexing and retrieval in large motion databases
Yingying Wang, Michael Neff
Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games, 2015, pp. 37--45.
Abstract: Data-driven motion research requires effective tools to compress, index, retrieve and reconstruct captured motion data. In this paper, we present a novel method to perform these tasks using a deep learning architecture. Our deep autoencoder, a form of artificial neural network, encodes motion segments into "deep signatures." Each signature is formed by concatenating signatures for functionally different parts of the body. The deep signature is a highly condensed representation of a motion segment, requiring only 20 bytes, yet it still encodes high-level motion features. It yields a very compact representation of a motion database that supports motion indexing and retrieval with a very small memory footprint. Database searches are reduced to low-cost binary comparisons of signatures. Motion reconstruction is achieved by using Gibbs sampling to fix a "deep signature" that is missing a section. We tested both manually and automatically segmented motion databases, and our experiments show that extracting the deep signature is fast and scales well with large databases. Given a query motion, similar motion segments can be retrieved at interactive speed with excellent match quality.
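As a rough illustration of the retrieval step described in the abstract, the sketch below searches a database of 20-byte binary signatures using Hamming distance as the low-cost binary comparison. This is a minimal sketch under stated assumptions: it uses NumPy, random stand-in codes replace the signatures that the paper's deep autoencoder would produce, and the names SIGNATURE_BYTES, hamming_distance and retrieve are hypothetical, not from the paper.

# Illustrative retrieval over compact binary signatures: each motion segment is
# reduced to a 20-byte (160-bit) code and database search becomes a per-row
# popcount of XOR bits (Hamming distance). Random codes stand in for the real
# autoencoder-derived signatures.
import numpy as np

SIGNATURE_BYTES = 20  # 160 bits per motion segment, per the abstract

def hamming_distance(a: np.ndarray, b: np.ndarray) -> int:
    """Number of differing bits between two byte-packed signatures."""
    return int(np.unpackbits(np.bitwise_xor(a, b)).sum())

def retrieve(query: np.ndarray, database: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k database signatures closest to the query."""
    xor = np.bitwise_xor(database, query)            # (N, 20) bytes
    dists = np.unpackbits(xor, axis=1).sum(axis=1)   # per-row bit count
    return np.argsort(dists)[:k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in database of 10,000 motion-segment signatures.
    db = rng.integers(0, 256, size=(10_000, SIGNATURE_BYTES), dtype=np.uint8)
    query = db[42].copy()
    query[0] ^= 0b00000001  # perturb one bit to simulate a near-duplicate query
    print(retrieve(query, db, k=3))  # segment 42 should rank first

Because each signature is only 20 bytes, even a large database fits comfortably in memory and the XOR-and-popcount comparison keeps query cost low, which is consistent with the interactive retrieval speeds the abstract reports.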
@inproceedings{10.1145-2822013.2822024,
  author    = {Wang, Yingying and Neff, Michael},
  title     = {Deep signatures for indexing and retrieval in large motion databases},
  booktitle = {Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games},
  pages     = {37--45},
  year      = {2015},
  doi       = {10.1145/2822013.2822024},
}