audeosynth: music-driven video montage
Zicheng Liao, Yizhou Yu, Bingchen Gong, Lechao Cheng
In ACM Transactions on Graphics (TOG), 34(4), August 2015.
Abstract: We introduce music-driven video montage, a media format that offers a pleasant way to browse or summarize video clips collected from various occasions, including gatherings and adventures. In music-driven video montage, the music drives the composition of the video content. According to musical movement and beats, video clips are organized to form a montage that visually reflects the experiential properties of the music. However, creating such a montage manually requires enormous effort and artistic expertise. In this paper, we develop a framework for automatically generating music-driven video montages. The input is a set of video clips and a piece of background music. By analyzing the music and video content, our system extracts carefully designed temporal features from the input, casts the synthesis problem as an optimization, and solves for its parameters through Markov chain Monte Carlo sampling. The output is a video montage whose visual activities are cut and synchronized with the rhythm of the music, rendering a symphony of audio-visual resonance.
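To make the optimization step concrete, below is a minimal, illustrative Python sketch of a Metropolis-style MCMC search that assigns clips to beat-delimited music segments. It is not the authors' implementation: the names and quantities (beat_times, clip_activity, the target-activity curve, the cost function, and the proposal move) are invented stand-ins for the paper's actual temporal features and energy terms.

import math
import random

# Hypothetical inputs (not from the paper): music beat times in seconds,
# and a per-clip "visual activity" score, e.g. mean optical-flow magnitude.
beat_times = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
clip_activity = {"clip_a": 0.2, "clip_b": 0.9, "clip_c": 0.5, "clip_d": 0.7}

# Music segments are the intervals between consecutive beats; each segment
# gets a target activity that rises over time (a toy stand-in for the
# paper's musical movement features).
segments = list(zip(beat_times[:-1], beat_times[1:]))
target_activity = [i / (len(segments) - 1) for i in range(len(segments))]

def cost(assignment):
    # Sum of squared mismatches between each segment's target activity and
    # the activity of the clip assigned to it (toy matching cost only).
    return sum((clip_activity[c] - t) ** 2
               for c, t in zip(assignment, target_activity))

def propose(assignment):
    # Proposal move: reassign one randomly chosen segment to a random clip.
    new = list(assignment)
    new[random.randrange(len(new))] = random.choice(list(clip_activity))
    return new

def mcmc(iterations=5000, temperature=0.05, seed=0):
    random.seed(seed)
    current = [random.choice(list(clip_activity)) for _ in segments]
    current_cost = cost(current)
    best, best_cost = current, current_cost
    for _ in range(iterations):
        candidate = propose(current)
        delta = cost(candidate) - current_cost
        # Metropolis acceptance: always accept improvements, otherwise
        # accept with probability exp(-delta / temperature).
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            current, current_cost = candidate, current_cost + delta
            if current_cost < best_cost:
                best, best_cost = current, current_cost
    return best, best_cost

if __name__ == "__main__":
    assignment, final_cost = mcmc()
    for (start, end), clip in zip(segments, assignment):
        print(f"{start:.1f}-{end:.1f}s -> {clip}")
    print(f"final cost: {final_cost:.4f}")

The real system optimizes over cut points and synchronization with the music's rhythm rather than a simple per-segment activity match; this sketch only shows the general shape of an MCMC-based assignment search under those assumed inputs.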
Article URL: http://doi.acm.org/10.1145/2766966
BibTeX format:
@article{10.1145-2766966,
  author = {Zicheng Liao and Yizhou Yu and Bingchen Gong and Lechao Cheng},
  title = {audeosynth: music-driven video montage},
  journal = {ACM Transactions on Graphics (TOG)},
  volume = {34},
  number = {4},
  articleno = {68},
  month = aug,
  year = {2015},
  doi = {10.1145/2766966},
}