Perceptually Guided High-Fidelity Rendering Exploiting Movement Bias in Visual Attention
Jasminka Hasic, Alan Chalmers, Elena Sikudova
In ACM Transactions on Applied Perception, 8(1), October 2010.
Abstract: A major obstacle to real-time rendering of high-fidelity graphics is computational complexity. A key point in the pursuit of "realism in real time" in computer graphics is that the Human Visual System (HVS) is a fundamental part of the rendering pipeline. The human eye can only sense image detail in a 2° foveal region, relying on rapid eye movements, or saccades, to jump between points of interest. These points of interest are prioritized according to the saliency of the objects in the scene or the task the user is performing. Such "glimpses" of a scene are then assembled by the HVS into a coherent, but inevitably imperfect, visual perception of the environment. In this process, much detail that the HVS deems unimportant may literally go unnoticed. Visual science research has identified that movement in the background of a scene may substantially influence how subjects perceive foreground objects. Furthermore, recent computer graphics work has shown that both fixed-viewpoint and dynamic scenes can be selectively rendered, without any perceptual loss of quality and in significantly reduced time, by exploiting knowledge of any high-saliency movement that may be present. A high-saliency movement can be generated in a scene when an otherwise static object starts moving. In this article, we investigate, through psychophysical experiments including eye tracking, the perception of rendering quality in dynamic complex scenes based on the introduction of a moving object. Two types of object movement are investigated: (i) rotation in place and (ii) rotation combined with translation. These were chosen as the simplest movement types; future studies may include movement with varied acceleration. The object's geometry and location in the scene are not salient. We then use this information to guide our high-fidelity selective renderer to produce perceptually high-quality images at significantly reduced computation times.
We also show how these results have important implications for virtual environment and computer game applications.
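The core idea of the selective renderer described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, parameters, and per-pixel sample budgeting scheme are assumptions. It allocates a frame's total ray-sample budget in proportion to a precomputed saliency map, so high-saliency regions (e.g. a newly moving object) receive full quality while low-saliency regions are undersampled.

```python
import numpy as np

def allocate_samples(saliency, total_budget, min_spp=1):
    """Distribute total_budget samples over pixels, weighted by saliency.

    saliency     -- 2D array in [0, 1] (hypothetical precomputed saliency map)
    total_budget -- total number of samples available for the frame
    min_spp      -- floor so every pixel still receives some samples
    """
    # Reserve the per-pixel floor first, then share out what remains.
    floor = min_spp * saliency.size
    spare = max(total_budget - floor, 0)
    weights = saliency / max(saliency.sum(), 1e-12)
    # Each pixel gets its floor plus a saliency-weighted share of the spare.
    spp = min_spp + np.floor(spare * weights).astype(int)
    return spp

# Toy 4x4 saliency map: a "moving object" occupies the bright quadrant.
sal = np.zeros((4, 4))
sal[:2, :2] = 1.0
spp = allocate_samples(sal, total_budget=64, min_spp=1)
```

With this toy map, the four salient pixels each receive 13 samples and the twelve non-salient pixels receive the floor of 1, spending the full budget of 64.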
Keyword(s): Movement saliency, attention, perception, saliency map, selective rendering
Article URL: http://doi.acm.org/10.1145/1857893.1857899
BibTeX format:
@article{Hasic:2010:PGH,
  author = {Jasminka Hasic and Alan Chalmers and Elena Sikudova},
  title = {Perceptually Guided High-Fidelity Rendering Exploiting Movement Bias in Visual Attention},
  journal = {ACM Transactions on Applied Perception},
  volume = {8},
  number = {1},
  pages = {6:1--6:19},
  month = oct,
  year = {2010},
}