
Bimodal perception of audio-visual material properties for virtual environments

Abstract: High-quality rendering of both audio and visual material properties is very important in interactive virtual environments, since convincingly rendered materials increase realism and the sense of immersion. We studied how the levels of detail of auditory and visual stimuli interact in the perception of audio-visual material rendering quality. Our study is based on perception of material discrimination when varying the levels of detail of modal synthesis for sound, and of Bidirectional Reflectance Distribution Functions for graphics. We performed an experiment for two different models (a Dragon and a Bunny model) and two material types (Plastic and Gold). The results show a significant interaction between auditory and visual level of detail in the perception of material similarity, when comparing approximate levels of detail to a high-quality audio-visual reference rendering. We show how this result can contribute to significant savings in computation time in an interactive audio-visual rendering system. To our knowledge this is the first study which shows interaction of audio and graphics representation in a material perception task.
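To make the audio side of the study concrete: modal synthesis renders an impact sound as a sum of exponentially damped sinusoids, and the audio level of detail corresponds to how many modes are kept. The sketch below is a minimal illustration of this general technique; the mode table and parameter values are hypothetical, not taken from the report.

```python
import math

def modal_synthesis(modes, n_modes, duration=0.1, sr=8000):
    """Render an impact sound as a sum of damped sinusoids.

    modes:   list of (frequency_hz, amplitude, decay_rate) tuples
    n_modes: audio level of detail -- how many modes to keep
             (fewer modes = cheaper to compute, lower quality)
    """
    n = int(duration * sr)
    out = [0.0] * n
    for f, a, d in modes[:n_modes]:
        for i in range(n):
            t = i / sr
            out[i] += a * math.exp(-d * t) * math.sin(2 * math.pi * f * t)
    return out

# Hypothetical mode table for a resonating object (illustrative only).
modes = [(220.0, 1.0, 6.0), (440.0, 0.5, 8.0), (660.0, 0.25, 10.0)]

full = modal_synthesis(modes, n_modes=3)   # high level of detail
lod1 = modal_synthesis(modes, n_modes=1)   # coarse approximation
```

Truncating the mode list is one simple way to trade audio quality for computation time, which is the kind of saving the abstract's perceptual results can help guide.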
Document type: Research Report
Contributor: George Drettakis
Submitted on: Wednesday, October 22, 2008, 7:05:54 PM
Last modification on: Friday, January 21, 2022, 4:13:15 AM
Long-term archiving on: Monday, June 7, 2010, 9:20:39 PM



  • HAL Id: inria-00333266, version 1


Nicolas Bonneel, Clara Suied, Isabelle Viaud-Delmon, George Drettakis. Bimodal perception of audio-visual material properties for virtual environments. [Research Report] RR-6687, INRIA. 2008, pp.18. ⟨inria-00333266⟩
