Bimodal perception of audio-visual material properties for virtual environments

Abstract: High-quality rendering of both audio and visual material properties is important in interactive virtual environments, since convincingly rendered materials increase realism and the sense of immersion. We studied how the levels of detail of auditory and visual stimuli interact in the perception of audio-visual material rendering quality. Our study is based on material discrimination, varying the level of detail of modal synthesis for sound and of Bidirectional Reflectance Distribution Functions for graphics. We performed an experiment with two models (a Dragon and a Bunny) and two material types (Plastic and Gold). The results show a significant interaction between auditory and visual level of detail in the perception of material similarity when approximate levels of detail are compared to a high-quality audio-visual reference rendering. We show how this result can yield significant savings in computation time in an interactive audio-visual rendering system. To our knowledge, this is the first study to show an interaction between audio and graphics representations in a material perception task.
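The abstract refers to varying the level of detail of modal sound synthesis. As background, a modal model represents an impact sound as a sum of exponentially damped sinusoids, and a natural level-of-detail control is the number of modes retained. The sketch below is a minimal illustration of that idea only; it is not the authors' implementation, and the mode representation (frequency, damping, amplitude triples) is an assumption.

```python
import math

def modal_synthesis(modes, n_samples, sample_rate=44100):
    """Synthesize an impact sound as a sum of damped sinusoids.

    Each mode is an assumed (freq_hz, damping, amplitude) triple:
        s(t) = sum_i  a_i * exp(-d_i * t) * sin(2 * pi * f_i * t)
    """
    out = [0.0] * n_samples
    for f, d, a in modes:
        for n in range(n_samples):
            t = n / sample_rate
            out[n] += a * math.exp(-d * t) * math.sin(2.0 * math.pi * f * t)
    return out

def reduce_modes(modes, k):
    """Level-of-detail control: keep only the k highest-amplitude modes."""
    return sorted(modes, key=lambda m: m[2], reverse=True)[:k]
```

Synthesizing with `reduce_modes(modes, k)` for decreasing `k` yields progressively cheaper approximations of the full modal sound, which is the kind of auditory level-of-detail axis the study manipulates.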
Document type: Research Report, RR-6687, INRIA, 2008, 18 pp.
Contributor: George Drettakis
Submitted on: Wednesday, October 22, 2008 - 19:05:54
Last modified on: Friday, January 4, 2019 - 17:33:03
Long-term archiving on: Monday, June 7, 2010 - 21:20:39

  • HAL Id: inria-00333266, version 1


Nicolas Bonneel, Clara Suied, Isabelle Viaud-Delmon, George Drettakis. Bimodal perception of audio-visual material properties for virtual environments. Research Report RR-6687, INRIA. 2008, 18 pp. ⟨inria-00333266⟩

