Bimodal perception of audio-visual material properties for virtual environments

Abstract: High-quality rendering of both audio and visual material properties is very important in interactive virtual environments, since convincingly rendered materials increase realism and the sense of immersion. We studied how the levels of detail of auditory and visual stimuli interact in the perception of audio-visual material rendering quality. Our study is based on the perception of material discrimination when varying the level of detail of modal synthesis for sound and of Bidirectional Reflectance Distribution Functions (BRDFs) for graphics. We performed an experiment with two different models (a Dragon and a Bunny model) and two material types (Plastic and Gold). The results show a significant interaction between auditory and visual level of detail in the perception of material similarity when comparing approximate levels of detail to a high-quality audio-visual reference rendering. We show how this result can lead to significant savings in computation time in an interactive audio-visual rendering system. To our knowledge, this is the first study to show an interaction between audio and graphics representations in a material perception task.
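In modal sound synthesis, the audio level of detail typically amounts to truncating the number of damped sinusoidal modes that are summed per impact. The Python sketch below illustrates this idea only; it is not taken from the report, the function name modal_sound is hypothetical, and the mode frequencies, dampings, and amplitudes are placeholder values rather than measured material parameters.

```python
import numpy as np

def modal_sound(freqs, dampings, amps, n_modes, duration=1.0, sr=44100):
    """Synthesize an impact sound from the first `n_modes` modes.

    Each mode is a damped sinusoid a * exp(-d * t) * sin(2*pi*f*t);
    lowering `n_modes` is the audio level-of-detail reduction.
    """
    t = np.arange(int(duration * sr)) / sr
    signal = np.zeros_like(t)
    for f, d, a in zip(freqs[:n_modes], dampings[:n_modes], amps[:n_modes]):
        signal += a * np.exp(-d * t) * np.sin(2.0 * np.pi * f * t)
    return signal

# Placeholder mode set, rendered at two levels of detail.
freqs    = [440.0, 1130.0, 2210.0, 3550.0, 5120.0]   # Hz (illustrative)
dampings = [3.0,   6.0,    9.0,    14.0,   20.0]     # 1/s (illustrative)
amps     = [1.0,   0.6,    0.4,    0.25,   0.15]

high_lod = modal_sound(freqs, dampings, amps, n_modes=5)  # reference quality
low_lod  = modal_sound(freqs, dampings, amps, n_modes=2)  # cheaper approximation
```

A perceptual study such as this one can then indicate how far n_modes (and, analogously, the BRDF approximation on the visual side) can be reduced before observers notice a change in the rendered material.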
Document type: Reports

https://hal.inria.fr/inria-00333266
Contributor: George Drettakis
Submitted on: Wednesday, October 22, 2008 - 7:05:54 PM
Last modification on: Wednesday, December 9, 2020 - 3:09:21 PM
Long-term archiving on: Monday, June 7, 2010 - 9:20:39 PM

File

RR-6687.pdf (files produced by the author(s))

Identifiers

  • HAL Id: inria-00333266, version 1

Citation

Nicolas Bonneel, Clara Suied, Isabelle Viaud-Delmon, George Drettakis. Bimodal perception of audio-visual material properties for virtual environments. [Research Report] RR-6687, INRIA. 2008, pp.18. ⟨inria-00333266⟩
