Integration of auditory and visual information in the recognition of realistic objects

Abstract: Recognizing a natural object requires one to pool information from various sensory modalities and to ignore information from competing objects. That the same semantic knowledge can be accessed through different modalities makes it possible to explore the retrieval of supramodal object concepts. Here, object-recognition processes were investigated by manipulating the relationships between sensory modalities, specifically the semantic content and the spatial alignment of auditory and visual information. Experiments were run in a realistic virtual environment. Participants were asked to react as fast as possible to a target object presented in the visual and/or the auditory modality and to inhibit a distractor object (go/no-go task). Spatial alignment had no effect on object-recognition time. The only spatial effect observed was a stimulus-response compatibility between the auditory stimulus and the hand position. Reaction times were significantly shorter for semantically congruent bimodal stimuli than would be predicted by independent processing of the auditory and visual target information. Interestingly, this bimodal facilitation effect was twice as large as that found in previous studies that also used information-rich stimuli. An interference effect (i.e., longer reaction times to semantically incongruent stimuli than to the corresponding unimodal stimulus) was observed only when the distractor was auditory. When the distractor was visual, the semantic incongruence did not interfere with object recognition. Our results show that immersive displays with large visual stimuli may produce large multimodal integration effects, and they reveal a possible asymmetry in the attentional filtering of irrelevant auditory and visual information.
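The comparison of bimodal reaction times against what "independent processing" of the two unimodal channels would predict is typically made with a race-model (probability-summation) bound. The notice does not detail the analysis, so the following Python sketch is only illustrative of how such a bound can be checked; the function names and reaction-time data are hypothetical.

    import numpy as np

    def ecdf(rts, t):
        # Empirical cumulative distribution of reaction times, evaluated at times t.
        rts = np.sort(np.asarray(rts))
        return np.searchsorted(rts, t, side="right") / rts.size

    def race_model_violations(rt_auditory, rt_visual, rt_bimodal, n_points=50):
        # Race-model inequality (Miller, 1982): under independent processing of the
        # auditory and visual channels, F_AV(t) <= F_A(t) + F_V(t) for every t.
        # Returns the time points at which the bimodal data exceed that bound.
        pooled = np.concatenate([rt_auditory, rt_visual, rt_bimodal])
        t = np.linspace(pooled.min(), pooled.max(), n_points)
        bound = np.minimum(ecdf(rt_auditory, t) + ecdf(rt_visual, t), 1.0)
        return t[ecdf(rt_bimodal, t) > bound]

    # Illustrative reaction times in milliseconds (hypothetical single participant).
    rng = np.random.default_rng(0)
    rt_a  = rng.normal(540, 60, 200)   # auditory-only target
    rt_v  = rng.normal(550, 60, 200)   # visual-only target
    rt_av = rng.normal(460, 50, 200)   # semantically congruent bimodal target
    print(race_model_violations(rt_a, rt_v, rt_av))

Any time point where the bimodal distribution exceeds the summed unimodal distributions indicates facilitation beyond what independent processing of the two channels could produce.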
Document type: Journal article
Experimental Brain Research, Springer Verlag, 2009, 194 (1), pp. 91-102. DOI: 10.1007/s00221-008-1672-6

https://hal.inria.fr/inria-00606822
Contributor: Team Reves
Submitted on: Thursday, July 7, 2011 - 11:15:07
Last modified on: Wednesday, March 21, 2018 - 18:57:08

Citation

Clara Suied, Nicolas Bonneel, Isabelle Viaud-Delmon. Integration of auditory and visual information in the recognition of realistic objects. Experimental Brain Research, Springer Verlag, 2009, 194 (1), pp. 91-102. DOI: 10.1007/s00221-008-1672-6. HAL: inria-00606822.
