
Accounting for Room Acoustics in Audio-Visual Multi-Speaker Tracking

Yutong Ban¹, Xiaofei Li¹, Xavier Alameda-Pineda¹, Laurent Girin¹·², Radu Horaud¹
¹ PERCEPTION - Interpretation and Modelling of Images and Videos, Inria Grenoble - Rhône-Alpes
² Grenoble INP - Institut polytechnique de Grenoble, LJK - Laboratoire Jean Kuntzmann
Abstract: Multiple-speaker tracking is a crucial task for many applications. In real-world scenarios, exploiting the complementarity between auditory and visual data makes it possible to track people outside the visual field of view. However, practical methods must be robust to changes in acoustic conditions, e.g. reverberation. We investigate how to combine state-of-the-art audio-source localization techniques with Bayesian multi-person tracking. Our experiments demonstrate that the performance of the proposed system is not affected by changes in the acoustic environment.
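The fusion the abstract describes can be illustrated with a minimal sketch: a Kalman-style measurement update that folds a precise visual detection and a noisier (e.g. reverberation-degraded) audio localization estimate into one belief about a speaker's position. This is an assumption-laden toy, not the paper's actual model — the function name `kalman_update`, the identity observation model, and all noise covariances below are hypothetical choices for illustration only.

```python
import numpy as np

def kalman_update(mean, cov, obs, obs_cov):
    """One Kalman measurement update, assuming the observation
    directly measures the state (identity observation model)."""
    gain = cov @ np.linalg.inv(cov + obs_cov)          # Kalman gain
    new_mean = mean + gain @ (obs - mean)              # correct the estimate
    new_cov = (np.eye(len(mean)) - gain) @ cov         # shrink the uncertainty
    return new_mean, new_cov

# Broad prior belief about a speaker's 2D position.
mean = np.array([0.0, 0.0])
cov = np.eye(2) * 1.0

# Fold in a precise visual detection, then a noisier audio estimate
# (larger covariance standing in for reverberation-induced error).
mean, cov = kalman_update(mean, cov, np.array([1.0, 0.5]), np.eye(2) * 0.1)
mean, cov = kalman_update(mean, cov, np.array([1.2, 0.4]), np.eye(2) * 0.5)
```

Because each modality is weighted by its own covariance, a degraded audio observation pulls the estimate only weakly, which is one intuition behind robustness to changing acoustic conditions.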
Contributor: Perception Team
Submitted on: February 27, 2018
Last modified on: November 3, 2021





Yutong Ban, Xiaofei Li, Xavier Alameda-Pineda, Laurent Girin, Radu Horaud. Accounting for Room Acoustics in Audio-Visual Multi-Speaker Tracking. ICASSP 2018 - IEEE International Conference on Acoustics, Speech and Signal Processing, Apr 2018, Calgary, Alberta, Canada. pp.6553-6557, ⟨10.1109/ICASSP.2018.8462100⟩. ⟨hal-01718114⟩


