A Self-Calibrating, Vision-Based Navigation Assistant

Abstract: We describe a body-worn sensor suite, environment representation, set of algorithms, and graphical-aural interface designed to provide human-centered guidance to a person moving through a complex space. The central idea underlying our approach is to model the environment as a graph of visually distinctive places (graph nodes) connected by path segments (graph edges). During exploration, our algorithm processes multiple video-rate inputs to identify visual features and construct the “place graph” representation of the traversed space. The system then provides visual and/or spoken guidance in user-centered terms to lead the user along existing or newly-synthesized paths. Our approach is novel in several respects: it requires no precise calibration of the cameras or multi-camera rig used; it generalizes to any number of cameras with any placement on the body; it learns the correlation between user motion and evolution of image features; it constructs the place graph automatically; and it provides only coarse (rather than precise metrical) guidance to the user. We present an experimental study of our methods applied to walking routes through both indoor and outdoor environments, and show that the system provides accurate localization and effective navigation guidance.
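The place-graph representation described in the abstract can be made concrete with a small sketch. The following is a minimal illustration under stated assumptions, not the authors' implementation: all class and method names (`PlaceGraph`, `localize`, `route`) are hypothetical, places are identified by visual feature sets, and localization is reduced to a simple feature-overlap score.

```python
from collections import defaultdict

class PlaceGraph:
    """Hypothetical sketch of a place graph: nodes are visually
    distinctive places, edges are traversed path segments."""

    def __init__(self):
        self.signatures = {}           # place id -> set of visual feature ids
        self.edges = defaultdict(set)  # place id -> neighboring place ids

    def add_place(self, place_id, features):
        self.signatures[place_id] = set(features)

    def add_segment(self, a, b):
        # Path segments are undirected: a traversed edge can be walked back.
        self.edges[a].add(b)
        self.edges[b].add(a)

    def localize(self, observed_features):
        # Coarse localization: return the place whose feature signature
        # best overlaps the currently observed features.
        observed = set(observed_features)
        return max(self.signatures,
                   key=lambda p: len(self.signatures[p] & observed))

    def route(self, start, goal):
        # Breadth-first search yields a shortest node sequence, from which
        # coarse, user-centered guidance could then be issued.
        frontier, parent = [start], {start: None}
        while frontier:
            node = frontier.pop(0)
            if node == goal:
                path = []
                while node is not None:
                    path.append(node)
                    node = parent[node]
                return path[::-1]
            for nxt in self.edges[node]:
                if nxt not in parent:
                    parent[nxt] = node
                    frontier.append(nxt)
        return None
```

Because guidance is expressed over graph nodes rather than metric coordinates, this structure is consistent with the paper's stated goal of providing only coarse, user-centered directions rather than precise metrical ones.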
Document type: Conference paper
Workshop on Computer Vision Applications for the Visually Impaired, Oct 2008, Marseille, France.

Cited literature: 16 references

Contributor: Peter Sturm <>
Submitted on: Monday, September 29, 2008 - 12:03:28
Last modified on: Monday, September 29, 2008 - 12:04:51
Document(s) archived on: Thursday, June 3, 2010 - 22:08:36


Files produced by the author(s)


  • HAL Id : inria-00325434, version 1



Olivier Koch, Seth Teller. A Self-Calibrating, Vision-Based Navigation Assistant. Workshop on Computer Vision Applications for the Visually Impaired, Oct 2008, Marseille, France. 2008. 〈inria-00325434〉


