Towards BCI-based Interfaces for Augmented Reality: Feasibility, Design and Evaluation

Hakim Si-Mohammed 1, Jimmy Petit 2, Camille Jeunet 1,3, Ferran Argelaguet 1, Fabien Spindler 4, Andéol Evain 1, Nicolas Roussel 5, Géry Casiez 6, Anatole Lécuyer 1
1 Hybrid - 3D interaction with virtual environments using body and mind, Inria Rennes – Bretagne Atlantique, IRISA_D6 - MEDIA ET INTERACTIONS
4 RAINBOW - Sensor-based and interactive robotics, Inria Rennes – Bretagne Atlantique, IRISA_D5 - SIGNAUX ET IMAGES NUMÉRIQUES, ROBOTIQUE
5 MJOLNIR - Computing tools to empower users, Inria Lille - Nord Europe, CRIStAL - Centre de Recherche en Informatique, Signal et Automatique de Lille (CRIStAL) - UMR 9189
6 LOKI - Technology and knowledge for interaction, Inria Lille - Nord Europe, CRIStAL - Centre de Recherche en Informatique, Signal et Automatique de Lille (CRIStAL) - UMR 9189
Abstract: Brain-Computer Interfaces (BCIs) enable users to interact with computers without any dedicated movement, bringing new hands-free interaction paradigms. In this paper, we study the combination of BCI and Augmented Reality (AR). We first tested the feasibility of using a BCI in AR settings based on Optical See-Through Head-Mounted Displays (OST-HMDs). Experimental results showed that a BCI and an OST-HMD (an EEG headset and a HoloLens in our case) are compatible, and that small head movements can be tolerated when using the BCI. Second, we introduced a design space for command display strategies based on BCI in AR, exploiting a well-known brain response called the Steady-State Visually Evoked Potential (SSVEP). Our design space relies on five dimensions concerning the visual layout of the BCI menu, namely: orientation, frame of reference, anchorage, size and explicitness. We implemented various BCI-based display strategies and tested them in the context of mobile robot control in AR. Our findings were finally integrated into an operational prototype in which a real mobile robot is controlled in AR using a BCI and a HoloLens headset. Taken together, our results (four user studies) and our methodology could pave the way to future interaction schemes in Augmented Reality exploiting 3D User Interfaces based on brain activity and BCIs.
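The command selection summarized above relies on the SSVEP paradigm: each AR command flickers at a distinct frequency, and the attended target is inferred from the spectral content of the EEG. The following minimal Python sketch is not taken from the paper; the flicker frequencies, sampling rate and the ssvep_target helper are illustrative assumptions showing one simple way a frequency-power SSVEP classifier could look.

import numpy as np

# Hypothetical sketch: pick which flickering AR command the user attends to
# by comparing EEG spectral power at each stimulation frequency.
FLICKER_FREQS_HZ = [10.0, 12.0, 15.0]   # one assumed frequency per displayed command
SAMPLING_RATE_HZ = 250                   # assumed EEG sampling rate

def ssvep_target(eeg_window: np.ndarray) -> int:
    """Return the index of the most likely attended target.

    eeg_window: array of shape (n_channels, n_samples), e.g. occipital channels.
    """
    n_samples = eeg_window.shape[1]
    spectrum = np.abs(np.fft.rfft(eeg_window, axis=1)) ** 2
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / SAMPLING_RATE_HZ)

    scores = []
    for f in FLICKER_FREQS_HZ:
        # Sum power in a narrow band around the flicker frequency and its
        # first harmonic, summed over channels.
        band = (np.abs(freqs - f) < 0.5) | (np.abs(freqs - 2 * f) < 0.5)
        scores.append(spectrum[:, band].sum())
    return int(np.argmax(scores))

if __name__ == "__main__":
    # Example: 4 s of simulated noise over 8 channels.
    window = np.random.randn(8, 4 * SAMPLING_RATE_HZ)
    print("Selected command index:", ssvep_target(window))

In practice, SSVEP classifiers often use more robust methods such as canonical correlation analysis rather than raw band power, but the principle of mapping each displayed command to a flicker frequency is the same.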
Document type: Journal article
IEEE Transactions on Visualization and Computer Graphics, Institute of Electrical and Electronics Engineers, 2018, pp. 1-12. DOI: 10.1109/TVCG.2018.2873737

https://hal.inria.fr/hal-01947344
Contributor: Ferran Argelaguet Sanz
Submitted on: Thursday, December 6, 2018 - 16:40:39
Last modified on: Wednesday, January 16, 2019 - 11:25:49

File: Manuscript.pdf (files produced by the author(s))

Identifiers: hal-01947344, DOI: 10.1109/TVCG.2018.2873737

Citation

Hakim Si-Mohammed, Jimmy Petit, Camille Jeunet, Ferran Argelaguet, Fabien Spindler, et al. Towards BCI-based Interfaces for Augmented Reality: Feasibility, Design and Evaluation. IEEE Transactions on Visualization and Computer Graphics, Institute of Electrical and Electronics Engineers, 2018, pp. 1-12. DOI: 10.1109/TVCG.2018.2873737. hal-01947344.
