Responsive action generation by physically-based motion retrieval and adaptation

Xiubo Liang 1,2, Ludovic Hoyet 3, Weidong Geng 1, Franck Multon 2,3
3 BUNRAKU - Perception, decision and action of real and virtual humans in virtual environments and impact on real environments
IRISA - Institut de Recherche en Informatique et Systèmes Aléatoires, ENS Cachan - École normale supérieure - Cachan, Inria Rennes – Bretagne Atlantique
Abstract: Responsive motion generation for avatars that physically interact with their environment is a key issue in VR and video games. We present a performance-driven avatar control interface with physically-based motion retrieval. When an interaction between the user-controlled avatar and its environment is about to happen, the avatar has to select the motion clip that satisfies both kinematic and dynamic constraints. A two-step process is proposed. First, it selects a set of candidate motions according to the performance of the user. Second, these candidate motions are further ranked according to their capability to satisfy dynamic constraints such as balance and comfort. The motion with the highest score is finally adapted to accurately satisfy the kinematic constraints imposed by the virtual world. Experimental results show that the method can efficiently control the avatar through an intuitive performance-based interface relying on a few motion sensors.
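
The two-step selection described in the abstract can be read as a filter-then-rank scheme. Below is a minimal Python sketch of that idea; the clip representation, the kinematic distance metric, and the balance/comfort scores and weights are hypothetical placeholders for illustration, not the paper's actual formulation.

    from dataclasses import dataclass
    from typing import List, Sequence

    @dataclass
    class MotionClip:
        name: str
        features: Sequence[float]  # kinematic descriptor of the clip (assumed)
        balance: float             # precomputed balance score in [0, 1] (assumed)
        comfort: float             # precomputed comfort score in [0, 1] (assumed)

    def kinematic_distance(a: Sequence[float], b: Sequence[float]) -> float:
        # Placeholder metric: Euclidean distance between feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def retrieve(performance: Sequence[float],
                 database: List[MotionClip],
                 k: int = 5,
                 w_balance: float = 0.5,
                 w_comfort: float = 0.5) -> MotionClip:
        # Step 1: keep the k clips kinematically closest to the user's performance.
        candidates = sorted(
            database,
            key=lambda clip: kinematic_distance(performance, clip.features))[:k]
        # Step 2: rank candidates by a weighted dynamic score and return the best;
        # the selected clip would then be adapted to the exact kinematic
        # constraints imposed by the virtual world.
        return max(candidates,
                   key=lambda clip: w_balance * clip.balance
                                  + w_comfort * clip.comfort)

The weighted sum in step 2 is only one plausible way to combine the dynamic criteria; the paper states that candidates are ranked by their capability to satisfy balance and comfort constraints without committing to a specific aggregation here.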
Document type:
Conference paper
Motion in Games 2010, Nov 2010, Zeist, Netherlands.

https://hal.inria.fr/inria-00536009
Contributor: Ludovic Hoyet
Submitted on: Monday, November 15, 2010 - 08:03:51
Last modified on: Wednesday, May 16, 2018 - 11:23:18

Identifiers

  • HAL Id: inria-00536009, version 1

Citation

Xiubo Liang, Ludovic Hoyet, Weidong Geng, Franck Multon. Responsive action generation by physically-based motion retrieval and adaptation. Motion in Games 2010, Nov 2010, Zeist, Netherlands. 2010. 〈inria-00536009〉

