Responsive action generation by physically-based motion retrieval and adaptation

Xiubo Liang (1, 2), Ludovic Hoyet (3), Weidong Geng (1), Franck Multon (2, 3)
3 BUNRAKU - Perception, decision and action of real and virtual humans in virtual environments and impact on real environments
IRISA - Institut de Recherche en Informatique et Systèmes Aléatoires, ENS Cachan - École normale supérieure - Cachan, Inria Rennes – Bretagne Atlantique
Abstract : Responsive motion generation for avatars that physically interact with their environment is a key issue in VR and video games. We present a performance-driven avatar control interface with physically-based motion retrieval. When an interaction between the user-controlled avatar and its environment is about to occur, the avatar has to select the motion clip that satisfies both kinematic and dynamic constraints. A two-step process is proposed. First, a set of candidate motions is selected according to the performance of the user. Second, these candidate motions are ranked according to their capability to satisfy dynamic constraints such as balance and comfort. The motion with the highest score is finally adapted so that it accurately satisfies the kinematic constraints imposed by the virtual world. Experimental results show that the system efficiently controls the avatar through an intuitive performance-based interface relying on a few motion sensors.
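
The two-step selection described in the abstract can be summarized by the Python sketch below. It is an illustration only, not code from the paper: MotionClip, kinematic_distance, balance_score, comfort_score and adapt are hypothetical placeholders for the kinematic matching, the balance/comfort measures and the final adaptation step mentioned above.

```python
from dataclasses import dataclass
from typing import Callable, List, Sequence

@dataclass
class MotionClip:
    name: str
    frames: Sequence  # per-frame joint configuration (placeholder)

def select_and_adapt(
    clips: List[MotionClip],
    kinematic_distance: Callable[[MotionClip], float],
    balance_score: Callable[[MotionClip], float],
    comfort_score: Callable[[MotionClip], float],
    adapt: Callable[[MotionClip], MotionClip],
    k: int = 10,
) -> MotionClip:
    # Step 1: keep the k clips whose kinematics best match the user's
    # sensor-based performance (smaller distance = better match).
    candidates = sorted(clips, key=kinematic_distance)[:k]

    # Step 2: rank the candidates by how well they satisfy dynamic
    # constraints such as balance and comfort; keep the highest score.
    best = max(candidates, key=lambda c: balance_score(c) + comfort_score(c))

    # Finally, adapt the winning clip so that it accurately satisfies the
    # kinematic constraints imposed by the virtual environment.
    return adapt(best)
```

In this sketch the matching, scoring and adaptation strategies are injected as callables, since the paper's record does not specify how they are computed.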
Document type :
Conference papers

https://hal.inria.fr/inria-00536009
Contributor : Ludovic Hoyet
Submitted on : Monday, November 15, 2010 - 8:03:51 AM
Last modification on : Friday, November 16, 2018 - 1:27:17 AM

Identifiers

  • HAL Id : inria-00536009, version 1

Citation

Xiubo Liang, Ludovic Hoyet, Weidong Geng, Franck Multon. Responsive action generation by physically-based motion retrieval and adaptation. Motion in Games 2010, Nov 2010, Zeist, Netherlands. ⟨inria-00536009⟩
