A motion capture-based control-space approach for walking mannequins

Abstract: Virtual mannequins need to navigate in order to interact with their environment. Their autonomy in accomplishing navigation tasks is ensured by locomotion controllers. Control inputs can be user-defined or computed automatically to achieve high-level operations (e.g. obstacle avoidance). This paper presents a locomotion controller based on a motion capture editing technique. The controller inputs are the instantaneous linear and angular velocities of the walk. Our solution works in real time and supports continuous changes of the inputs at any time. The controller combines three main components to synthesize locomotion animations in a four-stage process. First, the Motion Library stores motion capture samples; each sample is analysed to compute its quantitative characteristics. Second, these characteristics are represented in a linear control space. This geometric representation is suitable for selecting and weighting three motion samples with respect to the input state. Third, locomotion cycles are synthesized by blending the selected motion samples; blending is done in the frequency domain. Lastly, successive postures are extracted from the synthesized cycles to complete the animation of the moving mannequin. The method is demonstrated in this paper in a locomotion-planning context.
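
The full text details the selection and blending schemes; the sketch below is only a rough illustration of the two ideas named in the abstract, assuming NumPy, with all names and sample values invented for the example rather than taken from the paper. It computes barycentric weights for three motion samples surrounding a desired (linear velocity, angular velocity) point in the 2D control space, then blends cyclic joint-angle signals through their Fourier coefficients so the result stays periodic.

    import numpy as np

    def barycentric_weights(p, a, b, c):
        # Weights of point p with respect to triangle (a, b, c) in the
        # control space, whose axes are the linear velocity v and the
        # angular velocity omega of the walk. Solves
        # p = w0*a + w1*b + w2*c subject to w0 + w1 + w2 = 1.
        m = np.array([[a[0], b[0], c[0]],
                      [a[1], b[1], c[1]],
                      [1.0, 1.0, 1.0]])
        return np.linalg.solve(m, np.array([p[0], p[1], 1.0]))

    def blend_cycles(signals, weights):
        # Blend gait cycles (joint-angle signals resampled to a common
        # length) by weighting their Fourier coefficients; blending the
        # spectra rather than raw frames keeps the result periodic.
        spectra = [np.fft.rfft(s) for s in signals]
        blended = sum(w * sp for w, sp in zip(weights, spectra))
        return np.fft.irfft(blended, n=len(signals[0]))

    # Example: a walk at 1.2 m/s turning at 0.3 rad/s, interpolated
    # from three neighbouring samples; the weights sum to 1.
    w = barycentric_weights((1.2, 0.3), (1.0, 0.0), (1.5, 0.5), (1.0, 0.6))
    t = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
    cycles = [np.sin(t), np.sin(t + 0.2), np.sin(t + 0.4)]  # stand-in joint-angle cycles
    blended = blend_cycles(cycles, w)

If the input point lies inside the triangle formed by the three selected samples, all weights are positive, which is presumably why three samples enclosing the current input state are selected.
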
Document type:
Journal article
Computer Animation and Virtual Worlds, Wiley, 2006, 17 (2), pp. 109-126. ⟨10.1002/cav.v17:2⟩

https://hal.inria.fr/inria-00473320
Contributor: Julien Pettré
Submitted on: Thursday, April 15, 2010 - 10:26:32
Last modified on: Tuesday, September 11, 2018 - 15:18:15

Citation

Julien Pettré, Jean-Paul Laumond. A motion capture-based control-space approach for walking mannequins. Computer Animation and Virtual Worlds, Wiley, 2006, 17 (2), pp. 109-126. ⟨10.1002/cav.v17:2⟩. ⟨inria-00473320⟩
