Visually-guided grasping while walking on a humanoid robot

Abstract: In this paper, we apply a general framework for building complex whole-body controllers for highly redundant robots, and we implement it for visually-guided grasping while walking on a humanoid robot. The key idea is to divide the control into several sensor-based control tasks that are executed simultaneously by a general structure called the stack of tasks. This structure provides very simple access for task sequencing and can be used for task-level control. The framework was applied to a visual-servoing task: the robot walks along a planned path, keeping the specified object in the middle of its field of view, and finally, when it is close enough, grasps the object while walking.
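The stack of tasks resolves prioritized tasks by projecting each lower-priority task into the null space of the tasks above it, so that secondary objectives never disturb the primary one. A minimal sketch of this recursive null-space prioritization is shown below; the function name and the plain pseudoinverse formulation are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def stack_of_tasks_velocity(tasks, dof):
    """Joint velocities for a prioritized stack of tasks.

    tasks: list of (J, e_dot) pairs ordered by decreasing priority,
           where J is the task Jacobian and e_dot the desired task velocity.
    dof:   number of robot joints.
    """
    q_dot = np.zeros(dof)
    P = np.eye(dof)  # null-space projector of all higher-priority tasks
    for J, e_dot in tasks:
        JP = J @ P                     # Jacobian restricted to the remaining null space
        JP_pinv = np.linalg.pinv(JP)
        # add the correction that realizes this task without disturbing the ones above
        q_dot = q_dot + JP_pinv @ (e_dot - J @ q_dot)
        # shrink the null space for the next (lower-priority) task
        P = P - JP_pinv @ JP
    return q_dot
```

For example, with a primary task driving joint 1 and a compatible secondary task, both task velocities are realized exactly; when they conflict, the primary task wins and the secondary is satisfied only in its null space.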
Document type:
Conference paper
IEEE Int. Conf. on Robotics and Automation, ICRA'07, Roma, Italy, pp. 3041-3047, 2007

Cited literature: 24 references

https://hal.inria.fr/inria-00350654
Contributor: Eric Marchand
Submitted on: Wednesday, January 7, 2009 - 14:27:19
Last modified on: Thursday, August 23, 2018 - 16:10:01
Long-term archiving on: Tuesday, June 8, 2010 - 16:55:12

File

2007_icra_mansard.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : inria-00350654, version 1

Citation

Nicolas Mansard, Olivier Stasse, François Chaumette, K. Yokoi. Visually-guided grasping while walking on a humanoid robot. IEEE Int. Conf. on Robotics and Automation, ICRA'07, Roma, Italy, pp. 3041-3047, 2007. 〈inria-00350654〉


Metrics

Record views: 399
File downloads: 248