Visually-guided grasping while walking on a humanoid robot

Abstract : In this paper, we apply a general framework for building complex whole-body controllers for highly redundant robots, and we implement it for visually-guided grasping while walking on a humanoid robot. The key idea is to divide the control into several sensor-based control tasks that are executed simultaneously by a general structure called the stack of tasks. This structure gives simple access to task sequencing and can be used for task-level control. The framework was applied to a visual-servoing task: the robot walks along a planned path while keeping the specified object in the middle of its field of view, and finally, when it is close enough, it grasps the object while walking.
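The stack-of-tasks idea summarized above can be illustrated with a minimal numpy sketch of prioritized null-space resolution: each task contributes a Jacobian and a desired error velocity, and lower-priority tasks act only in the null space of the tasks above them. This is an illustrative sketch of the general technique, not the authors' implementation; all names are invented for the example.

```python
import numpy as np

def stack_of_tasks(tasks, n_dof):
    """Resolve a prioritized stack of tasks into joint velocities.

    `tasks` is a list of (J, de) pairs, ordered from highest to lowest
    priority, where J is the task Jacobian (m x n_dof) and de the desired
    task-error velocity (m,).  Each task is solved in the null space of
    all higher-priority tasks (recursive projection scheme, a sketch).
    """
    qdot = np.zeros(n_dof)
    P = np.eye(n_dof)                 # projector onto the remaining free space
    for J, de in tasks:
        JP = J @ P                    # Jacobian restricted to the free space
        JP_pinv = np.linalg.pinv(JP)
        # add the correction needed for this task without disturbing
        # the tasks already stacked above it
        qdot = qdot + JP_pinv @ (de - J @ qdot)
        P = P - JP_pinv @ JP          # shrink the free space accordingly
    return qdot

# Hypothetical 4-DoF example: a 2-D "gaze" task with priority over a
# 1-D "reach" task acting on a disjoint joint.
J_gaze = np.array([[1., 0., 0., 0.],
                   [0., 1., 0., 0.]])
J_reach = np.array([[0., 0., 1., 0.]])
qdot = stack_of_tasks([(J_gaze, np.array([1.0, 0.5])),
                       (J_reach, np.array([2.0]))], n_dof=4)
```

Because each lower-priority task is projected through `P`, it cannot perturb the error dynamics of the tasks above it, which is what allows, for instance, the grasping task to be pushed onto the stack while the gaze task keeps the object centered.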
Document type : Conference papers
Cited literature : 24 references
Contributor : Eric Marchand
Submitted on : Wednesday, January 7, 2009 - 2:27:19 PM
Last modification on : Tuesday, August 2, 2022 - 3:56:34 AM
Long-term archiving on : Tuesday, June 8, 2010 - 4:55:12 PM
Files produced by the author(s)
HAL Id : inria-00350654, version 1


Nicolas Mansard, Olivier Stasse, François Chaumette, K. Yokoi. Visually-guided grasping while walking on a humanoid robot. IEEE Int. Conf. on Robotics and Automation, ICRA'07, 2007, Rome, Italy. pp. 3041-3047. ⟨inria-00350654⟩


