
Visually-guided grasping while walking on a humanoid robot

Abstract: In this paper, we apply a general framework for building complex whole-body controllers for highly redundant robots, and we implement it for visually-guided grasping while walking on a humanoid robot. The key idea is to divide the control into several sensor-based tasks that are executed simultaneously by a general structure called the stack of tasks. This structure makes task sequencing very simple and can be used for task-level control. The framework was applied to a visual servoing task: the robot walks along a planned path while keeping the specified object in the middle of its field of view, and finally, when it is close enough, grasps the object while walking.
Document type :
Conference papers

Cited literature: 24 references

https://hal.inria.fr/inria-00350654
Contributor : Eric Marchand
Submitted on : Wednesday, January 7, 2009 - 2:27:19 PM
Last modification on : Tuesday, June 15, 2021 - 4:06:52 PM
Long-term archiving on : Tuesday, June 8, 2010 - 4:55:12 PM

File

2007_icra_mansard.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : inria-00350654, version 1

Citation

Nicolas Mansard, Olivier Stasse, François Chaumette, Kazuhito Yokoi. Visually-guided grasping while walking on a humanoid robot. IEEE Int. Conf. on Robotics and Automation, ICRA'07, 2007, Roma, Italy. pp. 3041-3047. ⟨inria-00350654⟩
