Journal articles

From motor to visually guided bimanual affordance learning

Abstract: The mechanisms by which the brain orchestrates multi-limb joint action have yet to be elucidated, and few computational sensorimotor (SM) learning approaches have dealt with the problem of acquiring bimanual affordances. We propose a series of bidirectional (forward/inverse) SM maps and their associated learning processes that generalize naturally from uni- to bimanual interaction (and affordances), reinforcing the motor equivalence property. The SM maps range from a fully sensorimotor nature to a purely sensory one: full body control, delta SM control (through small action changes), and delta sensory co-variation (how body-related perceptual cues covary with object-related ones). We make several contributions on how these SM maps are learned: (1) Context- and Behavior-Based Babbling: generalizing goal babbling to the interleaving of absolute and local goals, including the guidance of reflexive behaviors; (2) Event-Based Learning: learning steps are driven by visual and haptic events; and (3) Affordance Gradients: the vector field gradients along which an object can be manipulated. Our modeling of bimanual affordances is in line with current robotic research in forward visuomotor mappings and visual servoing, enforces the motor equivalence property, and is consistent with neurophysiological findings such as the multiplicative encoding scheme.
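The goal-babbling idea underlying contribution (1) can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the authors' model: a toy 2-joint planar arm stands in for the body, a nearest-neighbour lookup stands in for the inverse SM map, and local Gaussian perturbations stand in for exploration around local goals.

```python
import numpy as np

def forward_model(action):
    """Toy forward SM map: 2-joint planar arm; action = joint angles -> hand (x, y)."""
    l1, l2 = 1.0, 1.0
    a1, a2 = action
    x = l1 * np.cos(a1) + l2 * np.cos(a1 + a2)
    y = l1 * np.sin(a1) + l2 * np.sin(a1 + a2)
    return np.array([x, y])

rng = np.random.default_rng(0)

# Motor babbling bootstrap: random actions populate the SM memory.
actions = rng.uniform(-np.pi, np.pi, size=(200, 2))
outcomes = np.array([forward_model(a) for a in actions])

def inverse_model(goal):
    """Nearest-neighbour inverse: reuse the action whose outcome is closest to the goal."""
    i = np.argmin(np.linalg.norm(outcomes - goal, axis=1))
    return actions[i]

# Goal babbling: sample goals in outcome space, execute the inferred action
# with a small local perturbation (a 'delta' exploration step), and store the result.
for _ in range(500):
    goal = rng.uniform(-2, 2, size=2)
    a = inverse_model(goal) + rng.normal(0, 0.05, size=2)
    o = forward_model(a)
    actions = np.vstack([actions, a])
    outcomes = np.vstack([outcomes, o])

# After learning, the reaching error for a reachable goal should be small.
goal = np.array([1.0, 1.0])
err = np.linalg.norm(forward_model(inverse_model(goal)) - goal)
print(f"reaching error: {err:.3f}")
```

Sampling goals in outcome space (rather than actions in motor space) concentrates exploration where it improves reaching competence, which is the core benefit goal babbling offers over pure motor babbling.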
Contributor: Clément Moulin-Frier
Submitted on: Tuesday, November 23, 2021 - 7:56:50 PM
Last modification on: Tuesday, August 2, 2022 - 4:24:38 AM



Martí Sánchez-Fibla, Sébastien Forestier, Clément Moulin-Frier, Jordi-Ysard Puigbò, Paul F. M. J. Verschure. From motor to visually guided bimanual affordance learning. Adaptive Behavior, 2020, 28 (2), pp.63-78. ⟨10.1177/1059712319855836⟩. ⟨hal-03445197⟩


