Conference papers

Inferring symbols from demonstrations to support vector-symbolic planning in a robotic assembly task

Abstract: While deep reinforcement learning is increasingly used to solve complex sensorimotor tasks, it requires vast amounts of experience to achieve adequate performance. Humans, by comparison, can correctly learn many tasks from just a small handful of demonstrations. A common explanation of this difference points to the human use of symbolic representations that constrain search spaces, enable instruction following, and promote the transfer and reuse of knowledge across tasks. In prior work, we developed neural network models that use vector-symbolic representations to plan complex actions, but the challenge of learning such representations and grounding them in sensorimotor data remains unsolved. As a step towards addressing this challenge, we use a programming-by-demonstration approach that automatically extracts symbols from sensorimotor data during a simulated robotic assembly task. The system first observes a small number of demonstrations of the task, and then constructs models of the associated states and actions. The system infers task-relevant states by clustering the positions and orientations of objects manipulated during the demonstrations. It then builds vector-symbolic representations of these states, and of actions defined in terms of transitions between states. This approach allows rapid and automatic sensory grounding of vector-symbolic representations, which in turn support flexible, robust, and biologically plausible control.
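The pipeline the abstract describes — cluster demonstrated object poses into discrete task-relevant states, assign each inferred state a symbol vector, and represent actions as transitions between state representations — can be illustrated roughly as follows. This is a minimal sketch, not the authors' implementation: the use of k-means clustering, circular-convolution (holographic reduced representation) binding, and all names, dimensions, and data here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(points, k, iters=50):
    # Simple k-means: cluster observed object poses into candidate states.
    centers = points[rng.choice(len(points), k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

def bind(a, b):
    # Circular-convolution binding (holographic reduced representations).
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def approx_inverse(v):
    # HRR involution: approximate inverse under circular convolution.
    return np.concatenate(([v[0]], v[1:][::-1]))

def unit_vector(d):
    v = rng.normal(size=d)
    return v / np.linalg.norm(v)

D = 256  # dimensionality of the symbol vectors (assumed)

# Hypothetical demonstration data: 2-D object positions observed
# near two assembly sites during the demonstrations.
site_a = rng.normal([0.2, 0.1], 0.01, size=(20, 2))
site_b = rng.normal([0.6, 0.4], 0.01, size=(20, 2))
poses = np.vstack([site_a, site_b])

# Infer task-relevant states by clustering the observed poses.
centers, labels = kmeans(poses, k=2)

# Assign a random symbol vector to each inferred state cluster,
# plus a symbol for the manipulated object.
state_syms = [unit_vector(D) for _ in centers]
OBJ = unit_vector(D)

# A state representation binds the object to its location symbol.
s0 = bind(OBJ, state_syms[0])
s1 = bind(OBJ, state_syms[1])

# An action is defined as a transition between state representations.
action = s1 - s0          # "move object from state 0 to state 1"
result = s0 + action      # applying the action recovers the goal state

# Unbinding the object symbol from s0 should approximately recover
# the symbol of the state the object occupies (noisy but decodable).
decoded = bind(s0, approx_inverse(OBJ))
sim = decoded @ state_syms[0] / (np.linalg.norm(decoded) * np.linalg.norm(state_syms[0]))
print(round(float(sim), 2))
```

The difference-vector action is the simplest possible transition model; unbinding shows how a downstream planner could query which discrete state an object occupies from the bound representation, with a cleanup memory resolving the residual noise.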
Complete list of metadata

https://hal.inria.fr/hal-03041290
Contributor: Xavier Hinaut
Submitted on: Friday, December 4, 2020 - 6:43:12 PM
Last modified on: Tuesday, December 8, 2020 - 3:07:09 AM

File

Yao2020_1st-SMILES-workshop_IC...
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03041290, version 1

Citation

Xueyang Yao, Saeejith Nair, Peter Blouw, Bryan Tripp. Inferring symbols from demonstrations to support vector-symbolic planning in a robotic assembly task. ICDL 2020 - 1st SMILES (Sensorimotor Interaction, Language and Embodiment of Symbols) workshop, Nov 2020, Valparaiso / Virtual, Chile. ⟨hal-03041290⟩
