Conference Paper, Year: 2019

Learning to Parse Grounded Language using Reservoir Computing

Xavier Hinaut (1), Michael Spranger (2)
Recently, new models for language processing and learning based on Reservoir Computing have become popular. However, these models are typically not grounded in sensorimotor systems and robots. In this paper, we develop a Reservoir Computing model called Reservoir Parser (ResPars) for learning to parse natural language from grounded data coming from humanoid robots. Previous work showed that ResPars is able to perform syntactic generalization over different sentences (surface structure) with the same meaning (deep structure). We argue that this ability is key to guiding linguistic generalization in a grounded architecture. We show that ResPars is able to generalize over grounded compositional semantics by combining it with Incremental Recruitment Language (IRL). Additionally, we show that ResPars is able to generalize over the same sentences when they are presented not word by word, but as an unsegmented sequence of phonemes. This ability allows the architecture not to rely solely on the words produced by a speech recognizer, but to process the sub-word level directly. We additionally test the model's robustness to word recognition errors.
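The core mechanism behind a Reservoir Computing model such as ResPars is a fixed, randomly connected recurrent network (the reservoir) whose states are mapped to outputs by a single trained linear readout. The following is a minimal illustrative echo state network sketch of that idea, not the authors' implementation; the dimensions, leak rate, and the toy one-hot "sentence" input are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not the paper's settings).
n_inputs, n_reservoir, n_outputs = 5, 100, 3

# Input and recurrent weights are random and stay FIXED (never trained).
W_in = rng.uniform(-1, 1, (n_reservoir, n_inputs))
W = rng.uniform(-1, 1, (n_reservoir, n_reservoir))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(inputs, leak=0.3):
    """Drive the reservoir with a sequence of input vectors, collecting states."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        # Leaky-integrator update: blend previous state with new activation.
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy sequence: four "words" encoded as one-hot vectors (a stand-in sentence).
sentence = np.eye(n_inputs)[[0, 2, 1, 3]]
X = run_reservoir(sentence)  # shape (4, n_reservoir)

# Only the linear readout is learned, here by ridge regression
# against a dummy teacher signal (a placeholder for meaning labels).
targets = rng.uniform(-1, 1, (len(sentence), n_outputs))
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ targets).T
predictions = X @ W_out.T  # one output vector per word step, shape (4, 3)
```

Because only the readout is trained, learning reduces to a single linear regression over collected reservoir states, which is what makes these models cheap to train on sentence sequences.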
Main file: HinautSpranger2019_ICDL_camera-ready.pdf (1005 KB)
Origin: files produced by the author(s)

Dates and versions

hal-02422157, version 1 (20-12-2019)



Xavier Hinaut, Michael Spranger. Learning to Parse Grounded Language using Reservoir Computing. ICDL-EpiRob 2019 - Joint IEEE 9th International Conference on Development and Learning and Epigenetic Robotics, Aug 2019, Oslo, Norway. ⟨10.1109/devlrn.2019.8850718⟩. ⟨hal-02422157⟩
67 views, 176 downloads


