
Learning Semantic Components from Subsymbolic Multimodal Perception

Olivier Mangin 1 Pierre-Yves Oudeyer 1
1 Flowers - Flowing Epigenetic Robots and Systems
Inria Bordeaux - Sud-Ouest, U2IS - Unité d'Informatique et d'Ingénierie des Systèmes
Abstract: Perceptual systems often include sensors from several modalities. However, existing robots still struggle to discover patterns that are spread across the flow of multimodal data they receive. In this paper we present a framework that learns a dictionary of words from full spoken utterances, together with a set of gestures from human demonstrations and the semantic connection between words and gestures. We explain how to use a nonnegative matrix factorization algorithm to learn a dictionary of components that represent meaningful elements present in the multimodal perception, without providing the system with a symbolic representation of the semantics. We illustrate this framework by showing how a learner discovers word-like components from observation of gestures made by a human together with spoken descriptions of the gestures, and how it captures the semantic association between the two.
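The core idea described in the abstract — factoring a nonnegative multimodal data matrix into a dictionary of cross-modal components and their activations — can be sketched with a plain multiplicative-update NMF on toy data. This is a minimal illustration, not the authors' exact algorithm: the feature dimensions, the synthetic data, and the Lee–Seung-style update rule are all assumptions made for the example. Each column of the data matrix stacks a "speech" feature vector on top of a "gesture" feature vector, so each learned dictionary atom spans both modalities and can encode a word–gesture association.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for multimodal observations: each column of V stacks a
# speech-feature vector on top of a gesture-feature vector.
# Dimensions are illustrative, not from the paper.
n_speech, n_gesture, n_samples, k = 20, 15, 60, 5
true_dict = rng.random((n_speech + n_gesture, k))   # ground-truth components
true_coef = rng.random((k, n_samples))              # their activations
V = true_dict @ true_coef                           # nonnegative data matrix

# Multiplicative-update NMF: V ≈ W @ H, with W a dictionary of
# multimodal components and H the per-observation activations.
W = rng.random((V.shape[0], k)) + 1e-3
H = rng.random((k, n_samples)) + 1e-3
eps = 1e-9  # guards against division by zero
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

rel_error = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because each atom of `W` has both a speech part (`W[:n_speech]`) and a gesture part (`W[n_speech:]`), a component active in one modality carries its counterpart in the other — which is how, without any symbolic labels, the factorization can capture word–gesture associations.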
Document type :
Conference papers
Cited literature: 21 references
Contributor: Olivier Mangin
Submitted on: Monday, July 8, 2013 - 3:45:00 PM
Last modification on: Wednesday, July 3, 2019 - 10:48:04 AM
Long-term archiving on: Wednesday, April 5, 2017 - 8:14:42 AM
  • HAL Id: hal-00842453, version 1



Olivier Mangin, Pierre-Yves Oudeyer. Learning Semantic Components from Subsymbolic Multimodal Perception. Joint IEEE International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob), Aug 2013, Osaka, Japan. ⟨hal-00842453⟩