
A parallel growing architecture for self-organizing maps with unsupervised learning

Abstract: Self-organizing maps (SOMs) have become popular for tasks in data visualization, pattern classification, and natural language processing, and can be seen as one of the major concepts in artificial neural networks today. Their general idea is to approximate a high-dimensional and previously unknown input distribution by a lower-dimensional neural network structure, with the goal of modeling the topology of the input space as closely as possible. Classical SOMs read the input values one by one in random but sequential order and thus adjust the network structure over space: the network is built up while larger and larger parts of the input are read. In contrast to this approach, we present a SOM that processes the whole input in parallel and organizes itself over time. The main reason for parallel input processing is that knowledge can be used to recognize parts of patterns in the input space that have already been learned. In this way, networks can be developed that do not reorganize their structure from scratch every time a new set of input vectors is presented, but instead adjust their internal architecture in accordance with previous mappings. One basic application could be modeling the whole-part relationship through layered architectures.
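To make the contrast concrete, the "classical" sequential scheme the abstract refers to can be sketched as follows. This is a minimal illustration of standard SOM training, not the parallel growing architecture proposed in the paper; the function name, grid size, and decay schedules are illustrative assumptions, not taken from the article.

```python
import math
import random

def train_som(inputs, grid_h=4, grid_w=4, epochs=20,
              lr0=0.5, sigma0=2.0, seed=0):
    """Classical sequential SOM training (illustrative sketch).

    Reads inputs one by one in random order and, for each input, pulls
    the best-matching unit (BMU) and its grid neighbours toward that
    input -- the "over space" organization the abstract contrasts with
    parallel, "over time" processing.
    """
    rng = random.Random(seed)
    dim = len(inputs[0])
    # weights[(i, j)] is the prototype vector attached to map node (i, j)
    weights = {(i, j): [rng.random() for _ in range(dim)]
               for i in range(grid_h) for j in range(grid_w)}
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                  # decaying learning rate
        sigma = max(sigma0 * (1.0 - t / epochs), 0.5)  # shrinking neighbourhood
        order = list(inputs)
        rng.shuffle(order)                             # random but sequential reads
        for x in order:
            # BMU: the node whose prototype is closest to the input x
            bmu = min(weights,
                      key=lambda n: sum((wi - xi) ** 2
                                        for wi, xi in zip(weights[n], x)))
            for n, w in weights.items():
                d2 = (n[0] - bmu[0]) ** 2 + (n[1] - bmu[1]) ** 2
                h = math.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian neighbourhood
                weights[n] = [wi + lr * h * (xi - wi)
                              for wi, xi in zip(w, x)]
    return weights
```

Because each input is absorbed one at a time, the map's structure depends on the order of presentation and must be retrained when a new input set arrives; the parallel approach described in the abstract is motivated precisely by avoiding this reorganization from scratch.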
Contributor: Daniel Szer
Submitted on: Monday, December 26, 2005 - 3:57:24 PM
Last modification on: Friday, February 26, 2021 - 3:28:05 PM
Iren Valova, Daniel Szer, Natacha Gueorguieva, Alexandre Buer. A parallel growing architecture for self-organizing maps with unsupervised learning. Neurocomputing, Elsevier, 2005, 68 (October 2005), pp. 177-195. ⟨10.1016/j.neucom.2004.11.025⟩. ⟨inria-00000961⟩