Conference papers

A Multi-level Abstraction Model for Competitive Learning Neural Networks

Randa Kassab 1 Jean-Charles Lamirel 2
1 CORTEX - Neuromimetic intelligence
INRIA Lorraine, LORIA - Laboratoire Lorrain de Recherche en Informatique et ses Applications
2 TALARIS - Natural Language Processing: representation, inference and semantics
Inria Nancy - Grand Est, LORIA - Laboratoire Lorrain de Recherche en Informatique et ses Applications
Abstract : Competitive learning neural networks are powerful analytical tools for data clustering and topology-preserving visualization. However, they are limited in that a single network cannot perform more than one task: when applied to clustering, each neuron unit is expected to represent one of the inherent data clusters, whereas learning the topology requires many more neuron units. The aim of this work is to show how connections between neurons, especially those inserted by competitive Hebbian learning, can be exploited to construct higher abstraction levels of a topology-preserving neural network. The idea is to devote the basic level of the network to a detailed description of the data, while taking advantage of higher abstraction levels to facilitate quantitative analysis and a structural overview of the network. The abstraction is performed in two different fashions: macro and micro abstraction. In macro abstraction, the objective is to cluster the first-level neuron units into their principal groups, corresponding to the clusters inherent in the data. In contrast, micro abstraction is designed to capture underlying clusters at a given granularity degree. The abstraction-level units are also connected to reflect a simplified structure of the data distribution. In the end, the basic level of the network can be hierarchically organized with one or more abstraction levels, permitting both qualitative and quantitative analysis of the data. Simulations on several synthetic datasets show the relevance of the proposed model.
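The macro abstraction described in the abstract, grouping first-level neuron units into their principal clusters by following the connections inserted by competitive Hebbian learning, can be illustrated as a connected-components computation over the connection graph. This is a minimal, hypothetical sketch (the `units` and `edges` representation is assumed, not taken from the paper, and the paper's actual grouping criterion may be more elaborate):

```python
from collections import defaultdict

def macro_abstraction(units, edges):
    """Group first-level neuron units into connected components of the
    graph whose edges were inserted by competitive Hebbian learning.
    `units` is a list of unit ids, `edges` a list of (i, j) pairs.
    Each returned group is one candidate abstraction-level unit."""
    adj = defaultdict(set)
    for i, j in edges:
        adj[i].add(j)
        adj[j].add(i)

    seen, groups = set(), []
    for u in units:
        if u in seen:
            continue
        # Depth-first traversal collects every unit reachable from u.
        stack, component = [u], []
        seen.add(u)
        while stack:
            v = stack.pop()
            component.append(v)
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        groups.append(sorted(component))
    return groups

# Six units; Hebbian edges link 0-1-2 and 3-4, unit 5 is isolated.
print(macro_abstraction(list(range(6)), [(0, 1), (1, 2), (3, 4)]))
# → [[0, 1, 2], [3, 4], [5]]
```

Micro abstraction would instead cut or weight these connections according to a chosen granularity degree before grouping, yielding finer clusters than the principal components above.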

Contributor : Randa Kassab
Submitted on : Monday, October 20, 2008 - 4:02:59 PM
Last modification on : Friday, February 4, 2022 - 3:31:01 AM


  • HAL Id : inria-00332320, version 1



Randa Kassab, Jean-Charles Lamirel. A Multi-level Abstraction Model for Competitive Learning Neural Networks. Artificial Intelligence and Applications - AIA 2008, Feb 2008, Innsbruck, Austria. pp.97-103. ⟨inria-00332320⟩


