A Multi-level Abstraction Model for Competitive Learning Neural Networks

Randa Kassab 1 Jean-Charles Lamirel 2
1 CORTEX - Neuromimetic intelligence
INRIA Lorraine, LORIA - Laboratoire Lorrain de Recherche en Informatique et ses Applications
2 TALARIS - Natural Language Processing: representation, inference and semantics
Inria Nancy - Grand Est, LORIA - Laboratoire Lorrain de Recherche en Informatique et ses Applications
Abstract: Competitive learning neural networks are powerful analytical tools for data clustering and topology-preserving visualization. However, they are limited in that a single network cannot achieve more than one of these tasks. When applied to clustering, each neuron unit is expected to represent one of the clusters inherent in the data, whereas learning the data topology requires many more neuron units. The aim of this work is to determine how connections between neurons - especially those inserted by competitive Hebbian learning - can be exploited to construct higher abstraction levels of a topology-preserving neural network. The idea is to devote the basic level of the network to a detailed description of the data, while taking advantage of higher abstraction levels to facilitate quantitative analysis and a structural overview of the network. The abstraction is performed in two different fashions: macro and micro abstraction. In macro abstraction, the objective is to cluster the first-level neuron units into their principal groups, corresponding to the clusters inherent in the data. In contrast, micro abstraction is designed to capture underlying clusters according to a given granularity degree. The abstraction-level units are also connected to reflect a simplified structure of the data distribution. In the end, the basic level of the network can be hierarchically organized with one or more abstraction levels, permitting both qualitative and quantitative analysis of the data. Simulations on several synthetic datasets show the relevance of the proposed model.
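One way to read the macro abstraction described above is as grouping the first-level units by the connected components of the graph whose edges were inserted by competitive Hebbian learning. The sketch below illustrates that reading only; the unit ids, the example edge list, and the `macro_abstraction` helper are illustrative assumptions, not the authors' exact algorithm.

```python
def macro_abstraction(n_units, edges):
    """Group first-level neuron units into macro clusters, taken here as
    the connected components of the competitive-Hebbian connection graph
    (computed with a simple union-find)."""
    parent = list(range(n_units))

    def find(u):
        # Follow parent links to the root, with path halving.
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u

    # Union the endpoints of every Hebbian edge.
    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    # Collect the units of each component as one macro-abstraction cluster.
    clusters = {}
    for u in range(n_units):
        clusters.setdefault(find(u), []).append(u)
    return list(clusters.values())

# Two chains of units linked by Hebbian edges; unit 6 has no edge.
edges = [(0, 1), (1, 2), (3, 4), (4, 5)]
print(macro_abstraction(7, edges))  # -> [[0, 1, 2], [3, 4, 5], [6]]
```

Under this reading, the macro-abstraction level would create one higher-level unit per returned cluster and connect those units to reflect a simplified structure of the data distribution.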
Document type:
Conference paper
Artificial Intelligence and Applications - AIA 2008, Feb 2008, Innsbruck, Austria. pp. 97-103, 2008

https://hal.inria.fr/inria-00332320
Contributor: Randa Kassab
Submitted on: Monday, October 20, 2008 - 16:02:59
Last modified on: Thursday, January 11, 2018 - 06:21:35

Identifiers

  • HAL Id : inria-00332320, version 1
Citation

Randa Kassab, Jean-Charles Lamirel. A Multi-level Abstraction Model for Competitive Learning Neural Networks. Artificial Intelligence and Applications - AIA 2008, Feb 2008, Innsbruck, Austria. pp. 97-103, 2008. 〈inria-00332320〉