
Neural networks as dynamical bases in function space

Konrad Weigl 1 Marc Berthod 1
1 PASTIS - Scene Analysis and Symbolic Image Processing
CRISAM - Inria Sophia Antipolis - Méditerranée
Abstract : We consider feedforward neural networks such as multi-layer perceptrons as non-orthogonal bases in a function space, bases which span submanifolds of that space. The basis functions of such a base are the functions computed by the neurons of the hidden layer. A function to be approximated is then a vector in the function space, and the projection of that vector onto the submanifold spanned by the base is the function computed by the neural network. The approximation is optimal when the distance, under some metric, between the function to be approximated and its projection onto the submanifold is minimal. We compute this distance in sample space, i.e. the subspace of function space whose dimensions correspond to the input samples we have of the function to be approximated. The objective of learning in such a network is thus to minimize the distance between the function to be approximated and its projection onto the submanifold. This is achieved by dynamically rotating and shifting the base so that this distance becomes minimal; the rotation and shifting are carried out by modifying the parameters of the basis functions of the network. A convenient way of computing the projection is with the help of metric tensors, a tool from differential geometry. We call this new approach to learning projection learning. The basis functions may be arbitrary (Gaussian, Gabor, sigmoid, etc.), provided they are differentiable in some sense with respect to their parameters/weights. We present the application of the paradigm and learning rule to multi-layer perceptrons as well as to bases of multivariate Gaussians, discuss some other potential applications, and present alternatives to the use of the metric tensor.
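The scheme described in the abstract can be sketched numerically: a minimal illustration, assuming a least-squares metric in sample space and a basis of univariate Gaussians. The Gram matrix of the basis functions plays the role of the metric tensor, and the "rotation and shifting of the base" is approximated here by finite-difference gradient descent on the Gaussian centers. All names (`phi`, `project`, the choice of target function and learning rate) are illustrative, not taken from the report.

```python
import numpy as np

def phi(x, centers, widths):
    # Design matrix of the basis: one Gaussian per hidden unit,
    # evaluated at all samples. Shape: (n_samples, n_basis).
    return np.exp(-((x[:, None] - centers[None, :]) ** 2)
                  / (2 * widths[None, :] ** 2))

def project(x, y, centers, widths):
    # Metric tensor in sample space: the Gram matrix G = Phi^T Phi.
    # The projection of y onto the submanifold spanned by the base has
    # coefficients c solving the normal equations G c = Phi^T y.
    Phi = phi(x, centers, widths)
    G = Phi.T @ Phi
    c = np.linalg.lstsq(G, Phi.T @ y, rcond=None)[0]
    return Phi @ c

# Function to be approximated, known only through input samples.
x = np.linspace(-3.0, 3.0, 200)
y = np.sin(x)

centers = np.array([-2.0, 0.0, 2.0])
widths = np.full(3, 1.0)

# "Rotate and shift the base": nudge the centers downhill on the
# residual distance between y and its projection onto the submanifold.
for _ in range(50):
    base = np.sum((y - project(x, y, centers, widths)) ** 2)
    grad = np.zeros_like(centers)
    for i in range(len(centers)):
        step = centers.copy()
        step[i] += 1e-4
        grad[i] = (np.sum((y - project(x, y, step, widths)) ** 2)
                   - base) / 1e-4
    centers -= 0.01 * grad

residual = np.sum((y - project(x, y, centers, widths)) ** 2)
```

Note that at every step the coefficients are obtained by projection (the inner linear problem), while learning only moves the nonlinear parameters of the basis functions; this separation is the point of the paradigm.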
Document type : Research Report

Contributor : Rapport de Recherche Inria
Submitted on : Wednesday, May 24, 2006 - 3:44:19 PM
Last modification on : Friday, February 4, 2022 - 3:19:28 AM
Long-term archiving on : Monday, April 5, 2010 - 12:11:20 AM


  • HAL Id : inria-00074548, version 1



Konrad Weigl, Marc Berthod. Neural networks as dynamical bases in function space. [Research Report] RR-2124, INRIA. 1993. ⟨inria-00074548⟩


