A Universal Approximation Result for Difference of log-sum-exp Neural Networks

Giuseppe Carlo Calafiore (1), Stéphane Gaubert (2), Corrado Possieri (1)
(2) CMAP - Centre de Mathématiques Appliquées, École Polytechnique, Inria Saclay - Île-de-France
Abstract: We show that a neural network whose output is obtained as the difference of the outputs of two feedforward networks with exponential activation function in the hidden layer and logarithmic activation function in the output node (LSE networks) is a smooth universal approximator of continuous functions over convex, compact sets. By using a logarithmic transform, this class of networks maps to a family of subtraction-free ratios of generalized posynomials, which we also show to be universal approximators of positive functions over log-convex, compact subsets of the positive orthant. The main advantage of Difference-LSE networks over classical feedforward neural networks is that, after a standard training phase, they provide surrogate models for design that possess a specific difference-of-convex-functions form, which makes them amenable to optimization via relatively efficient numerical methods. In particular, by adapting an existing difference-of-convex algorithm to these models, we obtain an algorithm for effective optimization-based design. We illustrate the proposed approach by applying it to the data-driven design of a diet for a patient with type 2 diabetes.
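
The construction described in the abstract is concrete enough to sketch in a few lines. Below is a minimal numpy illustration of a single LSE network and of the difference-of-LSE (DLSE) model built from two of them; the function names, weight shapes, and random toy weights are assumptions made for illustration, not code or notation from the paper.

```python
import numpy as np

def lse(x, A, b):
    """One LSE network evaluated at x: log(sum_k exp(b[k] + A[k] @ x)).
    Exponential activations in the hidden layer, a logarithmic activation
    at the single output node; convex in x. The max-shift avoids overflow."""
    z = A @ x + b
    m = z.max()
    return m + np.log(np.exp(z - m).sum())

def dlse(x, A1, b1, A2, b2):
    """Difference-of-LSE (DLSE) network: the difference of two convex
    LSE terms, i.e. an explicit difference-of-convex-functions model."""
    return lse(x, A1, b1) - lse(x, A2, b2)

# Toy evaluation with random weights (illustrative only).
rng = np.random.default_rng(0)
A1, b1 = rng.normal(size=(5, 2)), rng.normal(size=5)
A2, b2 = rng.normal(size=(4, 2)), rng.normal(size=4)
x = np.array([0.3, -1.2])
print(dlse(x, A1, b1, A2, b2))
```

The logarithmic transform mentioned in the abstract corresponds to the substitution x = log y (componentwise, for y in the positive orthant): then exp(lse(log y, A, b)) = sum_k exp(b[k]) * prod_i y_i**A[k, i], a posynomial in y, so exponentiating a DLSE network yields a subtraction-free ratio of two such expressions. The explicit convex-minus-convex structure of dlse is also what makes difference-of-convex algorithms, which iterate by linearizing the subtracted convex term, directly applicable to these surrogate models.
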
Document type: Preprints, Working Papers, ...
Contributor: Stéphane Gaubert
Submitted on: Thursday, December 26, 2019 - 10:56:12 AM
Last modification on: Friday, February 4, 2022 - 3:31:35 AM

  • HAL Id: hal-02423871, version 1
  • arXiv: 1905.08503


Giuseppe Carlo Calafiore, Stéphane Gaubert, Corrado Possieri. A Universal Approximation Result for Difference of log-sum-exp Neural Networks. 2019. ⟨hal-02423871⟩


