
Geometric Interpretation of Nonlinear Approximation Capability for Feedforward Neural Networks

Abstract : This paper presents a preliminary study of the nonlinear approximation capability of feedforward neural networks (FNNs) from a geometric viewpoint. Three of the simplest FNNs, each with at most four free parameters, are defined and investigated. Through approximations of one-dimensional functions, we observe that the Chebyshev-polynomial, Gaussian, and sigmoidal FNNs can be ranked, in that order, by the variety of nonlinearities they provide. Setting aside the compactness property inherent to Gaussian neural networks, we consider the Chebyshev-polynomial-based neural networks to be the best of the three types of FNNs in terms of efficient use of free parameters. This work is supported by the Natural Science Foundation of China (#60275025, #60121302).
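The abstract does not spell out the three network definitions, so the sketch below is only an illustration of the kind of comparison it describes: three single-input approximators (one sigmoidal unit, one Gaussian unit, and a degree-3 Chebyshev expansion), each with at most four free parameters, fitted to the same one-dimensional target. The parameterizations, the target function, and the use of SciPy's curve_fit are assumptions for illustration, not the paper's actual models or experiments.

import numpy as np
from scipy.optimize import curve_fit

def sigmoidal_fnn(x, w, b, v, c):
    # One sigmoidal hidden unit: v * sigmoid(w*x + b) + c  (4 free parameters)
    return v / (1.0 + np.exp(-(w * x + b))) + c

def gaussian_fnn(x, mu, sigma, v, c):
    # One Gaussian (RBF) unit: v * exp(-(x - mu)^2 / (2*sigma^2)) + c  (4 free parameters)
    return v * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) + c

def chebyshev_fnn(x, a0, a1, a2, a3):
    # Linear combination of Chebyshev polynomials T0..T3 on [-1, 1]  (4 free parameters)
    return a0 + a1 * x + a2 * (2.0 * x ** 2 - 1.0) + a3 * (4.0 * x ** 3 - 3.0 * x)

# Illustrative 1-D target on [-1, 1]; the paper's test functions are not given in the abstract.
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(np.pi * x)

for name, model, p0 in [
    ("sigmoidal", sigmoidal_fnn, [1.0, 0.0, 1.0, 0.0]),
    ("Gaussian", gaussian_fnn, [0.0, 0.5, 1.0, 0.0]),
    ("Chebyshev", chebyshev_fnn, [0.0, 1.0, 0.0, 0.0]),
]:
    try:
        params, _ = curve_fit(model, x, y, p0=p0, maxfev=10000)
        rmse = np.sqrt(np.mean((model(x, *params) - y) ** 2))
        print(f"{name:10s} RMSE = {rmse:.4f}")
    except RuntimeError:
        print(f"{name:10s} fit did not converge")

Comparing the residuals of the three four-parameter fits gives a rough sense of the "variety of nonlinearities" each family can reach with the same parameter budget, which is the kind of comparison the abstract refers to.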
Document type :
Conference papers

https://hal.inria.fr/inria-00122761
Contributor : Chine Publications Liama
Submitted on : Thursday, January 4, 2007 - 4:40:54 PM
Last modification on : Saturday, October 9, 2021 - 4:06:41 AM

Citation

Hu Bao-Gang, Xing Hong-Jie, Yang Yu-Jiu. Geometric Interpretation of Nonlinear Approximation Capability for Feedforward Neural Networks. Advances in Neural Networks - ISNN 2004, International Symposium on Neural Networks, Aug 2004, Dalian, China. pp.7-13, ⟨10.1007/b99834⟩. ⟨inria-00122761⟩
