Neural networks and approximation by superposition of Gaussians
Author
Abstract
The aim of this paper is to discuss a nonlinear approximation problem relevant to the approximation of data by radial-basis-function neural networks. The approximation is based on superpositions of translated Gaussians. The method used enables us to give explicit approximations and error bounds. New connections between this problem and sampling theory are exposed, but the method used departs radically from those commonly used to obtain sampling results since (i) it applies to signals that are not band-limited, and possibly even discontinuous; (ii) the sampling knots (the centers of the radial-basis functions) need not be equidistant; (iii) the basic approximation building block is the Gaussian, not the usual sinc kernel. The results given offer an answer to the following problem: how complex should a neural network be in order to be able to approximate a given signal to better than a certain prescribed accuracy? The results show that O(1/N) accuracy is possible with a network of N basis functions.
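The scheme described above can be illustrated with a minimal sketch: approximating a non-band-limited signal by a superposition of N translated Gaussians. This example uses a generic least-squares fit rather than the paper's explicit construction, and the target function, widths, and knot placement are illustrative assumptions.

```python
import numpy as np

def gaussian_superposition(f, centers, width, x):
    """Least-squares fit of f by a superposition of translated Gaussians.

    A generic illustration of the approximation scheme; the paper gives an
    explicit construction with error bounds, which this sketch does not
    reproduce.
    """
    # Design matrix: one translated Gaussian per center (the sampling knots)
    G = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)
    coeffs, *_ = np.linalg.lstsq(G, f(x), rcond=None)
    return G @ coeffs

# A signal that is not band-limited: it has corners at the zeros of sin(3x)
f = lambda x: np.abs(np.sin(3.0 * x))
x = np.linspace(0.0, np.pi, 400)

errors = {}
for N in (10, 40):
    centers = np.linspace(0.0, np.pi, N)
    width = 2.0 * (np.pi / (N - 1))  # Gaussian width tied to the knot spacing
    approx = gaussian_superposition(f, centers, width, x)
    errors[N] = float(np.max(np.abs(approx - f(x))))
    print(f"N = {N:3d}   max error = {errors[N]:.4f}")
```

Increasing N while shrinking the width in step with the knot spacing drives the uniform error down, in the spirit of the O(1/N) rate claimed in the abstract; the knots here are equidistant for simplicity, although the paper's result does not require that.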
Related works
Comparison of the performances of neural networks specification, the Translog and the Fourier flexible forms when different production technologies are used
This paper investigates the performances of artificial neural networks approximation, the Translog and the Fourier flexible functional forms for the cost function, when different production technologies are used. Using simulated data bases, the author provides a comparison in terms of capability to reproduce input demands and in terms of the corresponding input elasticities of substitution esti...
STRUCTURAL DAMAGE DETECTION BY MODEL UPDATING METHOD BASED ON CASCADE FEED-FORWARD NEURAL NETWORK AS AN EFFICIENT APPROXIMATION MECHANISM
Vibration-based techniques of structural damage detection using the model updating method are computationally expensive for large-scale structures. In this study, after precisely locating the eventual damage of a structure using a modal strain energy based index (MSEBI), to efficiently reduce the computational cost of model updating during the optimization process of damage severity detection, the M...
Comparing fixed and variable-width Gaussian networks
The role of the width of Gaussians in two types of computational models is investigated: Gaussian radial-basis functions (RBFs), where both widths and centers vary, and Gaussian kernel networks, which have fixed widths but varying centers. The effect of width on functional equivalence, the universal approximation property, and the form of norms in reproducing kernel Hilbert spaces (RKHS) is explored. It is pr...
Active Learning with Statistical Models
For many types of machine learning algorithms, one can compute the statistically "optimal" way to select training data. In this paper, we review how optimal data selection techniques have been used with feedforward neural networks. We then show how the same principles may be used to select data for two alternative, statistically-based learning architectures: mixtures of Gaussians and locally we...