Lower Bounds for Approximation of Some Classes of Lebesgue Measurable Functions by Sigmoidal Neural Networks
Authors
Abstract
We propose a general method for estimating the distance between a compact subspace K of the space L([0, 1]) of Lebesgue measurable functions defined on the hypercube [0, 1], and the class of functions computed by artificial neural networks with a single hidden layer, each unit evaluating a sigmoidal activation function. Our lower bounds are stated in terms of an invariant that measures the oscillations of functions of the space K around the origin. As an application, we estimate the minimal number of neurons required to approximate, to a prescribed accuracy, bounded functions satisfying uniform Lipschitz conditions of order α.
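For reference, the function class invoked in the last sentence can be sketched as follows. This is a standard definition, not taken verbatim from the paper; in particular the constant C and the restriction 0 < α ≤ 1 are assumptions about the intended normalization:

```latex
% Bounded functions on [0,1] satisfying a uniform Lipschitz
% condition of order alpha (a standard formulation; the
% constant C and the range 0 < alpha <= 1 are assumptions):
\[
  \mathrm{Lip}(\alpha, C) \;=\;
  \bigl\{\, f : [0,1] \to \mathbb{R} \;\bigm|\;
    |f(x) - f(y)| \le C\,|x - y|^{\alpha}
    \ \text{for all } x, y \in [0,1] \,\bigr\}.
\]
```

The lower bound then asks how many hidden sigmoidal units a one-hidden-layer network needs so that every f in this class is approximated to within the prescribed accuracy.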
Related work
Lower Bounds on the Complexity of Approximating Continuous Functions by Sigmoidal Neural Networks
We calculate lower bounds on the size of sigmoidal neural networks that approximate continuous functions. In particular, we show that for the approximation of polynomials the network size has to grow as Ω((log k)^{1/4}), where k is the degree of the polynomials. This bound is valid for any input dimension, i.e. independently of the number of variables. The result is obtained by introducing a new met...
Learning with recurrent neural networks
This thesis examines so-called folding neural networks as a mechanism for machine learning. Folding networks form a generalization of partial recurrent neural networks such that they are able to deal with tree structured inputs instead of simple linear lists. In particular, they can handle classical formulas; they were proposed originally for this purpose. After a short explanation of the neur...
On the Complexity of Computing and Learning with Multiplicative Neural Networks
In a great variety of neuron models, neural inputs are combined using the summing operation. We introduce the concept of multiplicative neural networks that contain units that multiply their inputs instead of summing them and thus allow inputs to interact nonlinearly. The class of multiplicative neural networks comprises such widely known and well-studied network types as higher-order networks ...
Computing Time Lower Bounds for Recurrent Sigmoidal Neural Networks
Recurrent neural networks of analog units are computers for realvalued functions. We study the time complexity of real computation in general recurrent neural networks. These have sigmoidal, linear, and product units of unlimited order as nodes and no restrictions on the weights. For networks operating in discrete time, we exhibit a family of functions with arbitrarily high complexity, and we d...
Uniform Approximation and the Complexity of Neural Networks
This work studies some of the approximating properties of feedforward neural networks as a function of the number of nodes. Two cases are considered: sigmoidal and radial basis function networks. Bounds for the approximation error are given. The methods through which we arrive at the bounds are constructive. The error studied is the L1 or sup error.