Theoretical Properties of Projection Based Multilayer Perceptrons with Functional Inputs
Authors
Abstract
Similar resources
Theoretical properties of functional Multi Layer Perceptrons
In this paper, we study a natural extension of Multi Layer Perceptrons (MLP) to functional inputs. We show that fundamental results for numerical MLPs can be extended to functional MLPs. We obtain universal approximation results showing that the expressive power of functional MLPs is comparable to that of numerical MLPs. We obtain consistency results which imply that optimal parameter estimation f...
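The projection-based idea studied in the main paper above can be illustrated with a small sketch: each sampled input function is replaced by its least-squares coordinates on a finite basis, and those coordinates are fed to an ordinary numerical MLP. The toy data, the Legendre basis, and the use of scikit-learn's MLPRegressor are assumptions made for illustration, not details taken from either paper.

```python
# Sketch of a projection-based functional MLP (assumed ingredients: evenly
# sampled curves on [0, 1], a Legendre polynomial basis, scikit-learn's
# MLPRegressor as the numerical MLP).
import numpy as np
from numpy.polynomial import legendre
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy functional data: n noisy curves x_i(t) sampled at m points,
# with a scalar response y_i depending on the whole curve.
n, m, d = 200, 50, 6                       # curves, sample points, basis size
t = np.linspace(0.0, 1.0, m)
a = rng.normal(size=(n, 3))
X_curves = (a[:, [0]] * np.sin(2 * np.pi * t)
            + a[:, [1]] * np.cos(2 * np.pi * t)
            + a[:, [2]] * t
            + 0.05 * rng.normal(size=(n, m)))
y = (X_curves ** 2).mean(axis=1)           # response = mean square of the curve

# Projection step: least-squares coordinates of each curve on the first
# d Legendre polynomials, i.e. each function becomes a d-dimensional vector.
B = legendre.legvander(2 * t - 1, d - 1)   # basis evaluated on [-1, 1]
coords, *_ = np.linalg.lstsq(B, X_curves.T, rcond=None)
X_proj = coords.T                          # shape (n, d)

# Numerical MLP on the projected coordinates.
mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
mlp.fit(X_proj[:150], y[:150])
print("held-out R^2:", mlp.score(X_proj[150:], y[150:]))
```

The only functional ingredient here is the projection step; everything after it is a standard numerical MLP, which is what the quoted approximation and consistency results are about.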
Functional preprocessing for multilayer perceptrons
In many applications, high-dimensional input data can be considered as sampled functions. We show in this paper how to use this prior knowledge to implement functional preprocessing that consistently reduces the dimension of the data even when values are missing. The preprocessed functions are then handled by a numerical MLP, which approximates the theoretical functional MLP. A succes...
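A minimal sketch of the missing-value-tolerant preprocessing idea described above, assuming a Legendre polynomial basis and NaN-coded missing samples (both illustrative choices): each curve is reduced to basis coefficients fitted only on its observed points, and those coefficients would then be passed to a numerical MLP.

```python
# Missing-value-tolerant functional preprocessing: fit basis coefficients
# per curve by least squares on the observed (non-NaN) points only.
import numpy as np
from numpy.polynomial import legendre

def functional_preprocess(curves, t, n_basis):
    """Project possibly incomplete curves (NaN = missing) onto a polynomial basis."""
    B_full = legendre.legvander(2 * t - 1, n_basis - 1)   # (m, n_basis)
    coefs = np.empty((curves.shape[0], n_basis))
    for i, row in enumerate(curves):
        obs = ~np.isnan(row)                              # observed points only
        coefs[i], *_ = np.linalg.lstsq(B_full[obs], row[obs], rcond=None)
    return coefs

# Tiny demo: 3 curves on 20 points, one of them with missing values.
t = np.linspace(0.0, 1.0, 20)
curves = np.sin(2 * np.pi * np.outer([1.0, 2.0, 3.0], t))
curves[0, 5:8] = np.nan                                   # simulate missing data
print(functional_preprocess(curves, t, n_basis=5).shape)  # -> (3, 5)
```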
Quantile regression with multilayer perceptrons
We consider nonlinear quantile regression involving multilayer perceptrons (MLP). In this paper we investigate the asymptotic behavior of quantile regression in a general framework: first by allowing possibly non-identifiable regression models, such as MLPs with redundant hidden units, and then by relaxing the conditions on the density of the noise. We present a universal bound for the...
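Quantile regression with an MLP replaces the squared error by the pinball (check) loss rho_tau(r) = r * (tau - 1[r < 0]). The sketch below fits a conditional 0.9-quantile with a tiny hand-written MLP and plain gradient descent; the architecture, toy data, and optimizer are illustrative assumptions rather than the setup analyzed in the paper.

```python
# Quantile regression with a one-hidden-layer MLP trained on the pinball loss.
import numpy as np

rng = np.random.default_rng(0)
tau = 0.9                                     # target quantile

# Heteroscedastic toy data: the 0.9-quantile of y|x bends with x.
x = rng.uniform(-2, 2, size=(500, 1))
y = np.sin(x) + (0.2 + 0.3 * np.abs(x)) * rng.normal(size=(500, 1))

# Parameters of a tiny MLP: 1 -> 16 -> 1.
W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(5000):
    h = np.tanh(x @ W1 + b1)                  # hidden layer
    u = h @ W2 + b2                           # predicted conditional quantile
    r = y - u
    # (Sub)gradient of the pinball loss rho_tau(r) = r * (tau - 1[r < 0])
    # with respect to the prediction u.
    du = ((r < 0).astype(float) - tau) / len(x)
    dW2 = h.T @ du;  db2 = du.sum(0)
    dh = (du @ W2.T) * (1 - h ** 2)
    dW1 = x.T @ dh;  db1 = dh.sum(0)
    W1 -= lr * dW1;  b1 -= lr * db1
    W2 -= lr * dW2;  b2 -= lr * db2

# Roughly 90% of the observations should fall below the fitted curve.
u = np.tanh(x @ W1 + b1) @ W2 + b2
print("coverage:", float((y <= u).mean()))
```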
Multilayer Perceptrons based on Fuzzy Flip-Flops
The concept of the fuzzy flip-flop was introduced in the mid-1980s by Hirota and his students. The Hirota Lab recognized the essential importance of a fuzzy extension of sequential circuits and of the notion of fuzzy memory, and from this point of view proposed alternatives for "fuzzifying" digital flip-flops. The starting elementary digital units were the binary J-K flipflo...
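As one hedged illustration of what "fuzzifying" a J-K flip-flop can mean: keep the crisp characteristic equation Q+ = (J AND NOT Q) OR (NOT K AND Q) but replace the Boolean connectives with fuzzy ones. The min/max/standard-complement choice below is only one possibility; the Hirota-style fuzzy flip-flops used in the paper may rely on other norm pairs.

```python
# One possible fuzzification of the binary J-K flip-flop: the crisp
# characteristic equation Q+ = (J AND NOT Q) OR (NOT K AND Q) with
# AND/OR/NOT replaced by min / max / (1 - x). Illustrative choice only.
def fuzzy_jk(j: float, k: float, q: float) -> float:
    """Next state of a fuzzy J-K flip-flop with min/max/standard complement."""
    return max(min(j, 1.0 - q), min(1.0 - k, q))

# With crisp inputs (0 or 1) this reduces to the ordinary J-K truth table...
assert fuzzy_jk(1, 0, 0) == 1      # set
assert fuzzy_jk(0, 1, 1) == 0      # reset
assert fuzzy_jk(1, 1, 1) == 0      # toggle
# ...while fuzzy inputs give graded "memory" values in [0, 1].
print(fuzzy_jk(0.7, 0.2, 0.4))     # -> 0.6
```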
Fast training of multilayer perceptrons
Training a multilayer perceptron by an error backpropagation algorithm is slow and uncertain. This paper describes a new approach that is much faster and more reliable than error backpropagation. The proposed approach is based on combined iterative and direct solution methods. In this approach, we use an inverse transformation to linearize the nonlinear output activation functions, direct soluti...
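A hedged reading of the idea sketched above: rather than backpropagating through the nonlinear output activation, map the targets through its inverse (the logit, for a sigmoid output) and solve the output-layer weights directly by linear least squares on the hidden activations. The fixed random hidden layer below is purely to keep the example short; the paper's combined iterative/direct scheme also updates the hidden layer.

```python
# Direct output-layer solution after "linearizing" a sigmoid output unit:
# solve H1 @ w ~= logit(y) in least squares instead of iterating on w.
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
logit = lambda p: np.log(p / (1.0 - p))           # inverse of the sigmoid

# Toy data with targets in (0, 1), matching a sigmoid output unit.
X = rng.normal(size=(300, 4))
y = sigmoid(X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.3)

# Hidden layer with (here, fixed random) weights and tanh activation.
W1 = rng.normal(size=(4, 20)); b1 = rng.normal(size=20)
H = np.tanh(X @ W1 + b1)
H1 = np.column_stack([H, np.ones(len(H))])        # append a bias column

# Apply the inverse output activation to the targets, then solve directly.
y_lin = logit(np.clip(y, 1e-6, 1 - 1e-6))         # keep the inverse well-defined
w, *_ = np.linalg.lstsq(H1, y_lin, rcond=None)

y_hat = sigmoid(H1 @ w)
print("mean absolute error:", float(np.abs(y_hat - y).mean()))
```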
Journal
Journal title: Neural Processing Letters
Year: 2006
ISSN: 1370-4621, 1573-773X
DOI: 10.1007/s11063-005-3100-2