Supervised Models: C1.2 Multilayer Perceptrons
Abstract
This section introduces multilayer perceptrons, the most commonly used type of neural network. The popular backpropagation training algorithm is studied in detail. The momentum and adaptive step-size techniques used to accelerate training are discussed, and other acceleration techniques are briefly referenced. Several implementation issues are then examined. The issue of generalization is studied next, and several measures for improving network generalization are discussed, including cross-validation, choice of network size, network pruning, constructive algorithms and regularization. Recurrent networks are then studied, both in the fixed-point mode, with the recurrent backpropagation algorithm, and in the sequential mode, with the unfolding-in-time algorithm. Reference is also made to time-delay neural networks. The section also briefly mentions a large number of applications of multilayer perceptrons, with pointers to the bibliography.
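As a concrete illustration of the training ideas summarized above, the following is a minimal sketch of backpropagation with a momentum term for a one-hidden-layer perceptron, written in plain NumPy. It is not taken from the section itself: the network size, the XOR toy data, and the learning-rate and momentum values are illustrative assumptions.

```python
# Minimal sketch: one-hidden-layer MLP trained by backpropagation with momentum.
# Network size, data (XOR), learning rate and momentum coefficient are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights and biases of the two layers
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

eta, mu = 0.5, 0.9              # learning rate and momentum coefficient (assumed)
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (squared-error loss, sigmoid derivatives)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    gW2 = h.T @ d_out; gb2 = d_out.sum(axis=0)
    gW1 = X.T @ d_h;   gb1 = d_h.sum(axis=0)

    # Momentum update: new step = mu * previous step - eta * gradient
    vW2 = mu * vW2 - eta * gW2; W2 += vW2
    vb2 = mu * vb2 - eta * gb2; b2 += vb2
    vW1 = mu * vW1 - eta * gW1; W1 += vW1
    vb1 = mu * vb1 - eta * gb1; b1 += vb1

print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))
```

With settings like these the outputs typically approach the XOR targets within a few thousand epochs; an adaptive step-size scheme would adjust eta during training instead of keeping it fixed.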
Similar articles
Local linear perceptrons for classification
A structure composed of local linear perceptrons for approximating global class discriminants is investigated. Such local linear models may be combined in a cooperative or competitive way. In the cooperative model, a weighted sum of the outputs of the local perceptrons is computed where the weight is a function of the distance between the input and the position of the local perceptron. In the c...
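To make the cooperative scheme concrete, here is a minimal sketch of a distance-weighted sum of local linear perceptron outputs. It is not the structure proposed in the cited paper: the Gaussian weighting function and all parameter values are assumptions chosen for illustration.

```python
# Minimal sketch of a cooperative combination of local linear perceptrons:
# each local model f_i(x) = W_i x + b_i is weighted by a function of the
# distance between the input x and the perceptron's position c_i.
# The Gaussian weighting and the parameter values below are assumptions.
import numpy as np

def cooperative_output(x, W, b, centers, width=1.0):
    """Distance-weighted sum of local linear perceptron outputs."""
    # Local linear outputs, one row per perceptron
    local = np.stack([Wi @ x + bi for Wi, bi in zip(W, b)])
    # Weights decay with squared distance to each perceptron's position
    d2 = np.sum((centers - x) ** 2, axis=1)
    w = np.exp(-d2 / (2 * width ** 2))
    w /= w.sum()
    return w @ local

# Tiny example: two local perceptrons in a 2-D input space, 1-D output
W = [np.array([[1.0, 0.0]]), np.array([[0.0, 1.0]])]
b = [np.array([0.0]), np.array([0.5])]
centers = np.array([[0.0, 0.0], [1.0, 1.0]])
print(cooperative_output(np.array([0.2, 0.8]), W, b, centers))
```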
Support Vector Machine Based Facies Classification Using Seismic Attributes in an Oil Field of Iran
Seismic facies analysis (SFA) aims to classify similar seismic traces based on amplitude, phase, frequency, and other seismic attributes. SFA has proven useful in interpreting seismic data, allowing significant information on subsurface geological structures to be extracted. While facies analysis has been widely investigated through unsupervised-classification-based studies, there are few cases...
A Comparison between the Performance of Feed Forward Neural Networks and the Supervised Growing Neural Gas Algorithm
The Supervised Growing Neural Gas algorithm (SGNG) provides an interesting alternative to standard multilayer perceptrons (MLP). A comparison is drawn between the performance of SGNG and MLP in the domain of function mapping. A further field of interest is classification power, which has been investigated with real data taken by PS197 at CERN. The characteristics of the two network...
Generalizing the Convolution Operator to Extend CNNs to Irregular Domains
Convolutional Neural Networks (CNNs) have become the state of the art in supervised learning vision tasks. Their convolutional filters are of paramount importance, for they allow patterns to be learned while disregarding their locations in input images. When facing highly irregular domains, generalized convolution operators based on an underlying graph structure have been proposed. However, these o...
Density Networks
A density network is a neural network that maps from unobserved inputs to observable outputs. The inputs are treated as latent variables so that, for given network parameters, a non-trivial probability density is defined over the output variables. This probabilistic model can be trained by various Monte Carlo methods. The model can discover a description of the observed data in terms of an under...