Methods for Encoding in Multilayer Feed-Forward Neural Networks

Authors

  • Emili Elizalde
  • Sergio Gómez
  • August Romeo
Abstract

Neural network techniques for encoding-decoding processes have been developed. The net we have devised can work like a memory retrieval system in the sense of Hopfield, Feinstein and Palmer. Its behaviour for 2^R (R ∈ ℕ) input units has some specially interesting features. In particular, the accessibilities for each initial symbol may be explicitly computed. Although thermal noise may muddle the code, we show how it can statistically rid the result of unwanted sequences while maintaining the network accuracy within a given bound.

Introduction

The idea of using layered neural networks for multiple tasks has become more and more appealing since Rosenblatt's original perceptron model was the object of the first serious criticism, which led to far-reaching developments [1]. Among this type of structure, the most interesting group are the multilayer feed-forward nets containing intermediate, or hidden, layers. The working of any net is determined by the relative strengths of the links among units (neurons), usually given by a weight or connection matrix. Typically, in a feed-forward multilayer network each unit computes a nonlinear function of the weighted sum of incoming signals from the previous layer reaching its own site, and sends the outcome on to the following layer. This process ends when the emerging signal arrives at the output units, where the result is read off.

Multilayer feed-forward neural networks are specially adequate for encoding [2], understood as the turning of p possible input patterns described by N digital units into a determined set of p output patterns on M units. In the minimal set-up, there is just one hidden layer of R hidden units forming a binary representation of the N inputs, i.e. with R = log₂ N neurons (Fig. 1). We will take M = N = p in order to have the same number of neurons at the input and output layers.

The first case to be considered is that in which we have sets, or alphabets, of unary patterns, i.e. binary sequences in which one unit is on and the rest are off:

    ξ^μ = (0, ..., 0, +1, 0, ..., 0),  with the +1 in the μ-th position,  μ = 1, ..., N.  (1)

Afterwards, we shall consider arbitrary input and output patterns. When the number of output units has the form 2^R, with R ∈ ℕ, our model exhibits remarkable characteristics. In particular, no thresholds will be needed in this case. A no-go theorem on the possibility of general 3-step encoding will be proven. At the end, advantage will be taken of the introduction of a moderate amount of thermal noise.

For the intermediate and output layers, the state of each unit at a given moment will be a (generally nonlinear) function of the weighted sum of the signals feeding into it. Since we use binary units, we take the
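The N–R–N encoder described above can be sketched concretely. The following minimal example (for N = 4, R = 2) uses one hypothetical hand-crafted weight assignment under which each hidden unit reads one bit of the index of the active input unit, and each output unit fires only when the hidden layer holds its binary code; the paper derives its own weights and a threshold-free construction for the 2^R case, so this is only an illustrative instance of the architecture, not the authors' method:

```python
import numpy as np

N, R = 4, 2  # N = 2^R input/output units, R = log2(N) hidden units

# Unary input patterns of Eq. (1): pattern mu has a single +1 at position mu.
patterns = np.eye(N)

# Hypothetical encoder weights: hidden unit r reads bit r of the active index,
# W_enc[r, mu] = +1 if bit r of mu is set, else -1.
W_enc = np.array([[1 if (mu >> r) & 1 else -1 for mu in range(N)]
                  for r in range(R)])

# Hypothetical decoder weights: the row for output unit mu is the +/-1 form
# of the binary code of mu.
W_dec = np.array([[1 if (mu >> r) & 1 else -1 for r in range(R)]
                  for mu in range(N)])

def step(x):
    """Binary threshold unit: nonlinear function of the weighted sum."""
    return (x > 0).astype(int)

for mu in range(N):
    hidden = step(W_enc @ patterns[mu])   # R-bit binary code of mu
    # Map hidden 0/1 states to -1/+1; an output unit's weighted sum then
    # equals R exactly when all R bits match its own code (a thresholded
    # readout -- one possible choice, unlike the paper's threshold-free one).
    out = (W_dec @ (2 * hidden - 1) == R).astype(int)
    assert out[mu] == 1 and out.sum() == 1  # unary pattern mu is recovered
```

Running the loop confirms that each unary pattern is compressed to its R-bit code at the hidden layer and then restored to the same unary pattern at the output, which is exactly the encoding task with M = N = p = 2^R.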




Publication date: 1991