Quick Training Algorithm for Extra Reduced Size Lattice-Ladder Multilayer Perceptrons

Author

  • Dalius Navakauskas
Abstract

A quick gradient training algorithm for a specific neural network structure called an extra reduced size lattice-ladder multilayer perceptron is introduced. The presented derivation of the algorithm exploits the simplest way, recently found by the author, of exactly computing the gradients with respect to the rotation parameters of a lattice-ladder filter. The developed neural network training algorithm is optimal in terms of the minimal number of constants, multiplication and addition operations, while the regularity of the structure is also preserved.
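The abstract does not reproduce the filter equations, so the following Python sketch only illustrates the kind of structure involved: a single lattice-ladder (IIR) synapse whose lattice stages are Givens rotations parametrized by angles Θ_j and whose ladder weights v_j form the output. The exact normalization, signal ordering and the paper's gradient recursion are assumptions here and may differ from the paper.

```python
import math

def lattice_ladder_synapse(x, thetas, v):
    """Sketch of one lattice-ladder (IIR) synapse with rotation parameters.

    x      -- input sequence (list of floats)
    thetas -- rotation angles Theta_1..Theta_M of the lattice part
    v      -- ladder (tap) weights v_0..v_M combining the backward signals
    Returns the filtered output sequence.
    """
    M = len(thetas)
    assert len(v) == M + 1
    b_prev = [0.0] * M          # delayed backward signals b_0(n-1)..b_{M-1}(n-1)
    y = []
    for x_n in x:
        f = x_n                 # forward signal enters at stage M
        b = [0.0] * (M + 1)     # current backward signals b_0(n)..b_M(n)
        for j in range(M, 0, -1):
            c, s = math.cos(thetas[j - 1]), math.sin(thetas[j - 1])
            # Givens rotation of (f_j(n), b_{j-1}(n-1)) -> (f_{j-1}(n), b_j(n))
            f_next = c * f - s * b_prev[j - 1]
            b[j] = s * f + c * b_prev[j - 1]
            f = f_next
        b[0] = f                # b_0(n) = f_0(n)
        b_prev = b[:M]          # store b_0(n)..b_{M-1}(n) for the next time step
        y.append(sum(vj * bj for vj, bj in zip(v, b)))  # ladder combination
    return y
```

In an LLMLP each synapse is essentially such a filter, and a neuron applies a nonlinearity to the sum of its synapse outputs; the paper's contribution is the cheap exact gradient computation with respect to the angles Θ_j, which this sketch does not attempt to reproduce.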


Similar articles

Training Algorithm for Extra Reduced Size Lattice–Ladder Multilayer Perceptrons

A quick gradient training algorithm for a specific neural network structure called an extra reduced size lattice–ladder multilayer perceptron is introduced. The presented derivation of the algorithm exploits the simplest way, recently found by the author, of exactly computing the gradients with respect to the rotation parameters of a lattice–ladder filter. The developed neural network training algorithm is optimal in terms of the mini...


Speeding up the Training of Lattice–Ladder Multilayer Perceptrons

A lattice–ladder multilayer perceptron (LLMLP) is an appealing structure for advanced signal processing in the sense that it is nonlinear, possesses an infinite impulse response, and its stability monitoring during training is simple. However, even a moderate implementation of LLMLP training is hindered by the fact that a lot of storage and computational power must be allocated. In this paper we deal with the...

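The entry above notes that stability monitoring of an LLMLP during training is simple. The sketch below illustrates why, assuming the common lattice convention that the reflection coefficients are k_j = sin Θ_j (the papers' own convention may differ): the recursive part is stable whenever every |k_j| < 1, so a bound check on the angles suffices, whereas a direct-form IIR filter requires an explicit pole check.

```python
import math
import numpy as np

def lattice_is_stable(thetas):
    """Lattice-ladder stability check: the recursive part is stable when every
    reflection coefficient k_j = sin(Theta_j) satisfies |k_j| < 1.
    With a rotation parametrization this is a trivial bound check."""
    return all(abs(math.sin(th)) < 1.0 for th in thetas)

def direct_form_is_stable(a):
    """For comparison: a direct-form IIR filter with denominator
    1 + a_1 z^-1 + ... + a_M z^-M needs its pole radii checked explicitly."""
    poles = np.roots([1.0] + list(a))
    return bool(np.all(np.abs(poles) < 1.0))

print(lattice_is_stable([0.4, -1.2, 0.9]))    # True: all |sin(Theta_j)| < 1
print(direct_form_is_stable([-1.8, 0.95]))    # True:  poles inside the unit circle
print(direct_form_is_stable([-2.1, 1.2]))     # False: a weight update went unstable
```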

Improve an Efficiency of Feedforward Multilayer Perceptrons by Serial Training

The feedforward multilayer perceptron network is a widely used artificial neural network model trained with the backpropagation algorithm on real-world data. There are two common ways to construct a feedforward multilayer perceptron network: either taking a large network and then pruning away the irrelevant nodes, or starting from a small network and then adding new relevant nodes. An Arti...

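The entry above contrasts pruning a large network with growing a small one. The following sketch shows only the structural part of the two operations on a single-hidden-layer perceptron, with hypothetical helpers add_hidden_unit and prune_hidden_unit; the selection criteria (which node is irrelevant, when to grow) are the substance of such methods and are not modelled here.

```python
import numpy as np

def add_hidden_unit(W1, b1, W2, rng, scale=0.01):
    """Grow: append one hidden unit by adding a row to the input->hidden
    weights and a column to the hidden->output weights (small random init)."""
    n_in, n_out = W1.shape[1], W2.shape[0]
    W1 = np.vstack([W1, scale * rng.standard_normal((1, n_in))])
    b1 = np.append(b1, 0.0)
    W2 = np.hstack([W2, scale * rng.standard_normal((n_out, 1))])
    return W1, b1, W2

def prune_hidden_unit(W1, b1, W2, j):
    """Prune: remove hidden unit j, e.g. one judged irrelevant by some
    saliency measure (the criterion itself is outside this sketch)."""
    return np.delete(W1, j, axis=0), np.delete(b1, j), np.delete(W2, j, axis=1)

# Shapes: W1 is (hidden, inputs), W2 is (outputs, hidden).
rng = np.random.default_rng(0)
W1, b1, W2 = rng.standard_normal((3, 4)), np.zeros(3), rng.standard_normal((2, 3))
W1, b1, W2 = add_hidden_unit(W1, b1, W2, rng)     # now 4 hidden units
W1, b1, W2 = prune_hidden_unit(W1, b1, W2, 0)     # back to 3 hidden units
```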

The Effect of Training Set Size for the Performance of Neural Networks of Classification

Although multilayer perceptrons and radial basis function networks both belong to the class of artificial neural networks and are used for similar tasks, they have very different structures and training mechanisms. Some researchers have reported better performance with radial basis function networks, while others have obtained different results with multilayer perceptrons. This paper compares the...


Supervised Models C1.2 Multilayer perceptrons

This section introduces multilayer perceptrons, which are the most commonly used type of neural network. The popular backpropagation training algorithm is studied in detail. The momentum and adaptive step size techniques, which are used for accelerated training, are discussed. Other acceleration techniques are briefly referenced. Several implementation issues are then examined. The issue of gen...

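The entry above mentions momentum and adaptive step size as acceleration techniques for backpropagation. Below is a minimal sketch of both, assuming a user-supplied loss and grad; the adaptive rule shown (grow the step while the loss falls, shrink it otherwise) is one common variant and not necessarily the one treated in that chapter.

```python
import numpy as np

def train_momentum_adaptive(w, loss, grad, lr=0.01, mu=0.9,
                            inc=1.05, dec=0.5, steps=1000):
    """Gradient descent with momentum and a simple adaptive step size:
    increase the learning rate while the loss decreases, shrink it otherwise."""
    v = np.zeros_like(w)              # momentum (velocity) term
    prev = loss(w)
    for _ in range(steps):
        v = mu * v - lr * grad(w)     # momentum update
        w_new = w + v
        cur = loss(w_new)
        if cur <= prev:               # accepted step: speed up
            w, prev, lr = w_new, cur, lr * inc
        else:                         # rejected step: slow down, reset momentum
            lr *= dec
            v[:] = 0.0
    return w

# Toy usage on the quadratic loss ||w||^2 (an assumed example, not from the chapter)
w = train_momentum_adaptive(np.array([3.0, -2.0]),
                            loss=lambda w: float(w @ w),
                            grad=lambda w: 2 * w)
print(w)   # approaches [0, 0]
```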


Journal:
  • Informatica, Lith. Acad. Sci.

Volume 14  Issue 

Pages  -

Publication date 2003