A new IIR-MLP learning algorithm for on-line signal processing
Authors
Abstract
In this paper we propose a new learning algorithm for locally recurrent neural networks, called Truncated Recursive Back Propagation, which can be easily implemented on-line with good performance. Moreover, it generalises the algorithm proposed by Waibel et al. for TDNNs, and includes the Back and Tsoi algorithm, as well as BPS and standard on-line Back Propagation, as particular cases. The proposed algorithm has a memory and computational complexity that can be adjusted through a careful choice of two parameters h and h', and so it is more flexible than our previous algorithm. Although for the sake of brevity we present the new algorithm only for IIR-MLP networks, it can also be applied to any other locally recurrent neural network. Some computer simulations of dynamical system identification tests reported in the literature are also presented to assess the performance of the proposed algorithm applied to the IIR-MLP.
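To make the idea of truncating a recursive gradient concrete, the sketch below trains a single IIR synapse on-line and unrolls the recursive part of the gradient only h steps back before dropping it. It is purely illustrative and not the authors' TRBP: the network-level parameter h', which in the paper bounds the backward propagation of the deltas through the layers, has no counterpart for a single synapse, and the learning rate, truncation depth, and toy target system are assumptions made for the example.

```python
# Illustrative sketch only (NOT the authors' exact TRBP): a single IIR synapse
#   y[n] = sum_i b[i] * x[n-i] + sum_j a[j] * y[n-j-1]
# trained on-line by gradient descent, where the recursive gradient
# dy[n]/dtheta is unrolled only `h` steps back before being truncated.
# h, lr and the toy target system below are assumptions for the example.
import numpy as np

rng = np.random.default_rng(0)
M, N = 2, 2                      # MA order (b has M+1 taps) and AR order
b = np.zeros(M + 1)              # feedforward (moving-average) coefficients
a = np.zeros(N)                  # feedback (autoregressive) coefficients
lr, h = 0.01, 3                  # learning rate and truncation depth h

x_hist = np.zeros(M + 1 + h * N)   # x_hist[k] = x[n-k]
y_hist = np.zeros(N + 1 + h * N)   # y_hist[k] = y[n-k] (after the push below)
d_hist = np.zeros(N + 1)           # past outputs of the unknown target system

def dy_db(i, k, depth):
    """Truncated estimate of dy[n-k]/db[i]; the feedback contribution is
    dropped once `depth` recursive unrollings have been spent."""
    direct = x_hist[k + i]
    if depth == 0:
        return direct
    return direct + sum(a[j] * dy_db(i, k + j + 1, depth - 1) for j in range(N))

def dy_da(j, k, depth):
    """Truncated estimate of dy[n-k]/da[j] (a[j] multiplies y[n-k-j-1])."""
    direct = y_hist[k + j + 1]
    if depth == 0:
        return direct
    return direct + sum(a[m] * dy_da(j, k + m + 1, depth - 1) for m in range(N))

for n in range(3000):
    x = rng.standard_normal()
    x_hist = np.roll(x_hist, 1); x_hist[0] = x

    # synapse output, then push it into the output history
    y = b @ x_hist[:M + 1] + a @ y_hist[:N]
    y_hist = np.roll(y_hist, 1); y_hist[0] = y

    # toy target: an unknown stable IIR system to be identified (illustrative)
    d = 0.4 * x_hist[0] + 0.2 * x_hist[1] + 0.5 * d_hist[0]
    d_hist = np.roll(d_hist, 1); d_hist[0] = d

    # instantaneous error and truncated-gradient update (no stability check)
    e = d - y
    b += lr * e * np.array([dy_db(i, 0, h) for i in range(M + 1)])
    a += lr * e * np.array([dy_da(j, 0, h) for j in range(N)])

# ideally drifts towards b ~ [0.4, 0.2, 0] and a ~ [0.5, 0]
print("learned b:", b, "learned a:", a)
```

With h = 0 the update degenerates to an instantaneous (equation-error-like) approximation of the gradient, while letting h grow recovers the full recursive gradient at a higher memory and computational cost, which is the trade-off controlled by the truncation parameters mentioned in the abstract.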
Similar resources
Fast adaptive IIR-MLP neural networks for signal processing applications
Neural networks with internal temporal dynamics can be applied to non-linear DSP problems. The classical fully connected recurrent architectures can be replaced by less complex neural networks, based on the well known MultiLayer Perceptron (MLP), where the temporal dynamics are modelled by replacing each synapse either with a FIR filter or with an IIR filter. A general learning algorithm (Back-Pr...
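To clarify the architecture referred to above, here is a minimal sketch of a single locally recurrent neuron in which every synapse is a small IIR filter (or, with no feedback taps, a FIR filter) and a static nonlinearity follows the sum of the filtered signals. Class names, filter orders, and coefficient values are illustrative assumptions; no training is shown.

```python
# Minimal sketch of one IIR-MLP neuron (architecture only, no learning).
import numpy as np

class IIRSynapse:
    def __init__(self, b, a):
        self.b = np.asarray(b, float)   # feedforward (MA) coefficients
        self.a = np.asarray(a, float)   # feedback (AR) coefficients (may be empty)
        self.x_hist = np.zeros(len(self.b))
        self.y_hist = np.zeros(len(self.a))

    def step(self, x):
        self.x_hist = np.roll(self.x_hist, 1); self.x_hist[0] = x
        y = self.b @ self.x_hist + self.a @ self.y_hist
        if self.a.size:                 # pure-FIR synapses keep no output history
            self.y_hist = np.roll(self.y_hist, 1); self.y_hist[0] = y
        return y

class IIRNeuron:
    def __init__(self, synapses):
        self.synapses = synapses

    def step(self, inputs):
        # sum the filtered synaptic signals, then apply a static nonlinearity
        s = sum(syn.step(x) for syn, x in zip(self.synapses, inputs))
        return np.tanh(s)

# two-input neuron: a first-order IIR synapse and a pure FIR synapse
neuron = IIRNeuron([IIRSynapse([0.5, 0.2], [0.3]),
                    IIRSynapse([1.0, -0.4], [])])
for n in range(5):
    print(neuron.step([np.sin(0.1 * n), np.cos(0.1 * n)]))
```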
Causal Back Propagation Through Time for Locally Recurrent Neural Networks
This paper concerns dynamic neural networks for signal processing: architectural issues are considered, but the paper focuses on learning algorithms that work on-line. Locally recurrent neural networks, namely MLPs with IIR synapses and generalizations of Local Feedback MultiLayered Networks (LF MLN), are compared to more traditional neural networks, i.e. static MLPs with input and/or output buffer...
On-line learning algorithms for neural networks with IIR synapses
This paper is focused on the learning algorithms for dynamic multilayer perceptron neural networks where each neuron synapse is modelled by an infinite impulse response (IIR) filter (IIR MLP). In particular, the Backpropagation Through Time (BPTT) algorithm and its less demanding approximated on-line versions are considered. In fact it is known that the BPTT algorithm is not causal and therefo...
متن کاملOn-Line Algorithms for Neural Networks with IIR Synapses
This paper is focused on the learning algorithms for dynamic multilayer perceptron neural networks where each neuron synapsis is modelled by an infinite impulse response (IIR) filter (IIR MLP). In particular, the Backpropagation Through Time (BPTT) algorithm and its less demanding approximated on-line versions are considered. In fact it is known that the BPTT algorithm is not causal and therefo...