Input space bifurcation manifolds of recurrent neural networks
Authors
Abstract
We derive analytical expressions for the local codimension-1 bifurcations of a fully connected, additive, discrete-time recurrent neural network (RNN), where the external inputs are regarded as bifurcation parameters. The complexity of the resulting bifurcation diagrams increases exponentially with the number of neurons. We show that a three-neuron cascaded network can serve as a universal oscillator whose amplitude and frequency are completely controlled by the input parameters.
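To make the setting concrete, the following sketch (our illustration, not the paper's code) takes one common formulation of an additive, discrete-time RNN, x_{t+1} = tanh(W x_t + u), treats the input u as the bifurcation parameter, and scans a line in input space for the point where a fixed point's leading multiplier crosses the unit circle. The weight matrix, nonlinearity, and scan range are arbitrary assumptions.

import numpy as np
from scipy.optimize import fsolve

# Example weight matrix (arbitrary assumption, not taken from the paper).
W = np.array([[2.0, -1.0],
              [1.0,  0.5]])

def fixed_point(u, x0=np.zeros(2)):
    """Solve x = tanh(W x + u) for a fixed point near x0."""
    return fsolve(lambda x: np.tanh(W @ x + u) - x, x0)

def leading_multiplier(u):
    """Largest |eigenvalue| of the fixed-point Jacobian at input u."""
    x = fixed_point(u)
    J = np.diag(1.0 - np.tanh(W @ x + u) ** 2) @ W
    return np.max(np.abs(np.linalg.eigvals(J)))

# Scan a line in input space: a local codimension-1 bifurcation manifold is
# crossed where the leading multiplier passes through 1 (eigenvalue +1:
# saddle-node, -1: period doubling, complex pair on the unit circle:
# Neimark-Sacker).
for u1 in np.linspace(-2.0, 2.0, 9):
    u = np.array([u1, 0.0])
    print(f"u1 = {u1:+.2f}  leading |multiplier| = {leading_multiplier(u):.3f}")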
Related papers
Parameter Space Structure of Continuous-Time Recurrent Neural Networks
A fundamental challenge for any general theory of neural circuits is how to characterize the structure of the space of all possible circuits over a given model neuron. As a first step in this direction, this letter begins a systematic study of the global parameter space structure of continuous-time recurrent neural networks (CTRNNs), a class of neural models that is simple but dynamically unive...
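For reference, the CTRNN class meant here is usually written in the following standard form (notation is ours):

\[
  \tau_i\,\dot{y}_i = -y_i + \sum_{j=1}^{N} w_{ji}\,\sigma(y_j + \theta_j) + I_i,
  \qquad \sigma(x) = \frac{1}{1 + e^{-x}},
\]

with time constants \(\tau_i\), weights \(w_{ji}\), biases \(\theta_j\), and external inputs \(I_i\); the parameter space in question is spanned by the weights, biases, and time constants.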
Bifurcations of Recurrent Neural Networks in Gradient Descent Learning
Asymptotic behavior of a recurrent neural network changes qualitatively at certain points in the parameter space, which are known as "bifurcation points". At bifurcation points, the output of a network can change discontinuously with the change of parameters and therefore convergence of gradient descent algorithms is not guaranteed. Furthermore, learning equations used for error gradient estima...
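A minimal illustration of this discontinuity (our sketch, not taken from the paper; the parameter values are arbitrary): a single-neuron network x_{t+1} = tanh(w x_t + b) whose long-term output jumps when the bias b crosses a saddle-node bifurcation, so a gradient with respect to b computed on one side of the bifurcation carries no information about the other side.

import numpy as np

def asymptotic_output(w, b, x0=-1.0, n_steps=500):
    """Iterate x_{t+1} = tanh(w * x_t + b) until it settles on an attractor."""
    x = x0
    for _ in range(n_steps):
        x = np.tanh(w * x + b)
    return x

w = 1.5  # self-excitation strong enough for bistability (assumed value)
for b in np.linspace(-0.5, 0.5, 11):
    print(f"b = {b:+.2f}  x_inf = {asymptotic_output(w, b):+.3f}")
# For this w the lower fixed point disappears near b ~ 0.21; the printed
# output jumps there from the negative branch to the positive one.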
Multi-Step-Ahead Prediction of Stock Price Using a New Architecture of Neural Networks
Modelling and forecasting the stock market is a challenging task for economists and engineers, since it has a dynamic structure and nonlinear characteristics. This nonlinearity affects the efficiency of price prediction. Using an Artificial Neural Network (ANN) is a proper way to model this nonlinearity, and it has been used successfully in one-step-ahead and multi-step-ahead prediction of di...
The Hopf bifurcation analysis on a time-delayed recurrent neural network in the frequency domain
2000 MSC: 92B20; 34C23; 37C27; 93C80. Keywords: recurrent neural networks; distributed delays; Hopf bifurcation; frequency domain; generalized Nyquist stability criterion. Abstract: In this paper, a class of recurrent neural networks with distributed delays and a strong kernel is studied. It is shown that the Hopf bifurcation occurs as the bifurcation parameter, the mean delay, passes a critical v...
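For orientation, a typical single-neuron instance of this model class (our illustration; the system studied in the paper may differ) is the distributed-delay equation

\[
  \dot{x}(t) = -x(t) + a\, f\!\left(\int_{0}^{\infty} k(s)\, x(t-s)\, ds\right),
  \qquad k(s) = \mu^{2} s\, e^{-\mu s},
\]

where \(k\) is the so-called strong (gamma) kernel: it integrates to one and has mean delay \(\int_0^\infty s\,k(s)\,ds = 2/\mu\), the mean-delay bifurcation parameter mentioned in the abstract.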
Journal: Neurocomputing
Volume: 64
Pages: -
Publication year: 2005