Low-Dimensional Manifolds Support Multiplexed Integrations in Recurrent Neural Networks
Authors
Abstract
We study the learning dynamics and representations emerging in recurrent neural networks (RNNs) trained to integrate one or multiple temporal signals. Combining analytical and numerical investigations, we characterize the conditions under which an RNN with n neurons learns to integrate D (≪ n) scalar signals of arbitrary duration. We show, for linear, ReLU, and sigmoidal neurons, that the internal state of the network lives close to a D-dimensional manifold, whose shape is related to the activation function. Each neuron therefore carries, to various degrees, information about the value of all integrals. We discuss the deep analogy between our results and the concept of mixed selectivity forged by computational neuroscientists to interpret cortical recordings.
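The task described in the abstract lends itself to a short illustration. Below is a minimal, hypothetical sketch (not the authors' code) that trains a small ReLU RNN to output the running integrals of D scalar input signals, then inspects the singular-value spectrum of the hidden states to check that most of the variance concentrates in roughly D directions, consistent with activity lying near a low-dimensional manifold. Network size, signal statistics, and training hyperparameters are illustrative assumptions.

```python
# Minimal sketch of the integration task: an RNN with n neurons is trained to
# reproduce the running integral (cumulative sum) of D input signals.
import torch

n, D, T, batch = 64, 1, 50, 128           # n neurons, D signals, T time steps
rnn = torch.nn.RNN(input_size=D, hidden_size=n,
                   nonlinearity='relu', batch_first=True)
readout = torch.nn.Linear(n, D)
opt = torch.optim.Adam(list(rnn.parameters()) + list(readout.parameters()),
                       lr=1e-3)

for step in range(2000):
    x = 0.1 * torch.randn(batch, T, D)     # random scalar input signals
    target = torch.cumsum(x, dim=1)        # running integral to be reproduced
    h, _ = rnn(x)                          # hidden states, shape (batch, T, n)
    loss = torch.mean((readout(h) - target) ** 2)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Dimensionality check: after training, the hidden-state variance should be
# dominated by ~D principal directions if activity lies near a D-dim manifold.
with torch.no_grad():
    h, _ = rnn(0.1 * torch.randn(batch, T, D))
    H = h.reshape(-1, n)
    H = H - H.mean(dim=0)
    _, S, _ = torch.linalg.svd(H, full_matrices=False)
    var = (S ** 2) / (S ** 2).sum()
    print("variance explained by top 3 PCs:", var[:3])
```

This only illustrates the setup; the paper's analytical results concern how the shape of this manifold depends on the choice of activation function (linear, ReLU, or sigmoidal).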
Similar resources
Multi-dimensional Recurrent Neural Networks
Recurrent neural networks (RNNs) have proved effective at one dimensional sequence learning tasks, such as speech and online handwriting recognition. Some of the properties that make RNNs suitable for such tasks, for example robustness to input warping, and the ability to access contextual information, are also desirable in multidimensional domains. However, there has so far been no direct way ...
Low dimensional flat manifolds with some classes of Finsler metric
Flat Riemannian manifolds are (up to isometry) quotient spaces of the Euclidean space R^n over a Bieberbach group, and there is an exact classification of them in dimensions 2 and 3. In this paper, two classes of flat Finslerian manifolds are studied and classified in dimensions 2 and 3.
Opening the Black Box: Low-Dimensional Dynamics in High-Dimensional Recurrent Neural Networks
Recurrent neural networks (RNNs) are useful tools for learning nonlinear relationships between time-varying inputs and outputs with complex temporal dependencies. Recently developed algorithms have been successful at training RNNs to perform a wide variety of tasks, but the resulting networks have been treated as black boxes: their mechanism of operation remains unknown. Here we explore the hyp...
Input space bifurcation manifolds of recurrent neural networks
We derive analytical expressions of local codimension-1 bifurcations for a fully connected, additive, discrete-time recurrent neural network (RNN), where we regard the external inputs as bifurcation parameters. The complexity of the bifurcation diagrams obtained increases exponentially with the number of neurons. We show that a three-neuron cascaded network can serve as a universal oscillator, ...
On some generalized recurrent manifolds
The object of the present paper is to introduce and study a type of non-flat semi-Riemannian manifolds, called super generalized recurrent manifolds, which generalize both the notion of hyper generalized recurrent manifolds [A.A. Shaikh and A. Patra, On a generalized class of recurrent manifolds, Arch. Math. (Brno) 46 (2010) 71--78.] and weakly generalized recurrent manifolds ...
Journal
Journal title: Neural Computation
Year: 2021
ISSN: 0899-7667, 1530-888X
DOI: https://doi.org/10.1162/neco_a_01366