Search results for: recurrent input

Number of results: 345825

2017
Jonas Gehring, Michael Auli, David Grangier, Denis Yarats, Yann Dauphin

The prevalent approach to sequence-to-sequence learning maps an input sequence to a variable-length output sequence via recurrent neural networks. We introduce an architecture based entirely on convolutional neural networks. Compared to recurrent models, computations over all elements can be fully parallelized during training, and optimization is easier since the number of non-linearities is fi...
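To illustrate the parallelism argument in this abstract (a minimal toy sketch, not the authors' ConvS2S implementation), note that a recurrent update must walk the sequence step by step, while a convolutional layer can process every window of the input in one batched operation:

```python
import numpy as np

# Toy sequence: 6 time steps, 4 features each.
rng = np.random.default_rng(0)
x = rng.standard_normal((6, 4))

# Recurrent update: each hidden state depends on the previous one,
# so this loop cannot be parallelized across time steps.
W_h, W_x = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
h = np.zeros(4)
for t in range(len(x)):
    h = np.tanh(h @ W_h + x[t] @ W_x)

# Convolution: each output position depends only on a fixed input
# window, so all positions can be computed at once (here, stacked
# windows for a width-3 filter, reduced in a single vectorized op).
kernel = rng.standard_normal((3, 4))
windows = np.stack([x[t:t + 3] for t in range(len(x) - 2)])
conv_out = np.tanh((windows * kernel).sum(axis=(1, 2)))
print(conv_out.shape)  # (4,)
```

The weight shapes and window width here are arbitrary; the point is only the data dependency, not the specific architecture of the paper.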

Journal: IEEE Transactions on Neural Networks, 1996
Yoshua Bengio, Paolo Frasconi

We consider problems of sequence processing and propose a solution based on a discrete-state model in order to represent past context. We introduce a recurrent connectionist architecture having a modular structure that associates a subnetwork to each state. The model has a statistical interpretation we call input-output hidden Markov model (IOHMM). It can be trained by the estimation-maximizati...

2005
J. Krishnaiah, C. S. Kumar, M. A. Faruqi

Many real-world processes tend to be chaotic and are not amenable to satisfactory analytical models. It has been shown here that for such chaotic processes represented through short chaotic noisy observed data, a multi-input and multi-output recurrent neural network can be built which is capable of capturing the process trends and predicting the behaviour for any given starting condition. It is...

2013
Michiel Hermans, Benjamin Schrauwen

Time series often have a temporal hierarchy, with information that is spread out over multiple time scales. Common recurrent neural networks, however, do not explicitly accommodate such a hierarchy, and most research on them has been focusing on training algorithms rather than on their basic architecture. In this paper we study the effect of a hierarchy of recurrent neural networks on processin...

2008
Jianing Shi, Jim Wielaard, Paul Sajda

We investigate using a previously developed spiking neuron model of layer 4 of primary visual cortex (V1) [1] as a recurrent network whose activity is consequently linearly decoded, given a set of complex visual stimuli. Our motivation is based on the following: 1) Linear decoders have proven useful in analyzing a variety of neural signals, including spikes, firing rates, local field potentials...

2011
Chia-Feng Juang

Recurrent fuzzy neural networks (FNNs) have been widely applied to dynamic system processing problems. However, most recurrent FNNs focus on the use of type-1 fuzzy sets. This paper proposes a Mamdani-type recurrent interval type-2 FNN (M-RIT2FNN) that uses interval type-2 fuzzy sets in both rule antecedent and consequent parts. The reason for using interval type-2 fuzzy sets is to increase net...

Journal: CoRR, 2016
Hendrik Strobelt, Sebastian Gehrmann, Bernd Huber, Hanspeter Pfister, Alexander M. Rush

Recurrent neural networks, and in particular long short-term memory networks (LSTMs), are a remarkably effective tool for sequence modeling that learn a dense black-box hidden representation of their sequential input. Researchers interested in better understanding these models have studied the changes in hidden state representations over time and noticed some interpretable patterns but also sig...
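The hidden-state trajectories that this line of work inspects are easy to expose. A minimal sketch follows, with a plain tanh RNN cell standing in for an LSTM (a hypothetical stand-in for illustration, not the tooling described in the abstract); the pattern of recording one hidden vector per time step is the same:

```python
import numpy as np

# Hypothetical toy cell: a tanh RNN stands in for an LSTM. The point
# is only how the per-step hidden states are collected for inspection.
rng = np.random.default_rng(1)
W_h = rng.standard_normal((8, 8)) * 0.1   # hidden-to-hidden weights
W_x = rng.standard_normal((8, 3)) * 0.1   # input-to-hidden weights

def hidden_states(inputs):
    """Run the cell over `inputs`; return one hidden vector per step."""
    h, states = np.zeros(8), []
    for x_t in inputs:
        h = np.tanh(W_h @ h + W_x @ x_t)
        states.append(h)
    return np.stack(states)               # shape: (time, hidden_units)

traj = hidden_states(rng.standard_normal((5, 3)))
print(traj.shape)  # (5, 8)
```

Each column of `traj` is the activation of one hidden unit over time; plotting such columns against the input tokens is the kind of per-unit view that reveals the "interpretable patterns" the abstract mentions.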

[Chart: number of search results per year]