Search results for: recurrent neural network

Number of results: 942527

Journal: CoRR 2017
Minmin Chen

We introduce MinimalRNN, a new recurrent neural network architecture that achieves performance comparable to the popular gated RNNs with a simplified structure. It employs minimal updates within the RNN, which not only leads to efficient learning and testing but, more importantly, to better interpretability and trainability. We demonstrate that by endorsing the more restrictive update rule, MinimalRNN l...
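One plausible reading of the "minimal update" this abstract describes is a single update gate that interpolates between the previous hidden state and a transformed input; the exact parameterization below (weight names `W_x`, `U_h`, `U_z`, `b`) is an assumption for illustration, not taken from the abstract itself.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def minimal_rnn_step(h_prev, x, W_x, U_h, U_z, b):
    """One hedged sketch of a minimal recurrent update.

    The input is first mapped into the latent space (z), then the hidden
    state is updated only by a gated convex combination with z. All weight
    names here are illustrative assumptions.
    """
    z = np.tanh(W_x @ x)                     # latent input Phi(x_t)
    u = sigmoid(U_h @ h_prev + U_z @ z + b)  # single update gate
    return u * h_prev + (1.0 - u) * z        # interpolate old state and input

# Tiny demo: one step from a zero hidden state with random weights.
rng = np.random.default_rng(0)
W_x = rng.standard_normal((4, 3))
U_h = rng.standard_normal((4, 4))
U_z = rng.standard_normal((4, 4))
b = np.zeros(4)
h1 = minimal_rnn_step(np.zeros(4), rng.standard_normal(3), W_x, U_h, U_z, b)
```

Because the new state is a convex combination of the old state and a tanh-bounded input, each coordinate stays in [-1, 1] when the state starts there, which is part of what makes such updates easy to analyze.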

1998
E. D. Sontag, Y. Qiao

This paper studies controllability properties of recurrent neural networks. The new contributions are: (1) an extension of a previous result to a slightly different model, (2) a formulation and proof of a necessary and sufficient condition, and (3) an analysis of a low-dimensional case for which the hypotheses made in previous work do not apply.

Journal: Journal of Machine Learning Research 2017
Herbert Jaeger

Biological brains can learn, recognize, organize, and re-generate large repertoires of temporal patterns. Here I propose a mechanism of neurodynamical pattern learning and representation, called conceptors, which offers an integrated account of a number of such phenomena and functionalities. It becomes possible to store a large number of temporal patterns in a single recurrent neural network. I...

2017
Keisuke Sakaguchi, Kevin Duh, Matt Post, Benjamin Van Durme

The Cmabrigde Uinervtisy (Cambridge University) effect from the psycholinguistics literature has demonstrated a robust word processing mechanism in humans, where jumbled words (e.g. Cmabrigde / Cambridge) are recognized with little cost. Inspired by the findings from the Cmabrigde Uinervtisy effect, we propose a word recognition model based on a semi-character level recurrent neural network (sc...

2015
Shiliang Zhang, Hui Jiang, Mingbin Xu, Junfeng Hou, Li-Rong Dai

In this paper, we propose the new fixed-size ordinally-forgetting encoding (FOFE) method, which can almost uniquely encode any variable-length sequence of words into a fixed-size representation. FOFE can model the word order in a sequence using a simple ordinally-forgetting mechanism according to the positions of words. In this work, we have applied FOFE to feedforward neural network language mo...
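The abstract's "ordinally-forgetting" mechanism can be sketched as a forgetting-factor-weighted sum of one-hot word vectors, so that earlier words are discounted by powers of a constant alpha. A minimal sketch, assuming a one-hot vocabulary encoding; the function name and default alpha are illustrative.

```python
import numpy as np

def fofe_encode(word_ids, vocab_size, alpha=0.7):
    """Encode a variable-length sequence of word ids into one fixed-size vector.

    At each position the running code is scaled by the forgetting factor
    alpha, then the one-hot vector of the current word is added, so older
    words contribute with exponentially decaying weight.
    """
    z = np.zeros(vocab_size)
    for w in word_ids:
        z = alpha * z   # discount everything seen so far
        z[w] += 1.0     # add the one-hot vector of the current word
    return z

# Example: vocabulary of 5 words, sequence [2, 0, 2] with alpha = 0.5.
code = fofe_encode([2, 0, 2], vocab_size=5, alpha=0.5)
# code → [0.5, 0.0, 1.25, 0.0, 0.0]
```

Note how word order matters: the word at index 2 appears twice, contributing 0.5^2 + 1 = 1.25, while swapping the sequence order would change the weights, which is what lets FOFE "almost uniquely" encode sequences for alpha in (0, 0.5].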

2005
Rahib Hidayat Abiyev

This paper presents the development of a recurrent neural network based fuzzy inference system for identification and control of a dynamic nonlinear plant. The structure and algorithms of the fuzzy system based on a recurrent neural network are described. To train the unknown parameters of the system, a supervised learning algorithm is used. As a result of learning, the rules of the neuro-fuzzy system are forme...

2003
Edgar N. Sanchez, Jose P. Perez

This chapter presents an application of neural networks to chaos synchronization. The two main methodologies, on which the approach is based, are recurrent neural networks and inverse optimal control for nonlinear systems. On the basis of the last technique, chaos is first produced by a stable recurrent neural network; an adaptive recurrent neural controller is then developed for chaos synchron...

2004
A. Steven Younger, Sepp Hochreiter, Peter R. Conwell

This paper introduces gradient descent methods applied to meta-learning (learning how to learn) in Neural Networks. Meta-learning has been of interest in the machine learning field for decades because of its appealing applications to intelligent agents, non-stationary time series, autonomous robots, and improved learning algorithms. Many previous neural network-based approaches toward meta-learning ha...

2016
Marco Fraccaro, Søren Kaae Sønderby, Ulrich Paquet, Ole Winther

How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural generative model. The clear separation of deterministic and stochastic layers allows a structured va...

Journal: CoRR 2015
Francesco Visin, Kyle Kastner, Kyunghyun Cho, Matteo Matteucci, Aaron C. Courville, Yoshua Bengio

In this paper, we propose a deep neural network architecture for object recognition based on recurrent neural networks. The proposed network, called ReNet, replaces the ubiquitous convolution+pooling layer of the deep convolutional neural network with four recurrent neural networks that sweep horizontally and vertically in both directions across the image. We evaluate the proposed ReNet on thre...
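The sweeping pattern this abstract describes can be sketched directly: each row is processed by an RNN in both directions, then each column of the resulting feature map is processed the same way. A toy tanh recurrence stands in for the full (gated) RNNs of the paper; all weight and function names below are illustrative assumptions.

```python
import numpy as np

def sweep(seq, W_in, W_h):
    """Run a simple tanh RNN over seq of shape (T, d_in); return all hidden states (T, d_h)."""
    h = np.zeros(W_h.shape[0])
    out = []
    for x in seq:
        h = np.tanh(W_in @ x + W_h @ h)
        out.append(h)
    return np.stack(out)

def renet_layer(image, W_in1, W_h1, W_in2, W_h2):
    """Sketch of one ReNet-style layer on image of shape (H, W, C).

    Stage 1: sweep every row left-to-right and right-to-left, concatenating
    the two hidden sequences. Stage 2: sweep every column of that feature
    map top-to-bottom and bottom-to-top. Output shape: (H, W, 2 * d_h).
    """
    H, Wd, _ = image.shape
    horiz = np.stack([
        np.concatenate([sweep(image[r], W_in1, W_h1),
                        sweep(image[r][::-1], W_in1, W_h1)[::-1]], axis=-1)
        for r in range(H)])                       # (H, W, 2*d_h)
    vert = np.stack([
        np.concatenate([sweep(horiz[:, c], W_in2, W_h2),
                        sweep(horiz[:, c][::-1], W_in2, W_h2)[::-1]], axis=-1)
        for c in range(Wd)], axis=1)              # (H, W, 2*d_h)
    return vert
```

Because every output position has seen its entire row and column (in both directions), each feature vector depends on the whole image, unlike a convolution's local receptive field, which is the design point the abstract highlights.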

[Chart: number of search results per year]
