Similar Resources
Learning with recurrent neural networks
This thesis examines so-called folding neural networks as a mechanism for machine learning. Folding networks generalize partial recurrent neural networks so that they can deal with tree-structured inputs instead of simple linear lists. In particular, they can handle classical formulas, the purpose for which they were originally proposed. After a short explanation of the neur...
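The core mechanism these networks share, applying one set of weights recursively over a tree so that structured input collapses into a fixed-size vector, can be illustrated in a few lines. The sketch below is a generic recursive encoder under assumed illustrative dimensions; it is not the thesis's exact model, and all names here are hypothetical.

import numpy as np

rng = np.random.default_rng(0)
DIM = 8                                   # illustrative embedding size, an assumption
W = rng.normal(0, 0.1, (DIM, 2 * DIM))    # shared "folding" weights, reused at every node
b = np.zeros(DIM)

def fold(tree):
    """Recursively encode a binary tree; leaves are DIM-sized vectors,
    internal nodes are (left, right) pairs."""
    if isinstance(tree, np.ndarray):      # leaf: already a vector
        return tree
    left, right = tree
    child = np.concatenate([fold(left), fold(right)])
    return np.tanh(W @ child + b)         # same weights applied at every node

# Example: encode the formula tree (a AND (b OR c)) with random leaf codes
a, bb, c = (rng.normal(size=DIM) for _ in range(3))
encoding = fold((a, (bb, c)))
print(encoding.shape)                     # (8,): fixed-size code for a variable-shape tree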
Imitation Learning with Recurrent Neural Networks
We present a novel view that unifies two frameworks that aim to solve sequential prediction problems: learning to search (L2S) and recurrent neural networks (RNN). We point out equivalences between elements of the two frameworks. By complementing what is missing from one framework compared to the other, we introduce a more advanced imitation learning framework that, on one hand, augments L2S's...
Learning grammars with recurrent neural networks
Most language acquisition models constructed so far are based on traditional AI approaches. Artificial neural networks (ANNs), by contrast, offer strengths such as the ability to learn, generalization capability, and robustness. However, they are poor at representing and manipulating compositional structures, and are consi...
Gradual Learning of Deep Recurrent Neural Networks
Deep Recurrent Neural Networks (RNNs) achieve state-of-the-art results in many sequence-to-sequence tasks. However, deep RNNs are difficult to train and suffer from overfitting. We introduce a training method that trains the network gradually, and treats each layer individually, to achieve improved results in language modelling tasks. Training a deep LSTM with Gradual Learning (GL) obtains perple...
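One plausible reading of "gradually, treating each layer individually" is growing a stacked LSTM one layer at a time and training after each growth step. The sketch below shows that schedule in PyTorch; the paper's exact GL procedure, losses, and hyperparameters are not given in the abstract, so everything here (sizes, step counts, the GrowingLSTM class itself) is an assumption.

import torch
import torch.nn as nn

class GrowingLSTM(nn.Module):
    def __init__(self, vocab=1000, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.layers = nn.ModuleList()      # LSTM layers are added over time
        self.head = nn.Linear(hidden, vocab)
        self.hidden = hidden

    def add_layer(self):
        self.layers.append(nn.LSTM(self.hidden, self.hidden, batch_first=True))

    def forward(self, tokens):
        x = self.embed(tokens)
        for lstm in self.layers:
            x, _ = lstm(x)                 # feed each layer's output upward
        return self.head(x)

model = GrowingLSTM()
data = torch.randint(0, 1000, (4, 20))     # toy batch of token ids
for depth in range(3):                     # grow the stack to 3 layers, one at a time
    model.add_layer()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # re-created to include new layer
    for _ in range(5):                     # brief training phase at each depth
        logits = model(data[:, :-1])       # next-token prediction on the toy batch
        loss = nn.functional.cross_entropy(
            logits.reshape(-1, 1000), data[:, 1:].reshape(-1))
        opt.zero_grad(); loss.backward(); opt.step()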
Deep Q-Learning With Recurrent Neural Networks
Deep reinforcement learning models have proven to be successful at learning control policies from image inputs. They have, however, struggled with learning policies that require longer-term information. Recurrent neural network architectures have been used in tasks dealing with longer-term dependencies between data points. We investigate these architectures to overcome the difficulties arising from ...
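The basic idea of combining Q-learning with a recurrent architecture is to let an LSTM carry information across timesteps, so Q-values can depend on observation history rather than on a single frame. Below is a minimal sketch of such a recurrent Q-network's forward pass; the architecture, sizes, and names are illustrative assumptions, not the paper's model.

import torch
import torch.nn as nn

class RecurrentQNet(nn.Module):
    def __init__(self, obs_dim=16, n_actions=4, hidden=32):
        super().__init__()
        self.encoder = nn.Linear(obs_dim, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.q_head = nn.Linear(hidden, n_actions)

    def forward(self, obs_seq, state=None):
        # obs_seq: (batch, time, obs_dim); state carries memory between calls
        x = torch.relu(self.encoder(obs_seq))
        x, state = self.lstm(x, state)
        return self.q_head(x), state       # Q-values at every timestep

net = RecurrentQNet()
obs = torch.randn(2, 10, 16)               # toy batch of observation sequences
q_values, memory = net(obs)                # memory can seed the next sequence segment
action = q_values[:, -1].argmax(dim=-1)    # greedy action at the final step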
Journal
Journal title: Journal of Japan Society for Fuzzy Theory and Systems
Year: 1995
ISSN: 0915-647X, 2432-9932
DOI: 10.3156/jfuzzy.7.1_52