Search results for: recurrent network

Number of results: 785,363

Journal: Journal of Physics A: Mathematical and General 2000

1997
Bradley Tonkes

Natural languages exhibit context-free properties such as center-embedded clauses. Recent research has sought a model that handles these features with human-like inconsistencies, rather than like traditional discrete automata. This search has recently focused on recurrent neural networks. It has been shown theoretically that recurrent networks are computationally as powerful as Turing mach...

2004
Wei Sun, Yaonan Wang

Abstract— A recurrent fuzzy neural network (RFNN) is constructed by using a recurrent neural network (RNN) to realize fuzzy inference. In this kind of RFNN, temporal relations are embedded in the network by adding feedback connections to the first layer of the network. An RFNN-based adaptive control (RFNNBAC) scheme is also proposed, in which two RFNNs are used to identify and control the plant respec...
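The abstract above describes feedback connections on the first (membership) layer that embed temporal relations. A minimal sketch of one inference step, assuming Gaussian membership functions, self-feedback on the rule firing strengths, and a normalized weighted average for defuzzification (all layer sizes and parameter names here are illustrative, not taken from the paper):

```python
import numpy as np

class SimpleRFNN:
    """Minimal RFNN sketch: Gaussian rule firing with self-feedback."""

    def __init__(self, n_inputs, n_rules, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = rng.normal(size=(n_rules, n_inputs))
        self.widths = np.ones((n_rules, n_inputs))
        self.feedback = rng.uniform(0.0, 0.5, size=n_rules)  # recurrent weights
        self.consequents = rng.normal(size=n_rules)          # rule conclusions
        self.prev_firing = np.zeros(n_rules)                 # temporal state

    def step(self, x):
        # Firing strength of each rule (product of Gaussian memberships),
        # plus the previous firing strength fed back through recurrent weights:
        dist = ((x - self.centers) / self.widths) ** 2
        firing = np.exp(-dist.sum(axis=1)) + self.feedback * self.prev_firing
        self.prev_firing = firing
        # Normalized weighted average as a simple defuzzification:
        return float(firing @ self.consequents / (firing.sum() + 1e-12))
```

Because the firing strengths carry over between calls to `step`, identical inputs at different time steps can yield different outputs, which is precisely what makes the network suitable for the temporal identification and control tasks the abstract mentions.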

Journal: CoRR 2016
Yangfeng Ji, Gholamreza Haffari, Jacob Eisenstein

This paper presents a novel latent variable recurrent neural network architecture for jointly modeling sequences of words and (possibly latent) discourse relations that link adjacent sentences. A recurrent neural network generates individual words, thus reaping the benefits of discriminatively-trained vector representations. The discourse relations are represented with a latent variable, which ...
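The joint model described above factors as p(words, relation) = p(relation) · ∏ p(word | history, relation), with the discourse relation as a discrete latent variable that can be marginalized out. A toy sketch of that factorization (vocabulary size, hidden size, number of relations, and all parameter names are hypothetical, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
V, H, R = 10, 8, 3                    # vocab, hidden size, relations (toy sizes)
Wxh = rng.normal(0, 0.1, (H, V))      # input-to-hidden weights
Whh = rng.normal(0, 0.1, (H, H))      # recurrent weights
Who = rng.normal(0, 0.1, (V, H, R))   # per-relation output weights
prior = np.full(R, 1.0 / R)           # uniform prior over latent relations

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def sentence_loglik(words):
    """log p(words), marginalizing over the latent discourse relation."""
    logps = np.zeros(R)
    for r in range(R):
        h = np.zeros(H)
        lp = 0.0
        for w in words:
            probs = softmax(Who[:, :, r] @ h)   # word distribution given relation r
            lp += np.log(probs[w] + 1e-12)
            x = np.zeros(V); x[w] = 1.0
            h = np.tanh(Wxh @ x + Whh @ h)      # vanilla RNN state update
        logps[r] = np.log(prior[r]) + lp
    m = logps.max()                              # log-sum-exp over relations
    return m + np.log(np.exp(logps - m).sum())
```

The per-relation output weights are what let the latent variable shape word generation, while marginalization keeps the likelihood well defined when the relation is unobserved.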

Journal: Entropy 2017
Norbert Michael Mayer

Recurrent networks whose transfer functions fulfill Lipschitz continuity with L = 1 may be echo state networks if certain limitations on the recurrent connectivity are applied. Initially, it was shown that it is sufficient for the largest singular value of the recurrent connectivity S to be smaller than 1. The main achievement of this paper is a proof under which conditions the ne...
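The sufficient condition quoted in the abstract is easy to check numerically: if the largest singular value (spectral norm) of the recurrent weight matrix is below 1 and the transfer function is 1-Lipschitz (e.g. tanh), state differences contract at every step, so the reservoir forgets its initial condition. A small sketch, with sizes and scaling chosen for illustration:

```python
import numpy as np

def has_contracting_reservoir(W):
    """True if the largest singular value (spectral norm) of W is < 1."""
    return np.linalg.svd(W, compute_uv=False)[0] < 1.0

rng = np.random.default_rng(0)
W = rng.normal(size=(50, 50))
# Rescale so the spectral norm is 0.9, satisfying the sufficient condition:
W *= 0.9 / np.linalg.svd(W, compute_uv=False)[0]
print(has_contracting_reservoir(W))  # True

# Two different initial states driven by the same input sequence converge,
# since each tanh step contracts their distance by at least the factor 0.9:
x1, x2 = rng.normal(size=50), rng.normal(size=50)
u = rng.normal(size=50)
for _ in range(200):
    x1, x2 = np.tanh(W @ x1 + u), np.tanh(W @ x2 + u)
print(np.linalg.norm(x1 - x2) < 1e-6)  # True: initial conditions forgotten
```

Note that the singular-value bound is only sufficient; the spectral radius can exceed the spectral norm bound and still yield an echo state network, which is the kind of refinement the paper pursues.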

1993
Kenji Doya

Asymptotic behavior of a recurrent neural network changes qualitatively at certain points in the parameter space, which are known as "bifurcation points". At bifurcation points, the output of a network can change discontinuously with the change of parameters and therefore convergence of gradient descent algorithms is not guaranteed. Furthermore, learning equations used for error gradient estima...
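The discontinuity the abstract describes can be seen in even a one-neuron network (an illustrative toy, not the paper's setup): the settled state of x ← tanh(w·x) stays at the origin while w < 1, then jumps to a nonzero attractor once w crosses the pitchfork bifurcation at w = 1, so any loss defined on the asymptotic output changes discontinuously in w:

```python
import numpy as np

def settled_state(w, b=0.0, x0=0.5, steps=2000):
    """Iterate the one-neuron dynamics x <- tanh(w*x + b) to convergence."""
    x = x0
    for _ in range(steps):
        x = np.tanh(w * x + b)
    return x

# Below w = 1 the only attractor is 0; above it, a nonzero fixed point appears:
for w in (0.5, 0.9, 1.1, 1.5):
    print(f"w={w}: settled state = {settled_state(w):.4f}")
```

A gradient computed from the asymptotic output is undefined at the bifurcation point itself, which is why gradient descent through such dynamics needs the special care the paper analyzes.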

Journal: CoRR 2008
Santiago Fernández, Alex Graves, Jürgen Schmidhuber

We compare the performance of a recurrent neural network with the best results published so far on phoneme recognition in the TIMIT database. These published results have been obtained with a combination of classifiers. However, in this paper we apply a single recurrent neural network to the same task. Our recurrent neural network attains an error rate of 24.6%. This result is not significantly...
