Search results for: recurrent input
Number of results: 345825
Recurrent neural networks are powerful sequence learners. They are able to incorporate context information in a flexible way, and are robust to localised distortions of the input data. These properties make them well suited to sequence labelling, where input sequences are transcribed with streams of labels. Long short-term memory is an especially promising recurrent architecture, able to bridge...
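As an illustration of the kind of recurrent sequence labeller described above, here is a minimal numpy sketch of a single LSTM cell emitting per-timestep label scores. The weight shapes, random initialization, and linear readout are assumptions for illustration, not the setup used in the paper.

```python
# Minimal sketch: an LSTM cell run over a sequence, producing label scores per step.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_label_sequence(xs, Wx, Wh, b, Wy, by):
    """Run an LSTM over inputs xs (T x d_in) and emit label scores per timestep."""
    hidden = Wh.shape[0]
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    scores = []
    for x in xs:
        z = x @ Wx + h @ Wh + b                        # all four gates in one affine map
        i, f, o, g = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)   # cell state can bridge long time lags
        h = sigmoid(o) * np.tanh(c)
        scores.append(h @ Wy + by)                     # per-timestep label scores
    return np.stack(scores)

# Usage with random (illustrative) weights
rng = np.random.default_rng(0)
d_in, hidden, n_labels, T = 5, 8, 3, 20
Wx = rng.normal(0, 0.1, (d_in, 4 * hidden))
Wh = rng.normal(0, 0.1, (hidden, 4 * hidden))
b = np.zeros(4 * hidden)
Wy = rng.normal(0, 0.1, (hidden, n_labels))
by = np.zeros(n_labels)
print(lstm_label_sequence(rng.normal(size=(T, d_in)), Wx, Wh, b, Wy, by).shape)  # (20, 3)
```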
Abstract: In this paper, an Artificial Neural Network (ANN) was used for modeling the nonlinear structure of a debutanizer column in a refinery gas process plant. The actual input-output data of the system were measured in order to be used for system identification based on a root mean square error (RMSE) minimization approach. It was shown that the designed recurrent neural network is able to pr...
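The identification criterion named in this abstract can be shown in a few lines: compare one-step-ahead predictions of a recurrent model against measured outputs and score them with RMSE. The sketch below uses synthetic data and arbitrary weights, not the plant measurements or network from the paper.

```python
# Illustrative sketch of the RMSE criterion applied to a simple recurrent predictor.
import numpy as np

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def recurrent_predict(u, Wu, Wh, Wo):
    """One-step-ahead predictions of a simple recurrent model driven by input u."""
    h = np.zeros(Wh.shape[0])
    y_pred = []
    for u_t in u:
        h = np.tanh(Wu @ np.atleast_1d(u_t) + Wh @ h)
        y_pred.append(Wo @ h)
    return np.array(y_pred).ravel()

rng = np.random.default_rng(1)
u = rng.normal(size=200)                               # measured input (synthetic here)
y = np.convolve(u, [0.5, 0.3, 0.2], mode="same")       # stand-in for measured output
Wu = rng.normal(0, 0.3, (8, 1))
Wh = rng.normal(0, 0.3, (8, 8))
Wo = rng.normal(0, 0.3, (1, 8))
print("RMSE:", rmse(y, recurrent_predict(u, Wu, Wh, Wo)))
```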
An adaptive input-output linearization method for general nonlinear systems is developed without using the states of the system. Another key feature of this structure is that it does not need a model of the system. In this scheme, the neurolinearizer has few weights, so it is practical in adaptive situations. Online training of the neuroline...
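A rough sketch of the "few weights, trained online" idea: a tiny network is updated one sample at a time toward a desired linear input-output response. The plant, target response, and numerical-gradient update here are assumptions made for illustration, not the adaptive scheme of the paper.

```python
# Hedged sketch: online (per-sample) training of a network with only a few weights.
import numpy as np

rng = np.random.default_rng(2)
w = rng.normal(0, 0.1, 3)          # a "neurolinearizer" with only a few weights
lr = 0.05

def net(x, w):
    return np.tanh(w[0] * x + w[1]) * w[2]

for step in range(500):            # online loop: one sample at a time
    x = rng.uniform(-1, 1)
    target = 0.8 * x               # desired (assumed) linearized response
    y = net(x, w)
    err = y - target
    # finite-difference gradient keeps the sketch model-free and short
    grad = np.array([(net(x, w + eps) - y) / 1e-5 for eps in np.eye(3) * 1e-5])
    w -= lr * err * grad

print("trained weights:", w)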
Recurrent neural networks have been established as a general tool for fitting sequential input-output data. On the other hand, Fourier analysis is a useful tool for time series analysis. In this paper, these two fields are linked together to form a new interpretation of recurrent networks for time series prediction. Fourier analysis of a time series is applied to construct a complex-valued recurren...
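One way to picture the link between the two fields: the Fourier coefficients of a series can serve as output weights of a complex-valued recurrent network whose diagonal recurrent weights are unit-circle rotations, one oscillator per frequency. The construction below is an illustrative sketch of that idea, not necessarily the paper's exact formulation.

```python
# Hedged sketch: Fourier coefficients as readout weights of a complex-valued recurrence.
import numpy as np

t = np.arange(64)
series = np.sin(2 * np.pi * t / 16) + 0.5 * np.cos(2 * np.pi * t / 8)
N = len(series)
coeffs = np.fft.fft(series) / N                     # readout weights
rotations = np.exp(2j * np.pi * np.arange(N) / N)   # diagonal recurrent weights

state = np.ones(N, dtype=complex)                   # state at t = 0
recon = []
for _ in range(N):
    recon.append(np.real(coeffs @ state))           # readout: weighted sum of oscillators
    state = rotations * state                       # each unit rotates at its own frequency

print("max reconstruction error:", np.max(np.abs(np.array(recon) - series)))
```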
This paper describes the use of recurrent neural networks for phoneme recognition. Spectral, Bark-scaled, and cepstral representations for input to the networks are discussed, and an additional input based on algorithmically defined features is described that can also be used as input for phoneme recognition. Neural networks with recurrent hidden layers of various sizes are trained to determine...
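For concreteness, here is one common cepstral input representation, the real cepstrum of a windowed frame: log-magnitude spectrum followed by an inverse transform, keeping the low-quefrency coefficients. The frame length, coefficient count, and synthetic signal are assumptions; the spectral and Bark-scaled features in the paper may differ.

```python
# Hedged sketch of a cepstral feature frame used as network input.
import numpy as np

def cepstral_frame(frame, n_coeffs=13):
    windowed = frame * np.hamming(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed)) + 1e-10   # avoid log(0)
    cepstrum = np.fft.irfft(np.log(spectrum))
    return cepstrum[:n_coeffs]                          # low-quefrency coefficients

rng = np.random.default_rng(3)
frame = rng.normal(size=400)                            # e.g. 25 ms at 16 kHz (synthetic)
print(cepstral_frame(frame))                            # 13 input features per frame
```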
There exist many problem domains where the interpretability of neural network models is essential for deployment. Here we introduce a recurrent architecture composed of input-switched affine transformations – in other words, an RNN without any explicit nonlinearities, but with input-dependent recurrent weights. This simple form allows the RNN to be analyzed via straightforward linear methods: we ...
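The update rule described here is simple enough to sketch directly: each discrete input symbol selects its own weight matrix and bias, and the hidden state is updated affinely with no elementwise nonlinearity. Sizes and random weights below are illustrative.

```python
# Minimal sketch of an input-switched affine RNN.
import numpy as np

rng = np.random.default_rng(4)
n_symbols, hidden = 5, 8
W = rng.normal(0, 0.3, (n_symbols, hidden, hidden))   # one recurrent matrix per symbol
b = rng.normal(0, 0.1, (n_symbols, hidden))           # one bias per symbol

def run(symbols):
    h = np.zeros(hidden)
    for s in symbols:
        h = W[s] @ h + b[s]        # input-dependent affine update, no nonlinearity
    return h

print(run([0, 3, 1, 1, 4]))
```

Because every step is affine, the effect of an entire input sequence composes into a single affine map, which is what makes the model amenable to straightforward linear analysis.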
Recurrent neural networks are frequently used in the cognitive science community for modeling linguistic structures. A more or less intensive training process is usually performed, but several works have shown that untrained recurrent networks initialized with small weights can also be successfully used for this type of task. In this work we demonstrate that the state space organization of untrained recu...
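The object of study here, an untrained recurrent network with small random weights driven by symbol sequences, is easy to set up; the sketch below only collects state trajectories for inspection. The weight scales and symbol inputs are illustrative assumptions, not the paper's exact configuration.

```python
# Hedged sketch: state trajectories of an untrained, small-weight recurrent network.
import numpy as np

rng = np.random.default_rng(5)
n_symbols, hidden = 4, 16
Win = rng.normal(0, 0.1, (hidden, n_symbols))   # small input weights
Wrec = rng.normal(0, 0.1, (hidden, hidden))     # small, untrained recurrent weights

def states(symbols):
    h = np.zeros(hidden)
    trajectory = []
    for s in symbols:
        x = np.eye(n_symbols)[s]                # one-hot symbol input
        h = np.tanh(Win @ x + Wrec @ h)
        trajectory.append(h.copy())
    return np.stack(trajectory)

seq = rng.integers(0, n_symbols, size=50)
traj = states(seq)
# print the trajectory shape and its average spread across state dimensions
print(traj.shape, np.std(traj, axis=0).mean())
```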
We introduce segmental recurrent neural networks (SRNNs) which define, given an input sequence, a joint probability distribution over segmentations of the input and labelings of the segments. Representations of the input segments (i.e., contiguous subsequences of the input) are computed by encoding their constituent tokens using bidirectional recurrent neural nets, and these “segment embeddings...
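One ingredient of this model, the segment embedding, can be sketched on its own: run forward and backward recurrences over the tokens of a contiguous segment and concatenate the final states. The plain tanh cells and random weights below are illustrative stand-ins; the joint distribution over segmentations and labels is not shown.

```python
# Hedged sketch: embedding a contiguous input segment with a bidirectional recurrence.
import numpy as np

rng = np.random.default_rng(6)
d_in, hidden = 6, 10
Wf_x, Wf_h = rng.normal(0, 0.2, (hidden, d_in)), rng.normal(0, 0.2, (hidden, hidden))
Wb_x, Wb_h = rng.normal(0, 0.2, (hidden, d_in)), rng.normal(0, 0.2, (hidden, hidden))

def segment_embedding(tokens):
    """tokens: (length, d_in) contiguous slice of the input sequence."""
    hf = np.zeros(hidden)
    for x in tokens:                 # forward pass over the segment
        hf = np.tanh(Wf_x @ x + Wf_h @ hf)
    hb = np.zeros(hidden)
    for x in tokens[::-1]:           # backward pass over the segment
        hb = np.tanh(Wb_x @ x + Wb_h @ hb)
    return np.concatenate([hf, hb])  # "segment embedding" used to score labels

sequence = rng.normal(size=(12, d_in))
print(segment_embedding(sequence[3:7]).shape)   # embedding of tokens 3..6
```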
Neural networks can be classified into recurrent and nonrecurrent categories. Nonrecurrent (feedforward) networks have no feedback elements; the output is calculated directly from the input through feedforward connections. In recurrent networks the output depends not only on the current input to the network, but also on the current or previous outputs or states of the network. For this reason, ...
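The distinction drawn in this snippet can be seen in two small update functions: a feedforward step maps the current input straight to an output, while a recurrent step also carries over the previous state. Weights and sizes below are arbitrary illustrations.

```python
# Illustrative contrast between a feedforward step and a recurrent step.
import numpy as np

rng = np.random.default_rng(7)
W_ff = rng.normal(size=(2, 3))
W_in, W_rec = rng.normal(size=(2, 3)), rng.normal(size=(2, 2))

def feedforward(x):
    return np.tanh(W_ff @ x)                 # output from the current input only

def recurrent(x, h_prev):
    return np.tanh(W_in @ x + W_rec @ h_prev)  # output also depends on the past state

x1, x2 = rng.normal(size=3), rng.normal(size=3)
h = recurrent(x1, np.zeros(2))               # earlier input leaves a trace in h
print("feedforward:", feedforward(x2))
print("recurrent  :", recurrent(x2, h))      # would differ if x1 had been different
```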