Search results for: recurrent neural network rnn

Number of results: 942,872

2007
Ieroham S. Baruch, Carlos-Roman Mariaca-Gaspar, Israel Cruz-Vega, Josefina Barrera-Cortés

This paper proposes the use of a Recurrent Neural Network (RNN) for modeling a hydrocarbon degradation process carried out in a biopile system. The proposed RNN model represents a Kalman-like filter and has seven inputs, five outputs, and twelve neurons in the hidden layer, with global and local feedback. The learning algorithm is a modified version of dynamic backpropagation. The ob...
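
The abstract fixes the topology (seven inputs, twelve hidden neurons, five outputs, with local and global feedback). Below is a minimal NumPy sketch of one forward step, assuming "local" feedback means each hidden neuron feeds back its own previous activation and "global" feedback means the previous output re-enters the hidden layer; the paper's exact wiring may differ.

import numpy as np

n_in, n_hid, n_out = 7, 12, 5                # topology from the abstract
rng = np.random.default_rng(0)
W_in  = rng.normal(0, 0.1, (n_hid, n_in))    # input -> hidden
w_loc = rng.normal(0, 0.1, n_hid)            # local feedback: per-neuron self-recurrence
W_glo = rng.normal(0, 0.1, (n_hid, n_out))   # global feedback: previous output -> hidden
W_out = rng.normal(0, 0.1, (n_out, n_hid))   # hidden -> output

def rnn_step(x, h_prev, y_prev):
    # local feedback is elementwise: each neuron sees only its own past state
    h = np.tanh(W_in @ x + w_loc * h_prev + W_glo @ y_prev)
    y = W_out @ h
    return h, y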

2017
Yu-Lun Hsieh, Yung-Chun Chang, Nai-Wen Chang, Wen-Lian Hsu

Accurate identification of protein-protein interactions (PPI) helps biomedical researchers quickly capture crucial information in the literature. This work proposes a recurrent neural network (RNN) model to identify PPIs. Experiments on the two largest public benchmark datasets, AIMed and BioInfer, demonstrate that the RNN outperforms state-of-the-art methods with relative improvements of 10% and 18%, r...

Journal: CoRR, 2015
Shiliang Zhang, Hui Jiang, Si Wei, Li-Rong Dai

We introduce a new structure for memory neural networks, called feedforward sequential memory networks (FSMN), which can learn long-term dependencies without using recurrent feedback. The proposed FSMN is a standard feedforward neural network equipped with learnable sequential memory blocks in the hidden layers. In this work, we have applied FSMN to several language modeling (LM) tasks. Experime...
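
The FSMN idea as stated, a learnable memory block replacing recurrent feedback, amounts to a tapped-delay line over past hidden activations. A minimal NumPy sketch follows, assuming scalar taps and memory order N; the function name and the zero-padding at the sequence start are illustrative choices.

import numpy as np

def fsmn_memory(H, a):
    # H: (T, d) hidden activations of a layer over a sequence
    # a: (N+1,) learnable memory coefficients (taps)
    # returns M with M[t] = sum_{i=0..N} a[i] * H[t-i], zero-padded at the start
    T, d = H.shape
    N = len(a) - 1
    H_pad = np.vstack([np.zeros((N, d)), H])
    return np.stack([a @ H_pad[t:t + N + 1][::-1] for t in range(T)])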

2018
Yuanhang Su, Yuzhong Huang, C.-C. Jay Kuo

In this work, we investigate the memory capability of recurrent neural networks (RNNs), where this capability is defined as a function that maps an element in a sequence to the current output. We first analyze the system function of a recurrent neural network (RNN) cell, and provide analytical results for three RNNs. They are the simple recurrent neural network (SRN), the long short-term memory...
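
The memory function the abstract describes, how an input k steps back influences the current state, is easiest to see in the linear case, where the contribution of x_{t-k} to h_t is exactly W^k U. A minimal NumPy sketch under that linearity assumption (the paper itself analyzes the nonlinear SRN, LSTM, and related cells):

import numpy as np

# Linear SRN: h_t = W h_{t-1} + U x_t, so x_{t-k} reaches h_t through W^k U.
rng = np.random.default_rng(1)
W = rng.normal(size=(8, 8))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # spectral radius 0.9 -> fading memory
U = rng.normal(size=(8, 3))

for k in (0, 5, 10, 20):
    print(k, np.linalg.norm(np.linalg.matrix_power(W, k) @ U))  # decays with k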

2017
Jose Sotelo, Soroush Mehri, Kundan Kumar, João Felipe Santos, Kyle Kastner, Aaron Courville, Yoshua Bengio

We present Char2Wav, an end-to-end model for speech synthesis. Char2Wav has two components: a reader and a neural vocoder. The reader is an encoder-decoder model with attention. The encoder is a bidirectional recurrent neural network that accepts text or phonemes as inputs, while the decoder is a recurrent neural network (RNN) with attention that produces vocoder acoustic features. Neural vocode...
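
The reader's encoder is described as a bidirectional recurrent network. Below is a minimal NumPy sketch of a bidirectional pass with plain tanh cells, only to illustrate concatenating forward and backward states; Char2Wav itself uses more elaborate recurrent units with an attention mechanism on top.

import numpy as np

def birnn_encode(X, Wf, Uf, Wb, Ub):
    # X: (T, d_in) input sequence; returns (T, 2*d_h) fwd/bwd states concatenated
    d_h = Wf.shape[0]
    hf, hb = np.zeros(d_h), np.zeros(d_h)
    fwd, bwd = [], []
    for x in X:                              # left-to-right pass
        hf = np.tanh(Wf @ hf + Uf @ x)
        fwd.append(hf)
    for x in X[::-1]:                        # right-to-left pass
        hb = np.tanh(Wb @ hb + Ub @ x)
        bwd.append(hb)
    return np.hstack([np.array(fwd), np.array(bwd)[::-1]])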

2018
Shuai Li, Wanqing Li, Chris Cook, Ce Zhu, Yanbo Gao

Recurrent neural networks (RNNs) have been widely used for processing sequential data. However, RNNs are notoriously difficult to train due to the well-known vanishing and exploding gradient problems, and they struggle to learn long-term patterns. Long short-term memory (LSTM) and gated recurrent unit (GRU) cells were developed to address these problems, but the use of the hyperbolic tangent and sigmoid activation fun...
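
For reference, the gating the abstract refers to can be seen in a standard GRU cell, where sigmoid gates interpolate between the old state and a tanh candidate. A minimal NumPy sketch with illustrative weight names:

import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    z = sigmoid(Wz @ x + Uz @ h)             # update gate
    r = sigmoid(Wr @ x + Ur @ h)             # reset gate
    h_cand = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
    return (1 - z) * h + z * h_cand          # gated interpolation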

2016
Bingning Wang, Kang Liu, Jun Zhao

Attention-based recurrent neural networks have shown advantages in representing natural language sentences (Hermann et al., 2015; Rocktäschel et al., 2015; Tan et al., 2015). Based on recurrent neural networks (RNN), external attention information is added to the hidden representations to obtain an attentive sentence representation. Despite the improvement over non-attentive models, the attention mech...
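
In its simplest form, the attentive sentence representation described above is a softmax-weighted sum of the RNN's hidden states, scored against an external query vector. A minimal NumPy sketch assuming dot-product scoring; the cited papers use learned scoring functions instead.

import numpy as np

def attentive_repr(H, q):
    # H: (T, d) RNN hidden states; q: (d,) external attention query
    scores = H @ q
    w = np.exp(scores - scores.max())
    w /= w.sum()                             # softmax attention weights
    return w @ H                             # attention-weighted representation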

Journal: CoRR, 2017
Wen Zhang, Jiawei Hu, Yang Feng, Qun Liu

Although sequence-to-sequence neural machine translation (NMT) models have achieved state-of-the-art performance in recent years, it is a widely held concern that recurrent neural network (RNN) units struggle to capture long-distance state information, which means an RNN can hardly extract features with long-term dependencies as the sequence grows longer. Similarly, convolution...

2003
R. Çağlar, E. Ayaz, S. Şeker, E. Türkcan

This paper presents electric power monitoring based on Artificial Neural Networks (ANN) for nuclear power plants. Recurrent Neural Networks (RNN) and a feed-forward neural network are selected for plant modeling and anomaly detection because of their high capability for modeling dynamic behaviors. Two types of Recurrent Neural Networks (RNN) are used. The first one is the Elman type of ...
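
The Elman architecture named in the abstract feeds a copy of the previous hidden state back in through context units. A minimal NumPy sketch of one step; the weight names and the use of the output as a plant-model prediction are illustrative.

import numpy as np

def elman_step(x, context, W_x, W_c, W_y):
    h = np.tanh(W_x @ x + W_c @ context)     # hidden layer sees input + context units
    y = W_y @ h                              # e.g., predicted plant signal
    return y, h                              # h becomes the next step's context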

2016
Yuan Gao, Dorota Glowacka

This paper explores the possibility of using multiplicative gates to build two recurrent neural network structures. These two structures, called the Deep Simple Gated Unit (DSGU) and the Simple Gated Unit (SGU), are designed for learning long-term dependencies. Compared to the traditional Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), both structures require fewer parameters and le...
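
The truncated abstract does not give the SGU or DSGU equations, but the shared mechanism, a multiplicative gate, is an elementwise product of a sigmoid gate with a candidate activation. A minimal NumPy sketch of that generic mechanism only, not the paper's exact units:

import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def multiplicative_gate(x, h, Wg, Ug, Wc, Uc):
    g = sigmoid(Wg @ x + Ug @ h)             # gate in (0, 1)
    c = np.tanh(Wc @ x + Uc @ h)             # candidate activation
    return g * c                             # elementwise (multiplicative) gating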
