Search results for: recurrent network

Number of results: 785363

Journal: Neurocomputing, 2004
Jinwen Ma

We investigate the capacity of a type of discrete-time recurrent neural network, called the time-delay recurrent neural network, for storing spatio-temporal sequences. By introducing the order of a spatio-temporal sequence, a match law between a time-delay recurrent neural network and a spatio-temporal sequence has been established. It has been proved that the full-order time-delay recurrent neura...
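The abstract names a time-delay recurrent network, in which the next state depends on several delayed state values rather than only the previous one. The paper's exact formulation is not shown above, so the following is only a minimal sketch of an order-d update with scalar states and made-up weights:

```python
import math

def step(x_t, delayed_states, weights, w_in):
    # Time-delay recurrent update: the new state depends on the current
    # input and the last d delayed states (order d = len(delayed_states)).
    s = w_in * x_t + sum(w * h for w, h in zip(weights, delayed_states))
    return math.tanh(s)

# Order-2 example with a two-slot delay line (illustrative values only).
history = [0.0, 0.0]     # h_{t-1}, h_{t-2}
weights = [0.5, -0.3]    # hypothetical delay weights
for x in [1.0, 0.2, -0.5]:
    h = step(x, history, weights, w_in=1.0)
    history = [h] + history[:-1]  # shift the delay line
```

The delay line is what gives the network its "order": a full-order network, in the abstract's sense, would condition on as many delayed states as the sequence requires.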

1999
Wai Sum Tang, Jun Wang, Yangsheng Xu

A recurrent neural network is applied to minimize the infinity-norm of joint torques in redundant manipulators. The recurrent neural network explicitly minimizes the maximum component of the joint torques in magnitude while keeping the relation between the joint torque and the end-effector acceleration satisfied. The end-effector accelerations are given to the recurrent neural network as its inp...

2013
Raja Das

In this paper, a recurrent neural network for solving linear programming problems is presented that is simpler, more intuitive, and faster converging. To achieve optimality in both accuracy and computational effort, an algorithm is presented. We also investigate the MATLAB Simulink modeling and simulative verification of such a recurrent neural network. Modeling and simulative results subst...

2010
Wudai Liao, Jiangfeng Wang, Junyan Wang

A recurrent neural network is presented for solving systems of quadratic programming problems with equality constraints involving complex-valued coefficients. The proposed recurrent neural network is asymptotically stable and able to generate optimal solutions to quadratic programs with equality constraints. An op-amp-based analogue circuit realization of the recurrent neural network is describe...

2007
Peter Ford Dominey, Franck Ramus

Human infants are sensitive at birth to the contrasting rhythms, or prosodic structures, of languages, which can serve to bootstrap the acquisition of grammatical structure. We present a novel recurrent network architecture that simulates this sensitivity to different temporal structures. Recurrent connections in the network are non-modifiable, while forward connections from the recurrent network to t...
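The architecture above keeps its recurrent connections fixed and trains only the forward (readout) connections, a reservoir-style setup. A minimal sketch under that assumption follows; the reservoir size, weight ranges, and input sequence are all arbitrary illustrative choices, not values from the paper:

```python
import math
import random

random.seed(0)
N = 20  # reservoir size (arbitrary)

# Non-modifiable recurrent weights: fixed random values, never trained.
W = [[random.uniform(-0.5, 0.5) for _ in range(N)] for _ in range(N)]
w_in = [random.uniform(-1.0, 1.0) for _ in range(N)]

def run_reservoir(inputs):
    """Drive the fixed recurrent network with an input sequence and collect
    its states; only a readout trained on these states would be modifiable."""
    h = [0.0] * N
    states = []
    for x in inputs:
        h = [math.tanh(sum(W[i][j] * h[j] for j in range(N)) + w_in[i] * x)
             for i in range(N)]
        states.append(h)
    return states

states = run_reservoir([1.0, 0.0, -1.0])
```

Because the recurrent part is untrained, temporal structure in the input is encoded purely by the reservoir's dynamics, which is what lets such a network discriminate rhythmic patterns without modifying its recurrent weights.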

Journal: International Journal of Neural Systems, 2001
Gürsel Serpen, Amol Patwardhan, Jeff Geib

A trainable recurrent neural network, the Simultaneous Recurrent Neural Network, is proposed to address the scaling problem faced by neural network algorithms in static optimization. The proposed algorithm derives its computational power to address the scaling problem from its ability to "learn", in contrast to existing recurrent neural algorithms, which are not trainable. Recurrent backpropagation ...

Journal: CoRR, 2018
Yuhuang Hu, Adrian Huber, Jithendar Anumula, Shih-Chii Liu

Plain recurrent networks suffer greatly from the vanishing gradient problem, while Gated Neural Networks (GNNs) such as Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU) deliver promising results in many sequence learning tasks through sophisticated network designs. This paper shows how we can address this problem in a plain recurrent network by analyzing the gating mechanisms in GNNs...
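The gating mechanisms mentioned above follow the well-known GRU update equations: an update gate interpolates between the old state and a candidate state, and a reset gate controls how much history feeds the candidate. A minimal scalar-state sketch, with illustrative parameter values that are not from the paper:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gru_step(x, h, p):
    """One GRU update for a scalar state; p holds the gate parameters."""
    z = sigmoid(p['wz'] * x + p['uz'] * h)                 # update gate
    r = sigmoid(p['wr'] * x + p['ur'] * h)                 # reset gate
    h_tilde = math.tanh(p['wh'] * x + p['uh'] * (r * h))   # candidate state
    return (1 - z) * h + z * h_tilde                       # gated interpolation

# Hypothetical parameters, run over a short input sequence.
p = dict(wz=1.0, uz=0.5, wr=1.0, ur=0.5, wh=1.0, uh=0.5)
h = 0.0
for x in [1.0, -1.0, 0.5]:
    h = gru_step(x, h, p)
```

The `(1 - z) * h` term is what mitigates vanishing gradients: when z is near 0, the state (and its gradient) passes through the step almost unchanged, unlike a plain recurrent update.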

2016
Richard Kelley

We introduce the recurrent tensor network, a recurrent neural network model that replaces the matrix-vector multiplications of a standard recurrent neural network with bilinear tensor products. We compare its performance against networks that employ long short-term memory (LSTM) units. Our results demonstrate that using tensors to capture the interactions between network inputs and history c...
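As a rough illustration of the bilinear tensor product described above: each state unit k is computed as a bilinear form x^T T[k] h of the input and the previous state, so input and history interact multiplicatively rather than additively. The paper's exact parameterization is not shown in the snippet, and all dimensions and weights below are made-up values:

```python
import math

def bilinear_step(x, h, T, b):
    """Recurrent tensor update: unit k of the new state is the bilinear
    form x^T T[k] h plus a bias, squashed through tanh."""
    new_h = []
    for k in range(len(T)):
        s = b[k]
        for i in range(len(x)):
            for j in range(len(h)):
                s += x[i] * T[k][i][j] * h[j]
        new_h.append(math.tanh(s))
    return new_h

# Tiny example: 2-d input, 2-d state, hypothetical tensor slices.
T = [[[0.1, 0.2], [0.3, -0.1]],
     [[-0.2, 0.4], [0.0, 0.1]]]
b = [0.0, 0.1]
h = [0.5, -0.5]
for x in [[1.0, 0.0], [0.0, 1.0]]:
    h = bilinear_step(x, h, T, b)
```

Note that a standard recurrent update would instead add separate linear terms W x + U h; the tensor form subsumes every pairwise input-history product, at the cost of cubically many parameters.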

2001
J. A. Pérez-Ortiz

This paper studies the use of recurrent neural networks for predicting the next symbol in a sequence. The focus is on online prediction, a task much harder than classical offline grammatical inference with neural networks. Different kinds of sequence sources are considered: finite-state machines, chaotic sources, and texts in human language. Two algorithms are used for network training: real...

2011
Rohit R. Deshpande, Athar Ravish Khan

In this paper, multi-step-ahead prediction of the monthly sunspot time series is carried out. This series is highly chaotic in nature [7]. The paper compares the performance of the proposed Jordan-Elman Neural Network with a TLRNN (time-lag recurrent neural network) and an RNN (recurrent neural network) for multi-step-ahead (1, 6, 12, 18, 24) predictions. It is seen that the proposed neural network mode...
