Similar resources
Temporal-Kernel Recurrent Neural Networks
A Recurrent Neural Network (RNN) is a powerful connectionist model that can be applied to many challenging sequential problems, including problems that naturally arise in language and speech. However, RNNs are extremely hard to train on problems that have long-term dependencies, where it is necessary to remember events for many timesteps before using them to make a prediction. In this paper we ...
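For orientation, here is a minimal sketch of the standard RNN recurrence this abstract builds on; it is not the paper's Temporal-Kernel variant (whose details are truncated above), and all names and dimensions are illustrative assumptions.

```python
# Plain tanh RNN forward pass -- the baseline model the abstract refers to,
# NOT the paper's Temporal-Kernel architecture. Sizes are illustrative.
import numpy as np

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Run a vanilla RNN over a sequence.

    xs   : (T, input_dim) array, one row per timestep
    W_xh : (hidden_dim, input_dim) input-to-hidden weights
    W_hh : (hidden_dim, hidden_dim) hidden-to-hidden weights
    b_h  : (hidden_dim,) hidden bias
    """
    h = np.zeros(W_hh.shape[0])
    states = []
    for x in xs:
        # Information from early timesteps must survive repeated passes
        # through W_hh -- the source of the long-term dependency problem
        # the abstract mentions.
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return np.stack(states)

# Illustrative usage with random data.
rng = np.random.default_rng(0)
T, d_in, d_h = 50, 8, 16
hs = rnn_forward(rng.normal(size=(T, d_in)),
                 0.1 * rng.normal(size=(d_h, d_in)),
                 0.1 * rng.normal(size=(d_h, d_h)),
                 np.zeros(d_h))
print(hs.shape)  # (50, 16)
```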
Recurrent Neural Networks for Temporal Sequences Recognition
Time is at the center of many human tasks. Talking, listening, reading, and writing are examples of time-related tasks. Integrating the notion of time into neural networks is therefore very important for dealing with such tasks. This report presents various tasks that are based on temporal pattern processing and the different neural network architectures simulated to tackle the problem. We examine the mai...
Solving Linear Semi-Infinite Programming Problems Using Recurrent Neural Networks
Linear semi-infinite programming problems are an important class of optimization problems that deal with infinitely many constraints. In this paper, to solve this problem, we combine a discretization method and a neural network method. By a simple discretization of the infinite constraints, we convert the linear semi-infinite programming problem into a linear programming problem. Then, we use...
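A minimal sketch of the discretization step described above: an infinite family of constraints indexed by t in [0, 1] is sampled on a grid, yielding an ordinary LP. For illustration the finite LP is solved with scipy's linprog rather than the paper's recurrent-network solver, and the problem data below are invented for the example.

```python
# Discretize a linear semi-infinite program into a finite LP.
import numpy as np
from scipy.optimize import linprog

# Objective: minimize x0 + x1.
c = np.array([1.0, 1.0])

# Infinite constraint family: t*x0 + (1 - t)*x1 >= t**2 for all t in [0, 1],
# rewritten as -t*x0 - (1 - t)*x1 <= -t**2 to match linprog's A_ub @ x <= b_ub.
ts = np.linspace(0.0, 1.0, 101)          # simple uniform discretization
A_ub = np.column_stack([-ts, -(1.0 - ts)])
b_ub = -ts**2

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 2)
print(res.x)  # approximate solution of the semi-infinite program, ~[1, 0]
```

A finer grid tightens the approximation; the endpoint constraints (t = 0 and t = 1) already bound this particular toy problem.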
Temporal Activity Detection in Untrimmed Videos with Recurrent Neural Networks
This work proposes a simple pipeline to classify and temporally localize activities in untrimmed videos. Our system uses features from a 3D Convolutional Neural Network (C3D) as input to train a recurrent neural network (RNN) that learns to classify video clips of 16 frames. After clip prediction, we post-process the output of the RNN to assign a single activity label to each video, and deter...
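A minimal sketch of the clip-classification stage described above: an RNN consumes one pre-extracted C3D feature vector per 16-frame clip and emits per-clip class scores. The LSTM cell, layer sizes, and number of classes are illustrative assumptions, not the paper's exact configuration.

```python
# RNN over pre-extracted per-clip C3D features (sizes are assumptions).
import torch
import torch.nn as nn

class ClipRNN(nn.Module):
    def __init__(self, feat_dim=4096, hidden_dim=512, num_classes=201):
        super().__init__()
        self.rnn = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, clip_feats):
        # clip_feats: (batch, num_clips, feat_dim), one C3D vector per clip.
        out, _ = self.rnn(clip_feats)
        return self.classifier(out)    # per-clip class scores

model = ClipRNN()
feats = torch.randn(2, 10, 4096)       # 2 videos, 10 clips each
scores = model(feats)                  # (2, 10, 201)
# Toy post-processing: majority vote over clips gives one label per video;
# the paper's actual post-processing is truncated in the abstract above.
video_label = scores.argmax(-1).mode(dim=1).values
print(scores.shape, video_label)
```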
Nonmonotone BFGS-trained recurrent neural networks for temporal sequence processing
In this paper we propose a nonmonotone approach to recurrent neural network training for temporal sequence processing applications. This approach allows learning performance to deteriorate in some iterations; nevertheless, the network's performance improves over time. A self-scaling BFGS is equipped with an adaptive nonmonotone technique that employs approximations of the Lipschitz constant ...
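A minimal sketch of a nonmonotone acceptance test in the spirit of the approach described above: a step may increase the loss as long as it improves on the maximum of the last M recorded losses (a standard Grippo-Lampariello-Lucidi-style criterion). The paper's self-scaling BFGS update and Lipschitz-based adaptation are not reproduced here; all values are illustrative.

```python
# Nonmonotone line-search acceptance: compare against the worst of the
# last M losses instead of the most recent one.
from collections import deque

def nonmonotone_accept(history, f_new, alpha, directional_deriv, sigma=1e-4):
    """Accept if f_new <= max(recent losses) + sigma * alpha * g^T d."""
    reference = max(history)            # worst of the last M losses
    return f_new <= reference + sigma * alpha * directional_deriv

# Illustrative usage inside a training loop with a memory of M = 5 losses.
history = deque([1.00, 0.95, 0.97, 0.93, 0.96], maxlen=5)
f_new, alpha, g_dot_d = 0.99, 0.5, -0.2   # loss rose versus 0.96 ...
if nonmonotone_accept(history, f_new, alpha, g_dot_d):
    history.append(f_new)               # ... but the step is still accepted,
print(list(history))                    # letting this iteration deteriorate
```

Allowing such temporary deterioration is exactly what the abstract means by learning performance worsening in some iterations while improving over time.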
Journal
Journal title: Neural Networks
Year: 2010
ISSN: 0893-6080
DOI: 10.1016/j.neunet.2009.10.009