Search results for: recurrent input

Number of results: 345825

2015
Hanjian Lai, Shengtao Xiao, Zhen Cui, Yan Pan, Chunyan Xu, Shuicheng Yan

We propose a novel end-to-end deep architecture for face landmark detection, based on a deep convolutional and deconvolutional network followed by carefully designed recurrent network structures. The pipeline of this architecture consists of three parts. Through the first part, we encode an input face image to resolution-preserved deconvolutional feature maps via a deep network with stacked con...

1997
Zheng Zeng, Rodney M. Goodman, Padhraic Smyth

In this paper we describe a new discrete recurrent neural network model with discrete external stacks for learning context-free grammars (or pushdown automata). Conventional analog recurrent networks tend to have stability problems when presented with input strings which are longer than those used for training: the network's internal states become merged and the string cannot be correctly pars...
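
The motivation for a discrete external stack can be seen in a hand-coded sketch. The following recognizer for the context-free language a^n b^n is illustrative only: the paper couples a *learned* recurrent controller to the stack, whereas here the control logic is written by hand to show why explicit stack memory generalizes to input strings of arbitrary length, which analog recurrent state does not.

```python
def recognize_anbn(s):
    """Deterministic pushdown recognizer for the language a^n b^n (n >= 1).

    Illustrative sketch, not the paper's architecture: the discrete stack
    counts exactly, so strings far longer than any "training" length are
    still parsed correctly.
    """
    stack = []
    phase = "push"  # first read a's, then b's
    for ch in s:
        if ch == "a":
            if phase != "push":
                return False  # an 'a' after a 'b' is illegal
            stack.append(ch)
        elif ch == "b":
            phase = "pop"
            if not stack:
                return False  # more b's than a's
            stack.pop()
        else:
            return False  # symbol outside the alphabet
    # Accept only if at least one 'b' was read and the stack is empty.
    return phase == "pop" and not stack
```

Because acceptance depends only on the stack being empty, the same code handles `"a" * 500 + "b" * 500` as easily as `"ab"`, the kind of length generalization the paper targets.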

2006
Stefano Zappacosta, Stefano Nolfi, Gianluca Baldassarre

This paper presents a set of techniques that allow generating a class of testbeds that can be used to test recurrent neural networks’ capabilities of integrating information in time. In particular, the testbeds allow evaluating the capability of such models, and possibly other architectures and algorithms, of (a) categorizing different time series, (b) anticipating future signal levels on the b...

Journal: :Bio Systems 2014
Ya Guo, Jinglu Tan

Pulse is often used to excite biological systems. The inputs such as irrigation, therapy, and treatments to biological systems are also equivalent to pulses. This makes the biological system behave as switched models under the function of the input. To reduce difficulty in model parameter estimation, the system could be represented as a switched linear model under the pulse excitation. In this ...
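
The idea of a system switching between linear dynamics under a pulse input can be sketched numerically. The matrices `A_on`, `A_off` and the forcing vector `b` below are illustrative placeholders, not values from the paper; the sketch only shows the switched-linear structure: one linear model while the pulse is on, another while it is off.

```python
import numpy as np

def simulate_switched(x0, t_end, dt, pulse):
    """Euler simulation of a switched linear model under pulse excitation.

    The dynamics switch with the pulse input u(t) in {0, 1}:
        x' = A_on  @ x + b   while the pulse is on,
        x' = A_off @ x       while it is off.
    A_on, A_off, b are hypothetical example values, not from the paper.
    """
    A_on = np.array([[-0.5, 0.0], [1.0, -1.0]])
    A_off = np.array([[-1.0, 0.0], [0.5, -2.0]])
    b = np.array([1.0, 0.0])
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for k in range(int(t_end / dt)):
        t = k * dt
        dx = A_on @ x + b if pulse(t) else A_off @ x
        x = x + dt * dx  # forward-Euler step
        traj.append(x.copy())
    return np.array(traj)

# A single rectangular pulse, on for t in [0, 1):
traj = simulate_switched(x0=[0.0, 0.0], t_end=5.0, dt=0.01,
                         pulse=lambda t: t < 1.0)
```

The state is driven upward while the pulse is on and relaxes under the second (stable) linear model afterwards; parameter estimation in the paper's setting amounts to identifying the two linear pieces from such trajectories.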

Journal: :Proceedings of the conference. Association for Computational Linguistics. Meeting 2017
Hong Yu, Tsendsuren Munkhdalai

Recurrent neural networks (RNNs) process input text sequentially and model the conditional transition between word tokens. In contrast, the advantages of recursive networks include that they explicitly model the compositionality and the recursive structure of natural language. However, the current recursive architecture is limited by its dependence on syntactic tree. In this paper, we introduce...

2007
Jun-nosuke Teramae, Tomoki Fukai

We study the reliability of spike output in a general class of pulse-coupled oscillators receiving a fluctuating input. Showing that this problem is equivalent to noise-induced synchronization between identical networks of oscillators, we employ the phase reduction method to analytically derive the average Lyapunov exponent of the synchronized state. We show that a transition occurs between rel...

1997
Jennifer M. Rodd

Simple recurrent networks were trained with sequences of phonemes from a corpus of Turkish words. The network's task was to predict the next phoneme. The aim of the study was to look at the representations developed within the hidden layer of the network in order to investigate the extent to which such networks can learn phonological regularities from such input. It was found that in the differ...

2015
Koichiro Yoshino, Takuya Hiraoka, Graham Neubig, Satoshi Nakamura

We propose a dialogue state tracker based on long short-term memory (LSTM) neural networks. LSTM is an extension of a recurrent neural network (RNN), which can better consider distant dependencies in sequential input. We construct an LSTM network that receives utterances of dialogue participants as input, and outputs the dialogue state of the current utterance. The input utterances are separated...
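
The gating mechanism that lets an LSTM carry distant dialogue context can be shown in a minimal NumPy sketch of one standard LSTM cell step. This is illustrative of the cell equations only, not the paper's tracker; the random weights and the toy input sequence are assumptions for the demonstration.

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, b):
    """One step of a standard LSTM cell (illustrative sketch).

    W maps the concatenated [x, h_prev] to the four stacked gate
    pre-activations. The cell state c is updated additively through the
    forget/input gates, which is what lets information persist across
    long input sequences.
    """
    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    z = W @ np.concatenate([x, h_prev]) + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input/forget/output gates
    g = np.tanh(g)                                # candidate cell update
    c = f * c_prev + i * g                        # new cell state
    h = o * np.tanh(c)                            # new hidden state
    return h, c

# Run a toy sequence of five input vectors through the cell:
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in + n_hid))
b = np.zeros(4 * n_hid)
h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):
    h, c = lstm_step(x, h, c, W, b)
```

In a tracker like the one described, each input vector would encode an utterance and the final hidden state `h` would feed a classifier over dialogue states.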

2000
Jochen J. Steil

We present local conditions for input-output stability of recurrent neural networks with time-varying parameters introduced for instance by noise or on-line adaptation. The conditions guarantee that a network implements a proper mapping from time-varying input to time-varying output functions using a local equilibrium as point of operation. We show how to calculate necessary bounds on the allow...

1999
Frances S. Chance, Sacha B. Nelson

Although retinal input relayed through the lateral geniculate nucleus (LGN) of the thalamus clearly drives responses in the primary visual cortex (V1), LGN afferents account for only a small fraction of the synapses onto V1 neurons [1-7]. The primary source of synaptic input to neurons in primary visual cortex, at least in terms of numbers, is excitatory input from other nearby cortical neuron...
