Search results for: recurrent input
Number of results: 345,825
The complex neural dynamics produced by the recurrent architecture of neocortical circuits are critical to the cortex's computational power. However, the synaptic learning rules underlying the creation of stable propagation and reproducible neural trajectories within recurrent networks are not understood. Here, we examined synaptic learning rules with the goal of creating recurrent networks in which...
Two different partially recurrent neural networks structured as Multi Layer Perceptrons (MLP) are investigated for time domain identification of a nonlinear structure. One partially recurrent neural network has feedback of a displacement component from the output layer to a tapped-delay-line (TDL) input layer. The other recurrent neural network is based on the Innovation State Space model (I...
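The first architecture described above can be illustrated with a minimal sketch: an MLP whose input is a tapped delay line of the external excitation together with fed-back copies of its own predicted displacement. The layer sizes, number of taps, and the NumPy setting are illustrative assumptions, not the configuration used in the paper.

```python
# Hedged sketch of a partially recurrent MLP with output feedback into a
# tapped-delay-line (TDL) input layer. All sizes are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_delays = 3          # taps for the external force input
n_feedback = 2        # taps for the fed-back displacement prediction
n_hidden = 8

n_in = n_delays + n_feedback
W1 = rng.normal(scale=0.3, size=(n_hidden, n_in))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.3, size=(1, n_hidden))
b2 = np.zeros(1)

def predict_sequence(force):
    """One-step-ahead displacement prediction driven by the force signal."""
    u_taps = np.zeros(n_delays)      # delayed external inputs
    y_taps = np.zeros(n_feedback)    # delayed (fed-back) predictions
    preds = []
    for u_t in force:
        # shift the tapped delay lines and insert the newest samples
        u_taps = np.roll(u_taps, 1); u_taps[0] = u_t
        x = np.concatenate([u_taps, y_taps])
        h = np.tanh(W1 @ x + b1)
        y = (W2 @ h + b2)[0]
        y_taps = np.roll(y_taps, 1); y_taps[0] = y   # feedback path
        preds.append(y)
    return np.array(preds)

print(predict_sequence(np.sin(np.linspace(0, 6, 50)))[:5])
```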
To model time-varying nonlinear temporal dynamics in sequential data, a recurrent network capable of varying and adjusting the recurrence depth between input intervals is examined. The recurrence depth is extended by several intermediate hidden state units, and the weight parameters involved in determining these units are dynamically calculated. The motivation behind the paper lies in overcomin...
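A minimal sketch of the variable recurrence depth idea, assuming (purely for illustration) that the intermediate transition weights are generated from the current hidden state by a linear hypernetwork-style map; the sizes, depth schedule, and weight-generation scheme are assumptions, not the paper's exact mechanism.

```python
# Hedged sketch: between two input steps the hidden state is refined by `depth`
# intermediate transitions whose weights are produced dynamically from the state.
import numpy as np

rng = np.random.default_rng(1)
n_h, n_x = 6, 4

W_in = rng.normal(scale=0.3, size=(n_h, n_x))
W_h  = rng.normal(scale=0.3, size=(n_h, n_h))
# generator mapping the state to the weights of one intermediate transition
W_gen = rng.normal(scale=0.1, size=(n_h * n_h, n_h))

def step(h, x, depth):
    h = np.tanh(W_in @ x + W_h @ h)           # ordinary recurrent update
    for _ in range(depth):                    # extra recurrence between inputs
        W_dyn = (W_gen @ h).reshape(n_h, n_h) # dynamically calculated weights
        h = np.tanh(W_dyn @ h)
    return h

h = np.zeros(n_h)
for t, x in enumerate(rng.normal(size=(5, n_x))):
    h = step(h, x, depth=1 + t % 3)           # recurrence depth varies per interval
print(h)
```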
We introduce multiplicative LSTM (mLSTM), a novel recurrent neural network architecture for sequence modelling that combines the long short-term memory (LSTM) and multiplicative recurrent neural network architectures. mLSTM is characterised by its ability to have different recurrent transition functions for each possible input, which we argue makes it more expressive for autoregressive density ...
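The mLSTM combination can be sketched as follows: an input-dependent intermediate state m_t = (W_mx x_t) * (W_mh h_{t-1}) (elementwise product) from the multiplicative RNN replaces h_{t-1} inside the standard LSTM gates. The shapes, initialisation, and omission of biases below are illustrative simplifications.

```python
# Hedged sketch of a single mLSTM step: the multiplicative intermediate state m
# takes the place of the previous hidden state in the LSTM gate equations.
import numpy as np

rng = np.random.default_rng(2)
n_x, n_h = 5, 8
def mat(r, c): return rng.normal(scale=0.3, size=(r, c))

W_mx, W_mh = mat(n_h, n_x), mat(n_h, n_h)   # multiplicative branch
W_ix, W_im = mat(n_h, n_x), mat(n_h, n_h)   # input gate
W_fx, W_fm = mat(n_h, n_x), mat(n_h, n_h)   # forget gate
W_ox, W_om = mat(n_h, n_x), mat(n_h, n_h)   # output gate
W_cx, W_cm = mat(n_h, n_x), mat(n_h, n_h)   # candidate cell

def sigmoid(z): return 1.0 / (1.0 + np.exp(-z))

def mlstm_step(x, h, c):
    m = (W_mx @ x) * (W_mh @ h)             # input-dependent recurrent transition
    i = sigmoid(W_ix @ x + W_im @ m)
    f = sigmoid(W_fx @ x + W_fm @ m)
    o = sigmoid(W_ox @ x + W_om @ m)
    c_hat = np.tanh(W_cx @ x + W_cm @ m)
    c = f * c + i * c_hat
    h = o * np.tanh(c)
    return h, c

h, c = np.zeros(n_h), np.zeros(n_h)
for x in rng.normal(size=(4, n_x)):
    h, c = mlstm_step(x, h, c)
print(h)
```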
Asymptotic behavior of a recurrent neural network changes qualitatively at certain points in the parameter space, which are known as "bifurcation points". At bifurcation points, the output of a network can change discontinuously with the change of parameters and therefore convergence of gradient descent algorithms is not guaranteed. Furthermore, learning equations used for error gradient estima...
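A toy numerical illustration (not taken from the paper) of this qualitative change: a one-unit recurrent network x_{t+1} = tanh(w x_t + b) undergoes a pitchfork bifurcation at w = 1 when b = 0, so the asymptotic output jumps from zero to a nonzero branch as the weight crosses the bifurcation point, and gradients estimated from the asymptotic state are ill-behaved there.

```python
# Hedged illustration: asymptotic output of x_{t+1} = tanh(w * x_t + b) as the
# recurrent weight w crosses the bifurcation point at w = 1 (with b = 0).
import numpy as np

def asymptotic_output(w, b=0.0, x0=0.1, n_steps=2000):
    x = x0
    for _ in range(n_steps):          # iterate to (near) the attractor
        x = np.tanh(w * x + b)
    return x

for w in (0.9, 0.99, 1.01, 1.1):
    print(f"w = {w:4.2f}  ->  limit x = {asymptotic_output(w):+.4f}")
# The limit stays near 0 for w < 1 and snaps to a nonzero branch for w > 1,
# so a gradient with respect to w estimated from the asymptotic state is
# ill-behaved around the bifurcation point.
```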
Cortical amplification is a mechanism for modifying the selectivity of neurons through recurrent interactions. Although conventionally used to enhance selectivity, cortical amplification can also broaden it, de-tuning neurons. Here we show that the spatial phase invariance of complex cell responses in primary visual cortex can arise using recurrent amplification of feedforward input. Neurons in...
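A hedged toy model (an assumption for illustration, not the authors' circuit) of recurrent amplification in a linear rate network: the steady state of dr/dt = -r + Wr + f is r = (I - W)^{-1} f, so recurrent weights that mix units with different phase preferences amplify the phase-averaged component of a phase-tuned feedforward drive and thereby broaden (de-tune) selectivity.

```python
# Hedged toy example: uniform recurrent excitation amplifies the phase-invariant
# component of a phase-tuned feedforward input, reducing phase modulation.
import numpy as np

n = 16                                            # units with preferred phases
phases = np.linspace(0, 2 * np.pi, n, endpoint=False)

# feedforward drive tuned to one stimulus phase
stim_phase = 0.7
f = 1.0 + np.cos(phases - stim_phase)

# uniform recurrent excitation mixes across phase preferences (gain < 1 for stability)
W = np.full((n, n), 0.9 / n)

r = np.linalg.solve(np.eye(n) - W, f)             # steady state of dr/dt = -r + W r + f

tuning = lambda v: (v.max() - v.min()) / v.mean() # crude phase-modulation index
print("feedforward modulation:", round(tuning(f), 3))
print("recurrent modulation:  ", round(tuning(r), 3))
# The modulation index drops after recurrent amplification: the common
# (phase-invariant) component is amplified ~10x while the tuned component is not.
```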
Given the growing interest of medical science in natural resources, especially medicinal plants, and given the long-standing use of the plant Zataria multiflora (family Labiatae) in traditional medicine and its well-known antiseptic effect, this study investigates the effect of this plant on the common condition recurrent aphthous stomatitis (RAS). The results indicate a significant difference in treatment outcome, and the Zataria (Shirazi thyme) mouthwash shortened the course of the disease to 4-7 ...
Through evolution, animals have acquired central nervous systems (CNSs), which are extremely efficient information processing devices that improve an animal's adaptability to various environments. It has been proposed that the process of information maximization (infomax), which maximizes the information transmission from the input to the output of a feedforward network, may provide an explana...
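One canonical formalisation of the infomax principle mentioned above is the Bell-Sejnowski rule, which adapts the weights of a feedforward network y = g(Wx) with logistic g so as to maximise information transmission from input to output. The sketch below applies the natural-gradient form of that rule to a toy source-separation problem; the data, learning rate, and batch scheme are illustrative assumptions.

```python
# Hedged sketch of natural-gradient infomax learning in a feedforward network
# y = sigmoid(W x), applied to a toy two-source unmixing problem.
import numpy as np

rng = np.random.default_rng(3)
n = 2
X = rng.laplace(size=(n, 5000))            # super-Gaussian sources
A = rng.normal(size=(n, n))
M = A @ X                                   # mixed inputs to the network

W = np.eye(n)
lr = 0.01
for epoch in range(200):
    for t in range(0, M.shape[1], 100):     # small batches
        x = M[:, t:t + 100]
        u = W @ x
        y = 1.0 / (1.0 + np.exp(-u))
        # natural-gradient infomax update: dW ∝ (I + (1 - 2y) u^T) W
        W += lr * (np.eye(n) + (1 - 2 * y) @ u.T / x.shape[1]) @ W

print("W @ A (each row should be dominated by one entry if unmixing succeeded):")
print(W @ A)
```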
In other words, the recurrent network states are not IP states in and of themselves; they require an appropriate context which can elevate them to IP-hood. This context consists of a set of input sequences and an observation method for generating outputs. While the recurrent network's state dynamics may be described as an IFS, any IP interpretation will involve a holistic combination of the set of possible ...