Search results for: recurrent neural network
Number of results: 942,527
Patients turn to Online Health Communities not only for information on specific conditions but also for emotional support. Previous research has indicated that the progression of emotional status can be studied through the linguistic patterns of an individual's posts. We analyze a real-world dataset from the Mental Health section of healthboards.com. Estimated from the word usage in their posts...
Some researchers claim that language acquisition is critically dependent on experiencing linguistic input in order of increasing complexity. We set out to test this hypothesis using a simple recurrent neural network (SRN) trained to predict word sequences in CHILDES, a 5-million-word corpus of speech directed to children. First, we demonstrated that age-ordered CHILDES exhibits a gradual increa...
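The SRN mentioned above is an Elman-style network in which the previous hidden state is fed back as context at each step. A minimal illustrative sketch of its forward pass for next-word prediction, using toy sizes and random untrained weights (not the CHILDES setup itself):

```python
import numpy as np

rng = np.random.default_rng(0)

class SRN:
    """Simple recurrent network (Elman): h_t = tanh(W_xh x_t + W_hh h_{t-1})."""

    def __init__(self, vocab_size, hidden_size):
        self.W_xh = rng.normal(0, 0.1, (hidden_size, vocab_size))
        self.W_hh = rng.normal(0, 0.1, (hidden_size, hidden_size))
        self.W_hy = rng.normal(0, 0.1, (vocab_size, hidden_size))
        self.hidden_size = hidden_size

    def forward(self, token_ids):
        """Return a next-token probability distribution at each position."""
        h = np.zeros(self.hidden_size)
        probs = []
        for t in token_ids:
            x = np.zeros(self.W_xh.shape[1])
            x[t] = 1.0                                  # one-hot input word
            h = np.tanh(self.W_xh @ x + self.W_hh @ h)  # context feedback
            logits = self.W_hy @ h
            e = np.exp(logits - logits.max())
            probs.append(e / e.sum())                   # softmax over vocab
        return np.array(probs)

net = SRN(vocab_size=5, hidden_size=8)
p = net.forward([0, 1, 2, 3])
```

Training would then minimise cross-entropy between each predicted distribution and the actual next word; the complexity-ordering hypothesis concerns only the order in which such training sequences are presented.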
The workflow of extracting features from images using convolutional neural networks (CNN) and generating captions with recurrent neural networks (RNN) has become a de facto standard for the image captioning task. However, since CNN features were originally designed for the classification task, they are mostly concerned with the main conspicuous element of the image, and often fail to correctly convey info...
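The encoder-decoder workflow described here can be sketched as follows: a CNN feature vector seeds the RNN's hidden state, which then emits caption tokens greedily. All sizes and weights below are illustrative placeholders (real systems use a trained CNN producing e.g. 2048-d features), not any particular paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

FEAT, HID, VOCAB = 6, 8, 5                  # toy dimensions

W_init = rng.normal(0, 0.1, (HID, FEAT))    # CNN feature -> initial hidden state
W_emb  = rng.normal(0, 0.1, (HID, VOCAB))   # token embedding table (column per token)
W_hh   = rng.normal(0, 0.1, (HID, HID))     # recurrent weights
W_out  = rng.normal(0, 0.1, (VOCAB, HID))   # hidden state -> vocabulary logits

def generate_caption(cnn_feature, max_len=4, bos=0):
    """Greedy decoding: the image feature conditions the RNN, which emits tokens."""
    h = np.tanh(W_init @ cnn_feature)
    token, caption = bos, []
    for _ in range(max_len):
        x = W_emb[:, token]                 # embed the previous token
        h = np.tanh(x + W_hh @ h)           # recurrent update
        token = int(np.argmax(W_out @ h))   # greedy next-token choice
        caption.append(token)
    return caption

cap = generate_caption(rng.normal(size=FEAT))
```

The weakness the abstract points at lives in `W_init`'s input: a single classification-oriented feature vector summarises only the dominant object, which is why attention over spatial feature maps is a common refinement.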
As neural network algorithms show high performance in many applications, their efficient inference on mobile and embedded systems is of great interest. When a single-stream recurrent neural network (RNN) is executed for a personal user on an embedded system, it demands a large number of DRAM accesses because the network size is usually much bigger than the cache size and the weights of an RNN a...
Convolutional neural networks (CNN) have recently achieved remarkable performance in a wide range of applications. In this research, we equip a convolutional sequence-to-sequence (seq2seq) model with an efficient graph linearization technique for abstract meaning representation parsing. Our linearization method is better than the prior method at signaling the turns of graph traversal. Additionally...
We discuss advantages and disadvantages of temporally continuous neural networks in contrast to clocked ones, and continue with some “tricks of the trade” of continuous time and recurrent neural networks.
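A temporally continuous (continuous-time) RNN replaces the clocked update with an ODE on the hidden state, commonly written as tau * dh/dt = -h + tanh(W h + b). A minimal sketch with forward-Euler integration; the sizes, time constant, and random weights are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

N, TAU, DT = 4, 1.0, 0.01         # units, time constant, Euler step size

W = rng.normal(0, 1.0, (N, N))    # recurrent weights
b = rng.normal(0, 0.1, N)         # biases

def step(h, dt=DT):
    """One Euler step of  tau * dh/dt = -h + tanh(W h + b)."""
    dh = (-h + np.tanh(W @ h + b)) / TAU
    return h + dt * dh

h = np.zeros(N)
for _ in range(1000):             # integrate 10 time units forward
    h = step(h)
```

One "trick of the trade" is visible in the step size: with dt <= tau each update is a convex combination of the old state and a bounded drive, so the state cannot blow up, whereas too large a dt makes the integration unstable.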
In this short paper we summarise our work [5] on training first-order recurrent neural networks (RNNs) on the abc language prediction task. We highlight the differences between incremental and non-incremental learning with respect to success rate, generalisation performance, and characteristics of hidden-unit activation.
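The abc prediction task is typically formulated over strings of the form a^n b^n c^n, with the network predicting each next symbol; a minimal data generator under that assumption (the exact formulation in [5] may differ):

```python
def abc_string(n):
    """A string of the context-sensitive language a^n b^n c^n."""
    return "a" * n + "b" * n + "c" * n

def prediction_pairs(s, end="#"):
    """(current symbol, next symbol) training pairs, with an end marker."""
    s = s + end
    return list(zip(s[:-1], s[1:]))

pairs = prediction_pairs(abc_string(2))   # from "aabbcc"
```

Incremental learning in this setting means presenting short strings (small n) first and lengthening them over training, versus drawing all lengths from the start.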
[Chart: number of search results per year]