Bridging LSTM Architecture and the Neural Dynamics during Reading
Authors
Abstract
Recently, the long short-term memory neural network (LSTM) has attracted wide interest due to its success in many tasks. The LSTM architecture consists of a memory cell and three gates, which loosely resembles neuronal networks in the brain. However, evidence for the cognitive plausibility of the LSTM architecture and its working mechanism is still lacking. In this paper, we study the cognitive plausibility of LSTM by aligning its internal architecture with the brain activity observed via fMRI while subjects read a story. Experimental results show that the artificial memory vector in LSTM can accurately predict the observed sequential brain activities, indicating a correlation between the LSTM architecture and the cognitive process of story reading.
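For readers unfamiliar with the architecture the abstract refers to, the standard LSTM cell with its three gates (input, forget, output) and memory cell can be sketched as follows. This is a minimal illustrative implementation with randomly initialized weights, not the authors' model; the variable names (`W`, `U`, `b`, `lstm_step`) are our own.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    W, U, b hold the stacked parameters for the four linear
    transforms (input gate, forget gate, output gate, candidate).
    """
    d = c_prev.shape[0]
    z = W @ x + U @ h_prev + b          # shape (4*d,)
    i = sigmoid(z[0*d:1*d])             # input gate
    f = sigmoid(z[1*d:2*d])             # forget gate
    o = sigmoid(z[2*d:3*d])             # output gate
    g = np.tanh(z[3*d:4*d])             # candidate cell update
    c = f * c_prev + i * g              # memory cell ("memory vector")
    h = o * np.tanh(c)                  # hidden state
    return h, c

# Tiny usage example with random weights.
rng = np.random.default_rng(0)
d_in, d_hid = 3, 4
W = rng.standard_normal((4 * d_hid, d_in))
U = rng.standard_normal((4 * d_hid, d_hid))
b = np.zeros(4 * d_hid)
h, c = lstm_step(rng.standard_normal(d_in),
                 np.zeros(d_hid), np.zeros(d_hid), W, U, b)
```

The memory cell `c` is the per-time-step vector the paper aligns with fMRI activity; because the forget gate mixes new input into the running cell state, it carries a compressed history of the word sequence read so far.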
Similar resources
CS224n Assignment 4: Machine Comprehension with Exploration on Attention Mechanism
The goal of this paper is to perform a prediction task on the SQuAD reading-comprehension dataset: given a context paragraph and a question, the model outputs an answer. To do this, a model is built combining a bidirectional LSTM with an attention-flow mechanism. The basic architecture and setup details of the model are introduced, along with a summary of performance and error analys...
The Optimization of Forecasting ATMs Cash Demand of Iran Banking Network Using LSTM Deep Recursive Neural Network
One of the problems of the banking system is cash-demand forecasting for ATMs (Automated Teller Machines). Accurate prediction can improve the profitability of the banking system and satisfy its customers, and such accuracy is the main goal of this research. If an ATM faces a shortage of cash, it will face the decline of bank...
A generalized LSTM-like training algorithm for second-order recurrent neural networks
The long short-term memory (LSTM) is a second-order recurrent neural network architecture that excels at storing sequential short-term memories and retrieving them many time steps later. LSTM's original training algorithm provides the important properties of spatial and temporal locality, which are missing from other training approaches, at the cost of limiting its applicability to a small set ...
On speaker adaptation of long short-term memory recurrent neural networks
Long Short-Term Memory (LSTM) is a recurrent neural network (RNN) architecture specializing in modeling long-range temporal dynamics. On acoustic modeling tasks, LSTM-RNNs have shown better performance than DNNs and conventional RNNs. In this paper, we conduct an extensive study on speaker adaptation of LSTM-RNNs. Speaker adaptation helps to reduce the mismatch between acoustic models and testi...
Transforming the LSTM training algorithm for efficient FPGA-based adaptive control of nonlinear dynamic systems
In the absence of high-fidelity analytical descriptions of a given system to be modeled, designers of model-driven control systems rely on empirical nonlinear modeling methods such as neural networks. The particularly challenging task of modeling time-varying nonlinear dynamic systems requires the modeling technique to capture complex internal system dynamics, dependent on long input histor...