High capacity recurrent associative memories

Authors

  • Neil Davey
  • Stephen P. Hunt
  • Rod Adams
Abstract

Various algorithms for constructing weight matrices for Hopfield-type associative memories are reviewed, including ones with much higher capacity than the basic model. These alternative algorithms either iteratively approximate the projection weight matrix or use simple perceptron learning. An experimental investigation of the performance of networks trained by these algorithms is presented, including measurements of capacity, training time and their ability to correct corrupted versions of the training patterns.
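The two families of training rules named in the abstract can be made concrete with a short sketch. The NumPy code below is an illustration only, not the authors' exact procedure: it trains a Hopfield-type weight matrix with a simple per-unit perceptron rule (the learning rate, margin and epoch limit are arbitrary choices), then uses asynchronous updates to try to correct a corrupted version of a stored pattern, the kind of error correction measured in the experiments.

```python
import numpy as np

def train_perceptron_rule(patterns, epochs=100, lr=0.1, margin=1.0, seed=0):
    """Perceptron-style training of a Hopfield-type weight matrix.

    patterns: (P, N) array of bipolar (+1/-1) training patterns.
    Each unit learns its incoming weights so that its local field has the
    same sign as the target bit (with a margin), which makes every training
    pattern a stable state if the rule converges.  Illustrative sketch only.
    """
    P, N = patterns.shape
    rng = np.random.default_rng(seed)
    W = np.zeros((N, N))
    for _ in range(epochs):
        updated = False
        for p in rng.permutation(P):
            x = patterns[p]
            h = W @ x                      # local fields of all units
            wrong = (h * x) < margin       # units with weak or wrongly signed fields
            if wrong.any():
                # perceptron-style correction on the rows of the wrong units
                W[wrong] += lr * np.outer(x[wrong], x)
                np.fill_diagonal(W, 0.0)   # keep zero self-connections
                updated = True
        if not updated:                    # all patterns stable with margin
            break
    return W

def recall(W, x, max_steps=50, seed=0):
    """Asynchronous recall: update units in random order until a fixed point."""
    rng = np.random.default_rng(seed)
    x = x.copy()
    N = len(x)
    for _ in range(max_steps):
        changed = False
        for i in rng.permutation(N):
            h = W[i] @ x
            s = x[i] if h == 0 else (1 if h > 0 else -1)
            if s != x[i]:
                x[i] = s
                changed = True
        if not changed:
            break
    return x

# Usage: store random patterns, then try to correct a corrupted probe.
N, P = 100, 30                             # loading 0.3, above the Hebbian limit
rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(P, N))
W = train_perceptron_rule(patterns)
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1                          # corrupt 10% of the bits
out = recall(W, probe)
print("recovered bits:", int((out == patterns[0]).sum()), "of", N)
```

Varying the number of stored patterns, counting training epochs, and counting recovered bits give rough analogues of the capacity, training-time and pattern-correction measurements described in the abstract.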

Similar articles

Fast Weight Long Short-term Memory

Associative memory using fast weights is a short-term memory mechanism that substantially improves the memory capacity and time scale of recurrent neural networks (RNNs). As recent studies introduced fast weights only to regular RNNs, it is unknown whether fast weight memory is beneficial to gated RNNs. In this work, we report a significant synergy between long short-term memory (LSTM) networks...
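For context, the fast-weight mechanism referred to here is typically an outer-product associative store that decays over time and is read back while the new hidden state is computed. The sketch below is a generic illustration in plain NumPy, with arbitrary sizes and constants, not the fast-weight LSTM of this paper:

```python
import numpy as np

def fast_weight_step(h_prev, x, A, W_x, W_h, lam=0.95, eta=0.5, inner_steps=1):
    """One recurrent step with an outer-product fast-weight memory.

    A is the fast-weight matrix acting as a short-term associative store:
    it decays with rate `lam` and is written with the Hebbian outer product
    of the previous hidden state (A <- lam*A + eta*h h^T).  The preliminary
    hidden state is then refined by reading from A.  Generic sketch only.
    """
    pre = W_x @ x + W_h @ h_prev                    # slow-weight contribution
    h = np.tanh(pre)
    A = lam * A + eta * np.outer(h_prev, h_prev)    # write into fast weights
    for _ in range(inner_steps):                    # settling read from fast weights
        h = np.tanh(pre + A @ h)
    return h, A

# Usage with random slow weights; shapes chosen only for illustration.
rng = np.random.default_rng(0)
d_in, d_h = 8, 16
W_x = rng.normal(scale=0.1, size=(d_h, d_in))
W_h = rng.normal(scale=0.1, size=(d_h, d_h))
h = np.zeros(d_h)
A = np.zeros((d_h, d_h))
for t in range(5):
    x = rng.normal(size=d_in)
    h, A = fast_weight_step(h, x, A, W_x, W_h)
print(h.shape, A.shape)
```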

Bipolar spectral associative memories

Nonlinear spectral associative memories are proposed as quantized frequency domain formulations of nonlinear, recurrent associative memories in which volatile network attractors are instantiated by attractor waves. In contrast to conventional associative memories, attractors encoded in the frequency domain by convolution may be viewed as volatile online inputs, rather than nonvolatile, off-line...

A neural network with a single recurrent unit for associative memories based on linear optimization

Recently, some continuous-time recurrent neural networks have been proposed for associative memories based on optimizing linear or quadratic programming problems. In this paper, a simple and efficient neural network with a single recurrent unit is proposed for realizing associative memories. Compared with the existing neural networks for associative memories, the main advantage of the proposed ...

Re-encoding of associations by recurrent plasticity increases memory capacity

Recurrent networks have been proposed as a model of associative memory. In such models, memory items are stored in the strength of connections between neurons. These modifiable connections or synapses constitute a shared resource among all stored memories, limiting the capacity of the network. Synaptic plasticity at different time scales can play an important role in optimizing the representati...
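The capacity limit mentioned here can be made concrete with the classical outer-product (Hebbian) storage rule, under which random bipolar patterns typically remain stable only up to roughly 0.14N patterns in a network of N units. The sketch below is an illustration with arbitrary sizes, not the plasticity model of this paper:

```python
import numpy as np

def hebbian_store(patterns):
    """Outer-product (Hebbian) rule: W = (1/N) sum_p x_p x_p^T, zero diagonal."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def is_stable(W, x):
    """A pattern is a fixed point if every unit's field agrees in sign with its state."""
    return bool(np.all(np.sign(W @ x) == x))

# Count how many stored random patterns remain fixed points as loading grows.
rng = np.random.default_rng(0)
N = 200
for P in (10, 30, 60):                     # loadings 0.05, 0.15, 0.30
    patterns = rng.choice([-1, 1], size=(P, N))
    W = hebbian_store(patterns)
    stable = sum(is_stable(W, x) for x in patterns)
    print(f"P={P}: {stable}/{P} patterns stable")
```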

Neural Network Applications F1.4 Associative memory

This section considers how neural networks can be used as associative memory devices. It first describes what an associative memory is, and then moves on to describe associative memories based on feedforward neural networks and associative memories based on recurrent networks. The section also describes associative memory systems based on cognitive models. It also highlights the ability of neur...

Journal title:
  • Neurocomputing

Volume: 62, Issue: -

Pages: -

Publication date: 2004