High Performance Associative Memories and Structured Weight Dilution
Abstract
The consequences of two techniques for symmetrically diluting the weights of the standard Hopfield architecture associative memory model, trained using a non-Hebbian learning rule, are examined. This paper reports experimental investigations into the effect of dilution on factors such as pattern stability and attractor performance. It is concluded that these networks maintain a reasonable level of performance at fairly high dilution rates.

Key-Words: Associative Memory, Hopfield Networks, Weight Dilution, Capacity, Basins of Attraction, Perceptron Learning.
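The core operation studied here is symmetric weight dilution: removing a proportion of connections while keeping the weight matrix symmetric, which for a Hopfield network means zeroing w_ij and w_ji together. A minimal sketch, assuming a NumPy weight matrix (the function name and the random selection of pairs are illustrative, not the paper's specific dilution criteria):

```python
import numpy as np

def dilute_symmetric(W, fraction, rng=None):
    """Zero out `fraction` of the off-diagonal weight pairs of a symmetric
    weight matrix W.  Each removal zeroes w_ij AND w_ji, so the diluted
    matrix remains symmetric.  Illustrative sketch only."""
    rng = rng or np.random.default_rng(0)
    n = W.shape[0]
    # Each upper-triangular index pair (i < j) stands for one symmetric pair.
    i_idx, j_idx = np.triu_indices(n, k=1)
    kill = rng.choice(len(i_idx), size=int(fraction * len(i_idx)), replace=False)
    D = W.copy()
    D[i_idx[kill], j_idx[kill]] = 0.0
    D[j_idx[kill], i_idx[kill]] = 0.0
    return D

# Build a small symmetric weight matrix with zero diagonal
# (Hopfield networks have no self-connections).
W = np.random.default_rng(1).normal(size=(10, 10))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

D = dilute_symmetric(W, 0.5)   # remove half of the connection pairs
assert np.allclose(D, D.T)     # symmetry is preserved
```

Symmetry matters because it guarantees the existence of an energy function for the network dynamics; asymmetric removal (dropping w_ij but keeping w_ji) forfeits that guarantee.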
Similar Papers
Non-Random Weight Dilution in High Performance Associative Memories
S. P. Turvey, S. P. Hunt, N. Davey, R. J. Frank. Department of Computer Science, University of Hertfordshire, College Lane, Hatfield, AL10 9AB, United Kingdom. Email: {s.p.turvey, s.p.hunt, n.davey, r.j.frank}@herts.ac.uk. Abstract: The consequences of two techniques for symmetrically diluting the weights of the standard Hopfield architecture associative memory model, trained using a non-Hebbian learning ...
High Performance Associative Memory and Weight Dilution
The consequences of diluting the weights of the standard Hopfield architecture associative memory model, trained using perceptron-like learning rules, are examined. A proportion of the weights of the network are removed; this can be done in a symmetric or an asymmetric way, and both methods are investigated. This paper reports experimental investigations into the consequences of dilution in terms o...
High capacity recurrent associative memories
Various algorithms for constructing weight matrices for Hopfield-type associative memories are reviewed, including ones with much higher capacity than the basic model. These alternative algorithms either iteratively approximate the projection weight matrix or use simple perceptron learning. An experimental investigation of the performance of networks trained by these algorithms is presented, in...
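The perceptron-learning alternative to the Hebbian rule mentioned above can be sketched as follows: each unit's incoming weights are trained so that, for every stored pattern, the unit's local field aligns with its target bit by some margin. This is a hedged, illustrative sketch (function name, margin, learning rate, and epoch budget are assumptions, and the resulting matrix is not generally symmetric):

```python
import numpy as np

def train_perceptron_style(patterns, epochs=200, lr=0.1, theta=1.0):
    """Train each unit's incoming weights with a perceptron-style rule so
    that every stored pattern becomes a stable state of the network.
    patterns: array of shape (P, N) with entries +/-1."""
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        stable = True
        for xi in patterns:
            h = W @ xi                 # local field at every unit
            # A unit is correct if its field matches its target bit
            # with margin at least theta.
            wrong = (h * xi) < theta
            if wrong.any():
                stable = False
                # Perceptron update on the incoming weights of wrong units.
                W[wrong] += lr * np.outer(xi[wrong], xi)
        np.fill_diagonal(W, 0.0)       # no self-connections
        if stable:
            break
    return W

rng = np.random.default_rng(0)
pats = rng.choice([-1.0, 1.0], size=(4, 20))   # 4 patterns, 20 units
W = train_perceptron_style(pats)
# Every stored pattern should now be a fixed point of the update rule.
assert all(np.array_equal(np.sign(W @ p), p) for p in pats)
```

Because each row is an independent perceptron problem, this rule can reach loadings near the perceptron capacity limit, well above the ~0.14N of the basic Hebbian model.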
Noise-Enhanced Associative Memories
Recent advances in associative memory design through structured pattern sets and graph-based inference algorithms allow reliable learning and recall of exponential numbers of patterns. Though these designs correct external errors in recall, they assume neurons compute noiselessly, in contrast to highly variable neurons in hippocampus and olfactory cortex. Here we consider associative memories w...
Fast Weight Long Short-term Memory
Associative memory using fast weights is a short-term memory mechanism that substantially improves the memory capacity and time scale of recurrent neural networks (RNNs). As recent studies introduced fast weights only to regular RNNs, it is unknown whether fast weight memory is beneficial to gated RNNs. In this work, we report a significant synergy between long short-term memory (LSTM) networks...