On the Maximum Storage Capacity of the Hopfield Model
Abstract
Recurrent neural networks (RNNs) have long been of great interest for their capacity to store memories. Over the years, several works have been devoted to determining the maximum storage capacity of RNNs, especially for the Hopfield network, the most popular kind of RNN. By analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfield neural network, it has been shown in the literature that the retrieval errors diverge when the number of stored memory patterns (P) exceeds a fraction (≈ 14%) of the network size N. In this paper, we study the storage performance of a generalized Hopfield model in which the diagonal elements of the connection matrix are allowed to be nonzero, and we investigate this model at finite N. We give an analytical expression for the number of retrieval errors and show that, as the number of stored patterns is increased beyond a certain threshold, the errors start to decrease and reach values below unity for P ≫ N. We demonstrate that the strongest trade-off between efficiency and effectiveness depends on the number of patterns (P) stored in the network by appropriately fixing the connection weights. When P ≫ N and the diagonal elements of the connection matrix are not forced to be zero, the optimal storage capacity is obtained with a number of stored memories much larger than previously reported. This theory paves the way for the design of RNNs with high storage capacity that can retrieve the desired pattern without distortion.
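To make the setting concrete, here is a minimal NumPy sketch of Hebbian storage with and without the zero-diagonal constraint. It is not the paper's analytical machinery: the weight normalization, the synchronous update schedule, and the error measure (bits flipped when a stored pattern itself is used as the cue) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns, zero_diagonal=True):
    """Hebbian connection matrix J = (1/N) * sum_mu xi^mu (xi^mu)^T.

    The classical Hopfield prescription forces J_ii = 0; the generalized
    model discussed in the abstract keeps the diagonal terms.
    """
    N = patterns.shape[1]
    J = patterns.T @ patterns / N
    if zero_diagonal:
        np.fill_diagonal(J, 0.0)
    return J

def retrieve(J, state, sweeps=20):
    """Synchronous sign-updates until a fixed point (or sweep limit)."""
    for _ in range(sweeps):
        new = np.sign(J @ state)
        new[new == 0] = 1          # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

def retrieval_errors(N=100, P=20, zero_diagonal=True):
    """Mean number of flipped bits when each stored pattern is the cue."""
    patterns = rng.choice([-1, 1], size=(P, N))
    J = hebbian_weights(patterns, zero_diagonal)
    errors = [np.sum(retrieve(J, p.copy()) != p) for p in patterns]
    return np.mean(errors)

for P in (5, 10, 14, 20, 40):
    e0 = retrieval_errors(P=P, zero_diagonal=True)
    e1 = retrieval_errors(P=P, zero_diagonal=False)
    print(f"P={P:3d}  errors (J_ii = 0): {e0:6.2f}   errors (J_ii kept): {e1:6.2f}")
```

The self-coupling terms J_ii = P/N act to stabilize whatever state the network is in, which is the mechanism by which the stored patterns become fixed points again at large P in the generalized model.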
Similar Papers
Computing the Capacity of the Hopfield Neural Network and a Practical Method for Increasing Memory Capacity
The capacity of the Hopfield model has been considered an important parameter in using this model. In this paper, the Hopfield neural network is modeled as a Shannon channel and an upper bound on its capacity is found. To achieve maximum memory, we focus on the training algorithm of the network and prove that the capacity of the network is bounded by the maximum number of the ortho...
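Orthogonality of the stored patterns is the classical route to such capacity bounds: under the Hebbian rule, mutually orthogonal ±1 patterns are exact fixed points of the sign dynamics. A minimal sketch of that fact (our own illustration, not the paper's proof; the Sylvester–Hadamard construction is just a convenient way to get orthogonal ±1 patterns):

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of two)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

N = 64
patterns = hadamard(N)[:16]          # 16 mutually orthogonal +/-1 patterns
J = patterns.T @ patterns / N        # Hebbian weights
np.fill_diagonal(J, 0.0)

# Every orthogonal pattern is an exact fixed point of the sign dynamics:
for p in patterns:
    assert np.array_equal(np.sign(J @ p), p)
print("all", len(patterns), "orthogonal patterns retrieved with zero error")
```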
Storage Capacity of Kernel Associative Memories
This contribution discusses the thermodynamic phases and storage capacity of an extension of the Hopfield-Little model of associative memory via kernel functions. The analysis is presented for the case of polynomial and Gaussian kernels in a replica-symmetric ansatz. As a general result, we find for both kernels that the storage capacity increases considerably compared to the Hopfield-Little model.
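One common way to realize such a kernel extension is to pass the pattern overlaps through a nonlinearity before forming the local field; the linear choice K(m) = m recovers the standard Hopfield-Little field. The sketch below uses this construction under our own assumptions (the paper's exact ansatz may differ), with a degree-3 polynomial kernel as the example:

```python
import numpy as np

rng = np.random.default_rng(1)

def kernel_update(patterns, state, kernel):
    """One synchronous step of a kernelized Hopfield update:
    h_i = sum_mu xi_i^mu * K(m_mu), where m_mu = xi^mu . s / N.
    """
    overlaps = patterns @ state / patterns.shape[1]
    h = patterns.T @ kernel(overlaps)
    s = np.sign(h)
    s[s == 0] = 1
    return s

N, P = 200, 60                       # load well above the ~0.14*N linear capacity
patterns = rng.choice([-1, 1], size=(P, N))
cue = patterns[0].copy()
cue[:20] *= -1                       # corrupt 10% of the bits

linear = lambda m: m                 # plain Hopfield-Little field
cubic  = lambda m: m ** 3            # polynomial kernel of degree 3

for name, K in [("linear", linear), ("cubic", cubic)]:
    s = cue.copy()
    for _ in range(20):
        s = kernel_update(patterns, s, K)
    print(name, "retrieval errors:", np.sum(s != patterns[0]))
```

The polynomial kernel suppresses the crosstalk from non-condensed overlaps (which are O(N^{-1/2})) much more strongly than the signal, which is the intuition behind the increased capacity.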
The storage capacity of the Hopfield model and moderate deviation principles
This note relates the storage capacity of the Hopfield model of neural networks to the existence of a moderate deviation principle for the empirical correlation of the patterns. This moderate deviation principle is satisfied under a certain condition on the moment generating function of these correlations, which in turn can be verified in many cases by GHS- and FKG-type inequalities. Exa...
Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study
The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size. Beyond that threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural network...
On the Storage Capacity of an Abstract Cortical Model with Silent Hypercolumns
In this report we investigate the storage capacity of an abstract, generic attractor neural network model of the mammalian cortex. This model network has a diluted connection matrix and a fixed activity level that is independent of network size. We develop an analytical model of the storage capacity for this type of network when it is trained with both the Willshaw and Hopfield learning rule...
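For reference, here is a toy sketch of the Willshaw rule on sparse binary patterns with fixed activity (our simplified illustration under stated assumptions; the report's diluted cortical model is considerably richer):

```python
import numpy as np

rng = np.random.default_rng(2)

N, P, k = 256, 40, 12                # k active units per pattern (fixed activity)

# Sparse binary patterns with a fixed activity level independent of N.
patterns = np.zeros((P, N), dtype=int)
for mu in range(P):
    patterns[mu, rng.choice(N, size=k, replace=False)] = 1

# Willshaw learning: binary ("clipped" Hebbian) weights; self-connections
# are kept so a stored pattern drives each of its own units with sum k.
W = (patterns.T @ patterns > 0).astype(int)

# Retrieval: a unit fires iff its dendritic sum reaches the cue activity k.
cue = patterns[0]
recalled = (W @ cue >= k).astype(int)
print("spurious bits:", int(np.sum(recalled & (1 - patterns[0]))),
      "missed bits:", int(np.sum((1 - recalled) & patterns[0])))
```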