Pii: S0893-6080(99)00042-8
Abstract
This paper presents a theoretical analysis of the asymptotic memory capacity of the generalized Hopfield network. The perceptron learning scheme is proposed to store sample patterns as stable states of a generalized Hopfield network. We show that n − 1 and 2n are, respectively, a lower and an upper bound on the asymptotic memory capacity of a network of n neurons, which implies that the generalized Hopfield network can store a larger number of sample patterns than the Hopfield network. © 1999 Elsevier Science Ltd. All rights reserved.
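Since the stability condition decouples across neurons, the storage scheme described in the abstract can be illustrated by training each row of a (generally asymmetric) weight matrix with the ordinary perceptron rule: a pattern x is a stable state when sign(W x) = x holds componentwise. The sketch below is a minimal illustration under that reading, using NumPy, bipolar patterns, and illustrative names (store_patterns, max_epochs); it is not the paper's exact procedure.

```python
import numpy as np

def store_patterns(patterns, max_epochs=1000, lr=1.0):
    """Train an n x n weight matrix so that each pattern in `patterns`
    (shape m x n, entries in {-1, +1}) is a fixed point of x -> sign(W x).
    Each row of W is learned independently with the classical perceptron
    rule, so W is generally asymmetric (a "generalized" Hopfield network).
    Hypothetical illustration, not the paper's exact procedure."""
    patterns = np.asarray(patterns)
    m, n = patterns.shape
    W = np.zeros((n, n))
    for _ in range(max_epochs):
        stable = True
        for x in patterns:
            h = W @ x                        # local fields
            wrong = np.sign(h) != x          # neurons whose update would flip
            if wrong.any():
                stable = False
                # perceptron correction for every misclassified row
                W[wrong] += lr * np.outer(x[wrong], x)
        if stable:
            break
    return W

# usage: store 3 random bipolar patterns in a 16-neuron network
rng = np.random.default_rng(0)
P = rng.choice([-1, 1], size=(3, 16))
W = store_patterns(P)
assert all(np.array_equal(np.sign(W @ x), x) for x in P)
```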
Similar resources
Neural assemblies: technical issues, analysis, and modeling
Neurons often work together to compute and process information, and neural assemblies arise from synaptic interactions and neural circuits. One way to study neural assemblies is to simultaneously record from several or many neurons and study the statistical relations among their spike trains. From this analysis researchers can try to understand the nature of the assemblies, which can also lead ...
Pii: S0893-6080(99)00058-1
The aim of the paper is to investigate the application of control schemes based on “internal models” to the stabilization of the standing posture. The computational complexities of the control problems are analyzed, showing that muscle stiffness alone is insufficient to carry out the task. The paper also revisits the concept of the cerebellum as a Smith’s predictor. © 1999 Elsevier Science Ltd...
Pii: S0893-6080(99)00073-8
This paper presents a learning approach, i.e. negative correlation learning, for neural network ensembles. Unlike previous learning approaches for neural network ensembles, negative correlation learning attempts to train individual networks in an ensemble and combines them in the same learning process. In negative correlation learning, all the individual networks in the ensemble are trained sim...
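In negative correlation learning, the commonly cited form of the penalty for network i is p_i = (F_i − F̄) Σ_{j≠i} (F_j − F̄), where F̄ is the ensemble output; using Σ_{j≠i} (F_j − F̄) = −(F_i − F̄), the output-layer gradient for member i becomes (F_i − d) − λ(F_i − F̄). The sketch below computes only that gradient signal as an illustration; the function name nc_gradients and the value of λ are assumptions, and each member would still backpropagate this signal through its own weights.

```python
import numpy as np

def nc_gradients(outputs, target, lam=0.5):
    """Per-member output-layer gradients for negative correlation learning.
    `outputs` has shape (M, N): M ensemble members, N samples.
    Uses sum_{j != i}(F_j - Fbar) = -(F_i - Fbar), which gives
    dE_i/dF_i = (F_i - d) - lam * (F_i - Fbar).
    Illustrative sketch only; lam is a free penalty strength, not a value
    taken from the paper."""
    outputs = np.asarray(outputs, dtype=float)
    fbar = outputs.mean(axis=0)                  # ensemble (average) output
    return (outputs - target) - lam * (outputs - fbar)
```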
Pii: S0893-6080(99)00025-8
In a previous work Pollack showed that a particular type of heterogeneous processor network is Turing universal. Siegelmann and Sontag (1991) showed the universality of homogeneous networks of first-order neurons having piecewise-linear activation functions. Their result was generalized by Kilian and Siegelmann (1996) to include various sigmoidal activation functions. Here we focus on a type of...
متن کاملSelf-organized hierarchical structure in a plastic network of chaotic units
Formation of a layered structure is studied in a globally coupled map of chaotic units with a plastic coupling strength that changes depending on the states of units globally and an external input. In the parameter region characterized by weakly chaotic and desynchronized dynamics, units spontaneously form a hierarchical structure due to the influence of the input. This hierarchical structure i...
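As a rough, hypothetical illustration of a globally coupled map whose coupling strengths adapt to the units' states and an external input, the sketch below uses logistic maps and a made-up plasticity rule; the specific update rule, parameter names, and values are assumptions, not the model studied in the paper.

```python
import numpy as np

def simulate_gcm(n_units=50, steps=2000, a=3.97, eps0=0.1, eta=0.01,
                 input_drive=0.05, seed=0):
    """Globally coupled logistic maps with per-unit coupling strengths that
    slowly adapt to each unit's deviation from the mean field and to a
    common external input.  The plasticity rule here is a hypothetical
    stand-in, not the rule from the paper."""
    rng = np.random.default_rng(seed)
    x = rng.random(n_units)
    eps = np.full(n_units, eps0)
    f = lambda z: a * z * (1.0 - z)              # logistic map
    for _ in range(steps):
        fx = f(x)
        mean_field = fx.mean()
        x = (1.0 - eps) * fx + eps * mean_field  # globally coupled update
        # hypothetical plasticity: units staying close to the mean field
        # (relative to the input drive) strengthen their coupling
        eps += eta * (input_drive - np.abs(x - mean_field))
        eps = np.clip(eps, 0.0, 1.0)
    return x, eps

# usage: final states and adapted coupling strengths
states, couplings = simulate_gcm()
```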