Temporal Hidden Hopfield Models
Authors
Abstract
Many popular probabilistic models for temporal sequences assume simple hidden dynamics or low dimensionality of discrete variables. For higher-dimensional discrete hidden variables, recourse is often made to approximate mean field theories, which to date have been applied to models with only simple hidden unit dynamics. We consider a class of models in which the discrete hidden space is defined by parallel dynamics of densely connected, high-dimensional stochastic Hopfield networks. For these Hidden Hopfield Models (HHMs), mean field methods are derived for learning discrete and continuous temporal sequences. We discuss applications of HHMs to classification and reconstruction of non-stationary time series. We also demonstrate a few problems (learning of incomplete binary sequences and reconstruction of 3D occupancy graphs) where a distributed discrete hidden-space representation may be useful. We show that while these problems cannot be easily solved by other dynamic belief networks, they are efficiently addressed by HHMs.
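The abstract gives no equations, but the two ingredients it names, parallel (synchronous) dynamics of a stochastic Hopfield network and a deterministic mean-field approximation of those dynamics, can be sketched as follows. This is a minimal illustration, not the paper's actual learning algorithm: the network size, couplings, and biases are arbitrary assumptions, and the mean-field update shown is the standard naive magnetisation update.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8  # number of binary hidden units (illustrative size)

# Symmetric random couplings and small biases (arbitrary, not learned)
W = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
b = rng.normal(0.0, 0.1, n)

def parallel_glauber_step(s, W, b, rng):
    """One synchronous stochastic update: every unit i is simultaneously
    resampled to +1 with probability sigmoid(2 * local field)."""
    field = W @ s + b
    p_up = 1.0 / (1.0 + np.exp(-2.0 * field))
    return np.where(rng.random(len(s)) < p_up, 1.0, -1.0)

def mean_field_step(m, W, b):
    """Deterministic naive mean-field update of the magnetisations m = <s>,
    replacing the stochastic neighbours by their averages."""
    return np.tanh(W @ m + b)

# Run the stochastic chain alongside the mean-field trajectory
s = rng.choice([-1.0, 1.0], n)
m = np.zeros(n)
for _ in range(50):
    s = parallel_glauber_step(s, W, b, rng)
    m = mean_field_step(m, W, b)
```

The stochastic chain `s` stays on the hypercube {-1, +1}^n, while the mean-field state `m` lives in (-1, 1)^n; learning schemes of the kind the abstract describes work with quantities like `m` precisely because the 2^n-state discrete chain is intractable at high dimension.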
Similar Articles
Approximate Learning in Temporal Hidden Hopfield Models
Many popular probabilistic models for temporal sequences assume simple hidden dynamics or low-dimensionality of discrete variables. For higher dimensional discrete hidden variables, recourse is often made to approximate mean field theories, which to date have been applied to models with only simple hidden unit dynamics. We consider a class of models in which the discrete hidden space is defined...
Transient hidden chaotic attractors in a Hopfield neural system
In this letter we unveil the existence of transient hidden coexisting chaotic attractors in a simplified Hopfield neural network with three neurons. Keywords: Hopfield neural network; transient hidden chaotic attractor; limit cycle
Learning Symmetry Groups with Hidden Units: Beyond the Perceptron
Learning to recognize mirror, rotational and translational symmetries is a difficult problem for massively-parallel network models. These symmetries cannot be learned by first-order perceptrons or Hopfield networks, which have no means for incorporating additional adaptive units that are hidden from the input and output layers. We demonstrate that the Boltzmann learning algorithm is capable of ...
Networks of spiking neurons can emulate arbitrary Hopfield nets in temporal coding
A theoretical model for analogue computation in networks of spiking neurons with temporal coding is introduced and tested through simulations in GENESIS. It turns out that the use of multiple synapses yields very noise robust mechanisms for analogue computations via the timing of single spikes in networks of detailed compartmental neuron models. In this way, one arrives at a method for emulatin...
Complex-Valued Boltzmann Manifold
Today we can obtain massive amounts of information, and it is hard to deal with them without computers. Machine learning is effective for computers managing massive information. Machine learning uses various learning machine models, for instance decision trees, Bayesian networks, Support Vector Machines, Hidden Markov Models, normal mixture distributions, neural networks, and so on. Some of them are stochasti...