Learning to See Rotation and Dilation with a Hebb Rule
Authors
Abstract
Previous work (M.I. Sereno, 1989; cf. M.E. Sereno, 1987) showed that a feedforward network with area V1-like input-layer units and a Hebb rule can develop area MT-like second layer units that solve the aperture problem for pattern motion. The present study extends this earlier work to more complex motions. Saito et al. (1986) showed that neurons with large receptive fields in macaque visual area MST are sensitive to different senses of rotation and dilation, irrespective of the receptive field location of the movement singularity. A network with an MT-like second layer was trained and tested on combinations of rotating, dilating, and translating patterns. Third-layer units learn to detect specific senses of rotation or dilation in a position-independent fashion, despite having position-dependent direction selectivity within their receptive fields.
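Neither the abstract nor this page includes code, so the following Python sketch is only a hypothetical illustration of the kind of scheme described: flow fields for rotations, dilations, and translations with random singularity locations are encoded by half-rectified cosine direction-tuned units (a crude stand-in for the paper's V1/MT-like tuning curves), and a layer of units is trained with an Oja-normalized Hebb rule. It collapses the paper's layered hierarchy into a single Hebbian stage and is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Stimulus: a 2-D flow field on an N x N grid --------------------------
N = 8
ys, xs = np.mgrid[0:N, 0:N].astype(float)

def flow_field(kind, cx, cy):
    """Velocity (vx, vy) at each grid point for a rotation, dilation,
    or translation whose singularity sits at (cx, cy)."""
    dx, dy = xs - cx, ys - cy
    if kind == "rotation":      # counterclockwise rotation
        return -dy, dx
    if kind == "dilation":      # expansion away from the center
        return dx, dy
    # translation: one global direction (here: rightward)
    return np.ones_like(dx), np.zeros_like(dy)

# --- Input layer: direction-tuned units at each location ------------------
# Assumption: half-rectified cosine direction tuning; the paper's MT-like
# tuning curves also depend on speed and are broader than this.
K = 8                                    # preferred directions per location
prefs = 2 * np.pi * np.arange(K) / K

def encode(vx, vy):
    theta = np.arctan2(vy, vx)           # local motion direction
    resp = np.cos(theta[..., None] - prefs)
    return np.maximum(resp, 0.0).ravel()  # shape (N*N*K,)

# --- Hebbian learning in the next layer ------------------------------------
M = 16                                   # number of higher-layer units
W = 0.01 * rng.standard_normal((M, N * N * K))
eta = 0.001

for step in range(5000):
    kind = rng.choice(["rotation", "dilation", "translation"])
    cx, cy = rng.uniform(0, N - 1, size=2)   # random singularity location
    x = encode(*flow_field(kind, cx, cy))
    y = W @ x
    # Oja-style Hebb rule: Hebbian growth with implicit weight normalization.
    # Units are updated independently (no decorrelation between them), which
    # is a simplification relative to the trained network in the paper.
    W += eta * (np.outer(y, x) - (y ** 2)[:, None] * W)

# After training, units can be probed with rotations and dilations centered
# at different locations to look for position-independent selectivity.
```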
Similar Articles
Emergence of Position-Independent Detectors of Sense of Rotation and Dilation with Hebbian Learning: An Analysis
We previously demonstrated that it is possible to learn position-independent responses to rotation and dilation by filtering rotations and dilations with different centers through an input layer with MT-like speed and direction tuning curves and connecting them to an MST-like layer with simple Hebbian synapses (Sereno and Sereno 1991). By analyzing an idealized version of the network with broade...
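The snippet above mentions MT-like speed and direction tuning curves without giving their form. A hedged illustration of such a tuning curve, Gaussian in direction and log-Gaussian in speed (the functional form and widths are assumptions, not the ones analyzed in the cited paper), might look like this:

```python
import numpy as np

def mt_response(vx, vy, pref_dir, pref_speed,
                dir_sigma=np.pi / 4, speed_sigma=1.0):
    """Response of one assumed MT-like unit to a local velocity (vx, vy)."""
    speed = np.hypot(vx, vy) + 1e-9
    direction = np.arctan2(vy, vx)
    # wrap the direction difference into [-pi, pi]
    ddir = np.angle(np.exp(1j * (direction - pref_dir)))
    dir_tuning = np.exp(-ddir ** 2 / (2 * dir_sigma ** 2))
    speed_tuning = np.exp(-np.log(speed / pref_speed) ** 2
                          / (2 * speed_sigma ** 2))
    return dir_tuning * speed_tuning

# Example: the same rotation sense produces different local directions at
# different receptive-field positions, so a unit like this responds in a
# position-dependent way even though downstream units may not.
print(mt_response(vx=0.0, vy=1.0, pref_dir=np.pi / 2, pref_speed=1.0))
```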
The Hebb Rule for Synaptic Plasticity: Algorithms and Implementations
In 1949 Donald Hebb published "The Organization of Behavior," in which he introduced several hypotheses about the neural substrate of learning and memory, including the Hebb learning rule or Hebb synapse. At that time very little was known about neural mechanisms of plasticity at the molecular and cellular levels. The primary data on which Hebb formulated his hypotheses were Golgi material, prov...
Active Learning in Recurrent Neural Networks Facilitated by a Hebb-like Learning Rule with Memory
We demonstrate in this article that a Hebb-like learning rule with memory paves the way for active learning in the context of recurrent neural networks. We compare active with passive learning, and a Hebb-like learning rule with and without memory, on a timing problem that the neural network has to learn. Moreover, we study the influence of the topology of the recurrent neural network. Our r...
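The rule itself is not given in this snippet; as a rough illustration only, a Hebb-like update driven by a decaying memory trace of pre/post co-activity in a small recurrent network could be sketched as follows (the tanh dynamics, constants, and decay term are all assumptions, not the cited rule):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20                                   # recurrent units
W = 0.1 * rng.standard_normal((n, n))
x = rng.standard_normal(n)

trace = np.zeros((n, n))                 # "memory" of past co-activity
eta, lam = 0.001, 0.9                    # learning rate, trace decay

for t in range(200):
    x = np.tanh(W @ x)                   # one step of the recurrent dynamics
    # Hebb-like update with memory: the weight change follows a decaying
    # trace of pre/post coincidences instead of only the current one.
    trace = lam * trace + np.outer(x, x)
    W += eta * trace
    W *= 0.999                           # mild decay to keep weights bounded
```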
Learning structured data from unspecific reinforcement
We show that a straightforward extension of a simple learning model based on the Hebb rule, the previously introduced Association-Reinforcement-Hebb-Rule, can cope with "delayed", unspecific reinforcement also in the case of structured data and lead to perfect generalization.
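The Association-Reinforcement-Hebb-Rule is not spelled out in this snippet. The sketch below only illustrates the general idea of a Hebbian update accumulated over a block of patterns and gated by a single delayed, unspecific (scalar) reinforcement signal; the teacher setup, block length, and reward criterion are illustrative assumptions, and no claim is made that this toy version reaches the perfect generalization reported in the cited work.

```python
import numpy as np

rng = np.random.default_rng(2)
d = 16
teacher = np.sign(rng.standard_normal(d))   # target rule to be learned
w = np.zeros(d)
eta = 0.05

for episode in range(2000):
    X = np.sign(rng.standard_normal((5, d)))     # a short block of patterns
    outputs = np.sign(X @ w + 1e-12)
    hebb = (outputs[:, None] * X).sum(axis=0)    # accumulated Hebbian terms
    # Delayed, unspecific reinforcement: one scalar for the whole block,
    # here +1 if most outputs matched the teacher and -1 otherwise.
    r = 1.0 if (outputs == np.sign(X @ teacher)).mean() > 0.5 else -1.0
    w += eta * r * hebb                          # reinforcement-gated Hebb step
```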
Modeling Hebb Learning Rule for Unsupervised Learning
This paper models the Hebb learning rule and proposes a neuron learning machine (NLM). The Hebb learning rule describes the plasticity of the connection between presynaptic and postsynaptic neurons and is itself unsupervised. It formulates the update gradient of the connecting weights in artificial neural networks. In this paper, we construct an objective function via modeling the He...
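The objective function used by the NLM is not given in the truncated snippet. As a generic illustration of deriving a Hebb-type update from an objective, the sketch below maximizes the output variance J(w) = 1/2 E[(w·x)^2]; its stochastic gradient is the plain Hebb term y·x, and an Oja-style term keeps the weights bounded. This is an assumption about the general technique, not the cited paper's construction.

```python
import numpy as np

rng = np.random.default_rng(3)

# Data with one dominant direction of variance.
C = np.diag([3.0, 1.0, 0.5])
X = rng.standard_normal((10000, 3)) @ np.sqrt(C)

# Objective J(w) = 0.5 * E[(w . x)^2]; its stochastic gradient is y * x,
# i.e. the plain Hebb update.  Oja's normalization term bounds ||w||.
w = rng.standard_normal(3)
eta = 0.01
for x in X:
    y = w @ x
    w += eta * (y * x - y * y * w)       # Hebb term + normalization

print(w)  # drifts (up to sign) toward the leading eigenvector, ~[1, 0, 0]
```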