Warping Similarity Space in Category Learning by BackProp Nets
Abstract
We report simulations with backpropagation networks trained to discriminate and then categorize a set of stimuli. The findings suggest a possible mechanism for categorical perception based on altering interstimulus similarity.
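A minimal sketch of the mechanism the abstract describes, with all particulars invented for illustration (a toy 1-4-1 sigmoid net, eight one-dimensional stimuli, a category boundary at 0.5 — none of this is the authors' actual setup): after category training by plain backprop, the hidden-layer distance between the two stimuli straddling the boundary tends to expand relative to within-category distances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Eight evenly spaced one-dimensional "stimuli"; category boundary at 0.5.
stimuli = np.linspace(0.05, 0.95, 8).reshape(-1, 1)
labels = (stimuli > 0.5).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny 1-4-1 net trained by plain backprop (gradient descent on squared error).
W1 = rng.normal(0, 1, (1, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def forward(x):
    h = sigmoid(x @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    return h, y

def hidden_distances(x):
    """Euclidean gaps between hidden representations of adjacent stimuli."""
    h, _ = forward(x)
    return np.linalg.norm(np.diff(h, axis=0), axis=1)

before = hidden_distances(stimuli)

lr = 2.0
for _ in range(20000):
    h, y = forward(stimuli)
    grad_y = (y - labels) * y * (1 - y)          # output delta
    grad_h = (grad_y @ W2.T) * h * (1 - h)       # hidden delta
    n = len(stimuli)
    W2 -= lr * h.T @ grad_y / n; b2 -= lr * grad_y.mean(0)
    W1 -= lr * stimuli.T @ grad_h / n; b1 -= lr * grad_h.mean(0)

after = hidden_distances(stimuli)
# Gap index 3 straddles the boundary; the other six gaps are within-category.
boundary, within = after[3], np.delete(after, 3)
print(f"boundary gap {boundary:.3f} vs mean within-category gap {within.mean():.3f}")
```

On typical runs the boundary gap grows while within-category gaps shrink, i.e. similarity space is warped around the category boundary.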
Similar resources
Warping Similarity Space in Category Learning by BackProp Net
We report simulations with backpropagation networks trained to discriminate and then categorize a set of stimuli. The findings suggest a possible mechanism for categorical perception based on altering interstimulus similarity.
Full text
Learning Many Related Tasks at the Same Time with Backpropagation
Hinton [6] proposed that generalization in artificial neural nets should improve if nets learn to represent the domain's underlying regularities. Abu-Mustafa's hints work [1] shows that the outputs of a backprop net can be used as inputs through which domain-specific information can be given to the net. We extend these ideas by showing that a backprop net learning many related tasks at the sam...
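The shared-representation idea in this snippet can be sketched with a hypothetical two-headed net (the tasks, sizes, and data below are invented for illustration, not taken from the paper): two related tasks share one hidden layer, and the gradients from both output heads are summed into it during backprop.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: both tasks depend on the same quantity (the sum of the inputs),
# so a shared hidden layer can represent their common structure.
X = rng.uniform(-1, 1, (64, 3))
s = X.sum(axis=1, keepdims=True)
Y1 = (s > 0).astype(float)   # task 1: classify the sign of the sum
Y2 = s                       # task 2: regress the sum itself

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One shared hidden layer, two task-specific output heads.
W_h = rng.normal(0, 0.5, (3, 6)); b_h = np.zeros(6)
W1 = rng.normal(0, 0.5, (6, 1)); b1 = np.zeros(1)   # head for task 1
W2 = rng.normal(0, 0.5, (6, 1)); b2 = np.zeros(1)   # head for task 2

def losses():
    h = sigmoid(X @ W_h + b_h)
    p1 = sigmoid(h @ W1 + b1)
    p2 = h @ W2 + b2
    return np.mean((p1 - Y1) ** 2), np.mean((p2 - Y2) ** 2)

start = sum(losses())
lr = 0.2
for _ in range(3000):
    h = sigmoid(X @ W_h + b_h)
    p1 = sigmoid(h @ W1 + b1)
    p2 = h @ W2 + b2
    g1 = (p1 - Y1) * p1 * (1 - p1)   # task-1 output delta (sigmoid head)
    g2 = (p2 - Y2)                   # task-2 output delta (linear head)
    # Gradients from both heads flow into the shared layer and are summed.
    gh = (g1 @ W1.T + g2 @ W2.T) * h * (1 - h)
    n = len(X)
    W1 -= lr * h.T @ g1 / n; b1 -= lr * g1.mean(0)
    W2 -= lr * h.T @ g2 / n; b2 -= lr * g2.mean(0)
    W_h -= lr * X.T @ gh / n; b_h -= lr * gh.mean(0)

end = sum(losses())
print(f"combined loss: {start:.3f} -> {end:.3f}")
```

The design point is that the extra task's gradient shapes the shared hidden layer, which is one way domain regularities can be communicated to the net.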
Full text
2 MECHANISMS OF MULTITASK BACKPROP
Hinton [6] proposed that generalization in artificial neural nets should improve if nets learn to represent the domain's underlying regularities. Abu-Mustafa's hints work [1] shows that the outputs of a backprop net can be used as inputs through which domain-specific information can be given to the net. We extend these ideas by showing that a backprop net learning many related tasks at the same tim...
Full text
Overfitting in Neural Nets: Backpropagation, Conjugate Gradient, and Early Stopping
The conventional wisdom is that backprop nets with excess hidden units generalize poorly. We show that nets with excess capacity generalize well when trained with backprop and early stopping. Experiments suggest two reasons for this: 1) Overfitting can vary significantly in different regions of the model. Excess capacity allows better fit to regions of high non-linearity, and backprop often avo...
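A hedged sketch of the early-stopping procedure this snippet refers to, using an invented overparameterized regression (random-feature model, synthetic data — not the paper's experiments): training halts once held-out error stops improving for a fixed patience window, and the best-validation weights are kept.

```python
import numpy as np

rng = np.random.default_rng(2)

# Overparameterized model: 30 narrow RBF features for a 1-D noisy linear
# target, far more capacity than the underlying trend needs.
def features(x):
    centers = np.linspace(-1, 1, 30)
    return np.exp(-((x[:, None] - centers) ** 2) / 0.02)

x_tr = rng.uniform(-1, 1, 40); y_tr = x_tr + rng.normal(0, 0.2, 40)
x_va = rng.uniform(-1, 1, 40); y_va = x_va + rng.normal(0, 0.2, 40)
F_tr, F_va = features(x_tr), features(x_va)

w = np.zeros(30)
best_w, best_val = w.copy(), np.inf
patience, bad, lr = 50, 0, 0.05

for epoch in range(5000):
    # One gradient-descent step on the training MSE.
    w -= lr * 2 * F_tr.T @ (F_tr @ w - y_tr) / len(y_tr)
    val = np.mean((F_va @ w - y_va) ** 2)
    if val < best_val:
        best_val, best_w, bad = val, w.copy(), 0
    else:
        bad += 1
        if bad >= patience:      # validation error stopped improving: stop
            break

final_train = np.mean((F_tr @ best_w - y_tr) ** 2)
print(f"stopped after {epoch + 1} epochs, best validation MSE {best_val:.3f}")
```

Keeping the excess capacity but returning the best-validation weights is the combination the snippet argues generalizes well.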
Full text