Search results for: perceptrons
Number of results: 1707
We investigate the network complexity of multi-layered perceptrons for solving exactly a given problem. We limit our study to the class of combinatorial optimization problems. It is shown how these problems can be reformulated as binary classification problems and how they can be solved by multi-layered perceptrons.
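As a rough illustration of such a reformulation (not the paper's construction), the decision version of a small knapsack-style problem can be cast as binary classification and solved exactly by a two-layer perceptron with hand-set weights; the instance, thresholds, and weights below are assumptions chosen only for the sketch.

```python
# Sketch: a knapsack decision problem as binary classification, solved exactly
# by a two-layer perceptron with hand-set weights (illustrative instance only).
import itertools
import numpy as np

weights = np.array([3, 5, 2, 4])    # item weights (assumed instance)
values  = np.array([4, 6, 3, 5])    # item values  (assumed instance)
CAPACITY, TARGET = 8, 9             # question: can we reach value >= 9 within capacity 8?

def step(z):
    return 1 if z >= 0 else 0

def mlp(x):
    # Hidden unit 1 fires iff the capacity constraint holds, unit 2 iff the
    # value target is reached; the output unit is an AND of the two.
    h1 = step(CAPACITY - weights @ x)
    h2 = step(values @ x - TARGET)
    return step(h1 + h2 - 2)

for bits in itertools.product([0, 1], repeat=4):
    x = np.array(bits)
    label = int(weights @ x <= CAPACITY and values @ x >= TARGET)
    assert mlp(x) == label          # the network classifies every input exactly
```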
Proper initialization is one of the most important prerequisites for fast convergence of feed-forward neural networks like high order and multilayer perceptrons. This publication aims at determining the optimal value of the initial weight variance (or range), which is the principal parameter of random weight initialization methods for both types of neural networks. An overview of random weight...
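A minimal sketch of the parameter in question follows; the 1/sqrt(fan_in) fallback is only a common heuristic, not the optimum the paper derives.

```python
import numpy as np

def init_weights(fan_in, fan_out, sigma=None, rng=None):
    """Draw one layer's initial weights.  `sigma` (the initial weight standard
    deviation) is the parameter whose optimal value the paper studies; the
    1/sqrt(fan_in) default below is a common heuristic, not that optimum."""
    rng = rng or np.random.default_rng(0)
    if sigma is None:
        sigma = 1.0 / np.sqrt(fan_in)
    return rng.normal(0.0, sigma, size=(fan_in, fan_out))

W1 = init_weights(64, 32)   # e.g., first layer of a 64-32-1 multilayer perceptron
```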
This paper presents a new technique for value prediction. It uses perceptrons, one of the simplest neural networks, for prediction of instruction output values. Perceptrons have been shown to be highly effective for conditional branch prediction. Current value predictors use two-bit saturating counters for value prediction and involve an exponential increase in hardware resources. This limits t...
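For context, the perceptron-predictor mechanism from the branch-prediction literature that such work builds on looks roughly like the sketch below; the table size, history length, and threshold constant are assumptions, and adapting it to value prediction is the paper's contribution, not shown here.

```python
import numpy as np

HIST_LEN, TABLE_SIZE = 16, 1024          # history length and table size (assumed)
THETA = int(1.93 * HIST_LEN + 14)        # training threshold used in perceptron branch predictors

table = np.zeros((TABLE_SIZE, HIST_LEN + 1), dtype=np.int32)   # one perceptron per entry
history = -np.ones(HIST_LEN, dtype=np.int32)                   # recent outcomes as +1 / -1

def predict(pc):
    w = table[pc % TABLE_SIZE]
    y = int(w[0] + w[1:] @ history)      # bias weight plus dot product with history
    return y >= 0, y

def update(pc, taken, y):
    t = 1 if taken else -1
    w = table[pc % TABLE_SIZE]
    if (y >= 0) != taken or abs(y) <= THETA:   # train on mispredictions and low-confidence hits
        w[0] += t
        w[1:] += t * history
    history[:] = np.roll(history, 1)
    history[0] = t
```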
A structure composed of local linear perceptrons for approximating global class discriminants is investigated. Such local linear models may be combined in a cooperative or competitive way. In the cooperative model, a weighted sum of the outputs of the local perceptrons is computed where the weight is a function of the distance between the input and the position of the local perceptron. In the c...
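A small sketch of the cooperative combination, assuming a softmax of negative squared distances as the distance-based weight (one plausible choice, not necessarily the paper's):

```python
import numpy as np

def cooperative_predict(x, centers, W, b, beta=1.0):
    """Cooperative combination of K local linear perceptrons.
    centers: (K, d) positions of the local models; W: (K, m, d); b: (K, m).
    Each model's contribution is weighted by a function of the distance
    between the input and that model's position."""
    d2 = np.sum((centers - x) ** 2, axis=1)      # squared distance to each local model
    g = np.exp(-beta * d2)
    g /= g.sum()                                 # distance-based mixing weights
    local = np.einsum('kmd,d->km', W, x) + b     # outputs of the K local perceptrons
    return g @ local                             # weighted sum of local outputs

# Tiny usage example with random parameters (illustrative only)
rng = np.random.default_rng(0)
K, d, m = 3, 4, 2
y = cooperative_predict(rng.normal(size=d), rng.normal(size=(K, d)),
                        rng.normal(size=(K, m, d)), rng.normal(size=(K, m)))
```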
We consider the sample complexity of concept learning when we classify by using a fixed Boolean function of the outputs of a number of different classifiers. Here, we take into account the ‘margins’ of each of the constituent classifiers. A special case is that in which the constituent classifiers are linear threshold functions (or perceptrons) and the fixed Boolean function is the majority fun...
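The special case mentioned here can be sketched directly; the parameter shapes are placeholders, and the abstract's margin-based sample-complexity analysis is not reproduced.

```python
import numpy as np

def majority_of_perceptrons(x, W, b):
    """Classify x by a majority vote over K linear threshold functions
    (perceptrons).  W: (K, d), b: (K,) are placeholder parameters."""
    votes = (W @ x + b >= 0).astype(int)       # each perceptron's 0/1 decision
    return int(votes.sum() * 2 > len(votes))   # 1 iff a strict majority votes 1
```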
We describe a system of thousands of binary perceptrons with coarse-oriented edges as input that is able to recognize shapes, even in a context with hundreds of classes. The perceptrons have randomized feedforward connections from the input layer and form a recurrent network among themselves. Each class is represented by a prelearned attractor (serving as an associative hook) in the recurrent n...
For many neural network models that are based on perceptrons, the number of activity patterns that can be classified is limited by the number of plastic connections that each neuron receives, even when the total number of neurons is much larger. This poses the problem of how the biological brain can take advantage of its huge number of neurons given that the connectivity is extremely sparse, es...