Search results for: single layer perceptron
Number of results: 1,125,882
We present a detailed experimental comparison of the pocket algorithm, the thermal perceptron, and the barycentric correction procedure, the algorithms most commonly used for training threshold logic units (TLUs). Each of these algorithms is a stable variant of the standard perceptron learning rule, in that it guarantees convergence to zero classification errors on datasets that are linearly...
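As a point of reference for the comparison above, a minimal sketch of the pocket algorithm, one of the stable perceptron variants mentioned, might look as follows; the function name, data layout, and hyperparameters are illustrative assumptions rather than the experimental setup of the cited work.

```python
# Minimal sketch of the pocket algorithm (a stable perceptron variant).
# Data layout and hyperparameters are illustrative assumptions.
import numpy as np

def pocket_train(X, y, epochs=100, lr=1.0, rng=None):
    """Train a threshold logic unit with the pocket algorithm.

    X: (n_samples, n_features) inputs; y: labels in {-1, +1}.
    The best weight vector seen so far is kept in the "pocket",
    so the returned weights never get worse on non-separable data.
    """
    rng = rng or np.random.default_rng(0)
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias input
    w = np.zeros(Xb.shape[1])
    pocket_w, pocket_errors = w.copy(), len(Xb)

    for _ in range(epochs):
        for i in rng.permutation(len(Xb)):
            if np.sign(Xb[i] @ w) != y[i]:       # misclassified point
                w = w + lr * y[i] * Xb[i]        # standard perceptron step
                errors = np.sum(np.sign(Xb @ w) != y)
                if errors < pocket_errors:       # keep the best weights so far
                    pocket_w, pocket_errors = w.copy(), errors
    return pocket_w
```

The key design choice is that the pocket weights are only replaced when a candidate weight vector misclassifies strictly fewer training points, so the reported solution never degrades even when the data are not linearly separable.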
In this paper, the geometric formulation of the single-layer perceptron weight-optimization problem previously described by Coetzee et al. (1993, 1996) is combined with results from other researchers on nonconvex set projections to describe sufficient conditions for the uniqueness of weight solutions. It is shown that the perceptron data surface is pseudoconvex and has infinite folding, allowing for...
A general problem in model selection is to obtain the right parameters that make a model fit observed data. If the model selected is a Multilayer Perceptron (MLP) trained with Backpropagation (BP), it is necessary to find appropriate initial weights and learning parameters. This paper proposes a method that combines Simulated Annealing (SimAnn) and BP to train MLPs with a single hidden layer, termed...
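A compact sketch of the idea described above is given below: simulated annealing searches for promising initial weights of a single-hidden-layer MLP, and ordinary backpropagation then refines them. The network size, cooling schedule, loss function, and step counts are illustrative assumptions, not the settings of the cited method.

```python
# Hedged sketch: simulated annealing to pick initial weights, then
# backpropagation (gradient descent on the MSE) to refine them.
import numpy as np

rng = np.random.default_rng(0)

def forward(W1, W2, X):
    h = np.tanh(X @ W1)            # single hidden layer with tanh units
    return h, h @ W2               # linear output unit

def mse(W1, W2, X, y):
    _, out = forward(W1, W2, X)
    return float(np.mean((out - y) ** 2))

def anneal_then_backprop(X, y, hidden=5, sa_steps=300, bp_steps=1000, lr=0.05):
    """X: (n, d) inputs, y: (n, 1) targets. Returns trained (W1, W2)."""
    d = X.shape[1]
    W1 = rng.normal(scale=0.5, size=(d, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    best, best_loss = (W1, W2), mse(W1, W2, X, y)
    cur_loss, T = best_loss, 1.0

    # Phase 1: simulated annealing over the weights (random perturbations).
    for _ in range(sa_steps):
        c1 = W1 + rng.normal(scale=0.1, size=W1.shape)
        c2 = W2 + rng.normal(scale=0.1, size=W2.shape)
        loss = mse(c1, c2, X, y)
        if loss < cur_loss or rng.random() < np.exp((cur_loss - loss) / T):
            W1, W2, cur_loss = c1, c2, loss      # accept the move
            if loss < best_loss:
                best, best_loss = (c1, c2), loss
        T *= 0.99                                # geometric cooling

    # Phase 2: backpropagation starting from the best weights found by SA.
    W1, W2 = best
    for _ in range(bp_steps):
        h, out = forward(W1, W2, X)
        err = 2 * (out - y) / len(X)             # dMSE/dout
        dW2 = h.T @ err
        dW1 = X.T @ ((err @ W2.T) * (1 - h ** 2))  # backprop through tanh
        W1, W2 = W1 - lr * dW1, W2 - lr * dW2
    return W1, W2
```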
Memristors are memory resistors that promise the efficient implementation of synaptic weights in artificial neural networks. Whereas demonstrations of the synaptic operation of memristors already exist, the implementation of even simple networks is more challenging and has yet to be reported. Here we demonstrate pattern classification using a single-layer perceptron network implemented with a m...
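For orientation, a purely software sketch of the kind of single-layer perceptron that a memristive crossbar can realise is shown below; the weight matrix W plays the role of the memristor conductances in hardware. The patterns, class labels, learning rate, and epoch count are illustrative assumptions and are not the data or parameters of the cited work.

```python
# Software analogue of a single-layer perceptron classifier; in a
# memristive implementation the entries of W would be conductances.
import numpy as np

# Three 3x3 binary patterns (flattened); purely illustrative inputs.
patterns = {
    "p1": [1, 1, 1, 0, 1, 0, 1, 1, 1],
    "p2": [1, 0, 1, 1, 0, 1, 0, 1, 0],
    "p3": [1, 0, 1, 1, 1, 1, 1, 0, 1],
}
X = np.array(list(patterns.values()), dtype=float)
Y = np.eye(len(patterns))                      # one output unit per class

Xb = np.hstack([X, np.ones((len(X), 1))])      # append bias input
W = np.zeros((Xb.shape[1], Y.shape[1]))        # "conductance" matrix

for _ in range(50):                            # batch perceptron-style updates
    out = (Xb @ W > 0).astype(float)           # hard-threshold outputs
    W += 0.1 * Xb.T @ (Y - out)                # adjust the weights

print((Xb @ W).argmax(axis=1))                 # predicted class index per pattern
```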
The notion of linear separability is widely used in machine learning research. Learning algorithms that rely on this concept include neural networks (the Single Layer Perceptron and the Recursive Deterministic Perceptron) and kernel machines (Support Vector Machines). Several algorithms for testing linear separability exist. Some of these methods are computationally intensive. Also, several of them...
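One widely used family of tests reduces the question to a linear-programming feasibility problem; a hedged sketch of that idea is given below. The formulation, the use of scipy.optimize.linprog, and the margin-1 convention are assumptions for illustration and are not necessarily among the specific methods surveyed in the cited paper.

```python
# Linear separability test via an LP feasibility problem:
# find (w, b) such that y_i * (w . x_i + b) >= 1 for all i.
import numpy as np
from scipy.optimize import linprog

def linearly_separable(X, y):
    """Return True if the labels y in {-1, +1} are linearly separable on X."""
    n, d = X.shape
    # Variables: w (d entries) and b (1 entry).
    # Constraints rewritten as: -y_i * (x_i . w + b) <= -1.
    A_ub = -y[:, None] * np.hstack([X, np.ones((n, 1))])
    b_ub = -np.ones(n)
    res = linprog(c=np.zeros(d + 1), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (d + 1), method="highs")
    return res.status == 0          # 0: feasible solution found, 2: infeasible
```

Because only feasibility matters, the objective is identically zero; the LP solver either returns a separating hyperplane or reports infeasibility, which certifies that no such hyperplane exists.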
The work presents the results of an investigation conducted to compare the performance of the Multi-Layer Perceptron (MLP) and the Nearest Neighbor (NN) classifier on the handwritten numeral recognition problem. The comparison is drawn in terms of the recognition performance and the computational requirements of the individual classifiers. The results show that a two-layer perceptron performs com...
In this paper, the application of neural networks to the design of short-term load forecasting (STLF) systems for Illam state, located in the west of Iran, is explored. One important neural network architecture, the Multi-Layer Perceptron (MLP), is used to model STLF systems. The MLP was trained and tested using three years of data (2004-2006). The results show that MLP networ...
This paper analyzes the behavior of a variety of tracking algorithms for single-layer threshold networks in the presence of random drift. We use a system-identification setting in which a target network, whose weights change slowly, is followed by a tracking network. Tracking algorithms are divided into conservative and nonconservative algorithms. For a given random drift rate, we find upper bounds for the gene...
We consider two-layered perceptrons consisting of N binary input units, K binary hidden units, and one binary output unit, in the limit N ≫ K ≥ 1. We prove that the weights of a regular irreducible network are uniquely determined by its input-output map, up to some obvious global symmetries. A network is regular if its K weight vectors from the input layer to the K hidden units are linearly...
A novel speech separation structure that simulates the cocktail party effect using a modified iterative Wiener filter and a multi-layer perceptron neural network is presented. The neural network is used as a speaker recognition system to control the iterative Wiener filter. The neural network is a modified perceptron with a hidden layer, using feature data extracted from LPC cepstral analysis. The pro...
[Chart: number of search results per year]