Search results for: single layer perceptron

Number of results: 1125882

2000
Rajesh Parekh Jihoon Yang Vasant Honavar

We present a detailed experimental comparison of the pocket algorithm, thermal perceptron, and barycentric correction procedure, which are among the most commonly used algorithms for training threshold logic units (TLUs). Each of these algorithms represents a stable variant of the standard perceptron learning rule, in that they guarantee convergence to zero classification errors on datasets that are linearl...
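The pocket algorithm mentioned above can be sketched in a few lines: it runs the ordinary perceptron update, but additionally "pockets" the weight vector that has produced the fewest training errors so far, so it degrades gracefully on non-separable data. This is a minimal illustrative sketch (function names, the learning rate, and the epoch budget are my own choices, not from the paper):

```python
import random

def pocket_train(X, y, epochs=2000, lr=1.0, seed=0):
    """Pocket algorithm for a single threshold logic unit (TLU).

    X: list of feature tuples; y: labels in {-1, +1}.
    Runs the standard perceptron rule, but keeps ("pockets") the
    weight vector with the fewest training errors seen so far.
    """
    random.seed(seed)
    n = len(X[0])
    w = [0.0] * (n + 1)            # n weights plus a bias (last entry)

    def predict(wv, x):
        s = sum(wi * xi for wi, xi in zip(wv, x)) + wv[-1]
        return 1 if s >= 0 else -1

    def count_errors(wv):
        return sum(1 for x, t in zip(X, y) if predict(wv, x) != t)

    pocket_w, pocket_errors = w[:], len(X) + 1
    for _ in range(epochs):
        i = random.randrange(len(X))        # pick a random training example
        x, t = X[i], y[i]
        if predict(w, x) != t:
            # standard perceptron update on a misclassified example
            for j in range(n):
                w[j] += lr * t * x[j]
            w[-1] += lr * t
            e = count_errors(w)
            if e < pocket_errors:           # better than the pocket? keep it
                pocket_errors, pocket_w = e, w[:]
    return pocket_w, pocket_errors
```

On linearly separable data (e.g. the Boolean AND function) the pocketed error count reaches zero, matching the convergence guarantee the abstract refers to; on non-separable data the pocket simply retains the best hypothesis encountered.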

Journal: :IEEE transactions on neural networks 1996
Frans Coetzee Virginia L. Stonick

In this paper the geometric formulation of the single layer perceptron weight optimization problem previously described by Coetzee et al. (1993, 1996) is combined with results from other researchers on nonconvex set projections to describe sufficient conditions for uniqueness of weight solutions. It is shown that the perceptron data surface is pseudoconvex and has infinite folding, allowing for...

1998
A. Prieto V. Rivas

A general problem in model selection is to obtain the right parameters that make a model fit observed data. If the model selected is a Multilayer Perceptron (MLP) trained with Backpropagation (BP), it is necessary to find appropriate initial weights and learning parameters. This paper proposes a method that combines Simulated Annealing (SimAnn) and BP to train MLPs with a single hidden layer, terme...

Journal: :Nature communications 2013
Fabien Alibart Elham Zamanidoost Dmitri B Strukov

Memristors are memory resistors that promise the efficient implementation of synaptic weights in artificial neural networks. Whereas demonstrations of the synaptic operation of memristors already exist, the implementation of even simple networks is more challenging and has yet to be reported. Here we demonstrate pattern classification using a single-layer perceptron network implemented with a m...

2007
David A. Elizondo Juan Miguel Ortiz-de-Lazcano-Lobato Ralph Birkenhead

The notion of linear separability is widely used in machine learning research. Learning algorithms that use this concept include neural networks (the Single Layer Perceptron and the Recursive Deterministic Perceptron) and kernel machines (Support Vector Machines). Several algorithms for testing linear separability exist. Some of these methods are computationally intensive. Also, several of them...
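One simple separability test alluded to above exploits the perceptron convergence theorem: if the data are linearly separable, the plain perceptron rule reaches zero errors in finitely many updates. A sketch, with the caveat that hitting the epoch cap only *suggests* non-separability (an exact test would use linear programming); the function name and epoch cap are my own:

```python
def linearly_separable(X, y, max_epochs=1000):
    """Heuristic linear-separability test via the perceptron rule.

    X: list of feature tuples; y: labels in {-1, +1}.
    Returns True if a separating hyperplane is found (a proof of
    separability); False if the epoch cap is exhausted first, which
    merely suggests the data are not linearly separable.
    """
    n = len(X[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(max_epochs):
        errors = 0
        for x, t in zip(X, y):
            s = sum(wi * xi for wi, xi in zip(w, x)) + b
            if (1 if s >= 0 else -1) != t:
                errors += 1
                for j in range(n):      # perceptron update
                    w[j] += t * x[j]
                b += t
        if errors == 0:                 # one clean pass: separable
            return True
    return False
```

For example, the Boolean AND labeling of the unit square is separable, while the XOR labeling is not, which is the classic case motivating multilayer networks.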

Journal: :J. Inf. Sci. Eng. 2005
Kaushik Roy Chitrita Chaudhuri Mahantapas Kundu Mita Nasipuri Dipak Kumar Basu

The work presents the results of an investigation conducted to compare the performances of the Multi Layer Perceptron (MLP) and the Nearest Neighbor (NN) classifier for the handwritten numeral recognition problem. The comparison is drawn in terms of the recognition performance and the computational requirements of the individual classifiers. The results show that a two-layer perceptron performs com...

2009
Mohsen Hayati Yazdan Shirvany

In this paper, the application of neural networks to the design of short-term load forecasting (STLF) systems for Illam state, located in the west of Iran, was explored. One important neural network architecture, the Multi-Layer Perceptron (MLP), was used to model STLF systems. The MLP was trained and tested using three years (2004-2006) of data. The results show that MLP networ...

1998
Anthony Kuh

This paper analyzes the behavior of a variety of tracking algorithms for single-layer threshold networks in the presence of random drift. We use a system identification model to describe a target network, whose weights slowly change, and a tracking network. Tracking algorithms are divided into conservative and nonconservative algorithms. For a random drift rate of , we find upper bounds for the gene...

Journal: :International journal of neural systems 1995
Marcelo Blatt Eytan Domany Ido Kanter

We consider two-layered perceptrons consisting of N binary input units, K binary hidden units and one binary output unit, in the limit N >> K >= 1. We prove that the weights of a regular irreducible network are uniquely determined by its input-output map up to some obvious global symmetries. A network is regular if its K weight vectors from the input layer to the K hidden units are linearly...

1997
Yuchang Cao Sridha Sridharan Miles Moody

A novel speech separation structure which simulates the cocktail party effect using a modified iterative Wiener filter and a multi-layer perceptron neural network is presented. The neural network is used as a speaker recognition system to control the iterative Wiener filter. The neural network is a modified perceptron with a hidden layer using feature data extracted from LPC cepstral analysis. The pro...

Chart of the number of search results per year

Click on the chart to filter the results by publication year.