Search results for: Perceptron

Number of results: 8752

Journal: Pattern Recognition Letters 1998
Sheng-De Wang Tsong-Chih Hsu

It is well known that if a Boolean function is expressed in sum-of-products form, it can be implemented with one level of AND gates followed by an OR gate. We prove that if each desired output of a binary function is expressed in sum-of-products form, it can be implemented with one layer of perceptron nodes followed by a single perceptron node.
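
An illustrative sketch of the kind of construction this abstract describes: each product term of a sum-of-products expression becomes one threshold (perceptron) node, and a final threshold node plays the role of the OR. The example function, weights, and biases below are chosen for illustration and are not taken from the paper.

```python
# Minimal sketch (not the paper's construction): realizing a sum-of-products
# Boolean function with one layer of "AND" perceptrons feeding an "OR" perceptron.
# Example function (hypothetical): f(x1, x2, x3) = x1*x2 + (not x1)*x3

def perceptron(weights, bias, inputs):
    """Threshold unit: fires (1) iff the weighted sum plus bias is >= 0."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias >= 0 else 0

def and_term(literals, inputs):
    """One hidden perceptron per product term.
    literals: list of (index, polarity); polarity=1 for x_i, 0 for (not x_i)."""
    weights = [0] * len(inputs)
    bias = 0
    for i, pol in literals:
        if pol:                 # uncomplemented literal x_i
            weights[i] = 1
            bias -= 1           # require x_i = 1
        else:                   # complemented literal (not x_i)
            weights[i] = -1     # require x_i = 0
    return perceptron(weights, bias, inputs)

def f(inputs):
    terms = [[(0, 1), (1, 1)],      # x1*x2
             [(0, 0), (2, 1)]]      # (not x1)*x3
    hidden = [and_term(t, inputs) for t in terms]
    # The output perceptron acts as an OR over the hidden units.
    return perceptron([1] * len(hidden), -1, hidden)

for x in [(1, 1, 0), (0, 0, 1), (0, 1, 0)]:
    print(x, f(x))   # expect 1, 1, 0
```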

Journal: Electronics Letters 2002

Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 2010

2006
Chris Fleizach

Since the perceptron was invented in 1957 by Rosenblatt, a large number of variants of the perceptron algorithm have been proposed. For this project, the main task was to find perceptron variants that worked well on the two different types of data sets, OCR and Netflix. Two variants, voted perceptron and longest survivor perceptron, were examined. Both variants were tested with several combinat...

2004
David Tarjan Kevin Skadron

We introduce a new kind of branch predictor, the hashed perceptron predictor, which merges the concepts behind the gshare and perceptron branch predictors. This is done by fetching the perceptron weights using the exclusive-or of branch addresses and branch history. This predictor can achieve superior accuracy to a path-based and a global perceptron predictor, previously the most accurate fully...
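
A rough sketch of the indexing idea this abstract describes: small weight tables are indexed by XOR-ing the branch address with segments of the global history, and the selected weights are summed to form the prediction. Table count, table size, segment width, and training threshold below are illustrative assumptions, not the authors' parameters.

```python
# Minimal sketch of a hashed-perceptron-style predictor (weight saturation omitted).

NUM_TABLES   = 8          # one table per history segment (assumed)
TABLE_SIZE   = 1024       # entries per table (assumed)
SEGMENT_BITS = 8          # history bits folded into each hash (assumed)
THRESHOLD    = 20         # training threshold, as in perceptron predictors

tables = [[0] * TABLE_SIZE for _ in range(NUM_TABLES)]
history = 0               # global branch history register (1 bit per branch)

def indices(pc):
    """Hash the PC with successive history segments (gshare-style XOR)."""
    idx = []
    for t in range(NUM_TABLES):
        seg = (history >> (t * SEGMENT_BITS)) & ((1 << SEGMENT_BITS) - 1)
        idx.append((pc ^ seg) % TABLE_SIZE)
    return idx

def predict(pc):
    s = sum(tables[t][i] for t, i in enumerate(indices(pc)))
    return s >= 0, s

def update(pc, taken):
    global history
    pred, s = predict(pc)
    if pred != taken or abs(s) <= THRESHOLD:      # perceptron training rule
        for t, i in enumerate(indices(pc)):
            tables[t][i] += 1 if taken else -1
    history = ((history << 1) | int(taken)) & ((1 << (NUM_TABLES * SEGMENT_BITS)) - 1)
```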

Journal: International Journal of Quantum Information 2019

Journal: SIAM Journal on Optimization 2012

Journal: Physical Review E, Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics 1994
Carmesin

An adaptive perceptron with multilinear couplings is introduced. While an adaptive perceptron exhibits severe shortcomings if it is applied to complex tasks, this is not so for the adaptive multilinear perceptron.

Journal: CoRR 2010
Raphael Pelossof Zhiliang Ying

We propose a focus of attention mechanism to speed up the Perceptron algorithm. Focus of attention speeds up the Perceptron algorithm by lowering the number of features evaluated throughout training and prediction. Whereas the traditional Perceptron evaluates all the features of each example, the Attentive Perceptron evaluates fewer features for easy to classify examples, thereby achieving signi...
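
One way the early-exit idea described above might look in code (the block size, exit margin, and prediction-only scope of the sketch are assumptions, not the authors' algorithm):

```python
import numpy as np

def attentive_predict(w, x, exit_margin=2.0, block=10):
    """Score features in blocks; stop once the partial score is far from the boundary."""
    score = 0.0
    for start in range(0, len(w), block):
        score += float(np.dot(w[start:start + block], x[start:start + block]))
        if abs(score) > exit_margin:          # easy example: skip remaining features
            break
    return 1 if score >= 0 else -1

# Toy usage with random weights and features.
rng = np.random.default_rng(0)
w = rng.normal(size=100)
x = rng.normal(size=100)
print(attentive_predict(w, x))
```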

2005
Shai Shalev-Shwartz Yoram Singer

We present a generalization of the Perceptron algorithm. The new algorithm performs a Perceptron-style update whenever the margin of an example is smaller than a predefined value. We derive worst case mistake bounds for our algorithm. As a byproduct we obtain a new mistake bound for the Perceptron algorithm in the inseparable case. We describe a multiclass extension of the algorithm. This exten...
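
A minimal sketch of the margin-triggered update rule this abstract describes: a Perceptron-style step is taken whenever the signed margin falls below a predefined value. The margin gamma, learning rate eta, epoch count, and data layout below are illustrative assumptions.

```python
import numpy as np

def margin_perceptron(X, y, gamma=1.0, eta=1.0, epochs=10):
    """X: (n_samples, n_features) array; y: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            if y_i * np.dot(w, x_i) < gamma:   # margin too small -> update
                w += eta * y_i * x_i           # standard Perceptron step
    return w

# Usage on a toy separable problem:
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = margin_perceptron(X, y)
print(np.sign(X @ w))   # expect [ 1.  1. -1. -1.]
```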

[Chart: number of search results per year]