Search results for: perceptrons

Number of results: 1707

Journal: Neural Networks 1995
Vera Kurková

We examine the effect of constraining the number of hidden units. For one-hidden-layer networks with a fairly general type of units, including perceptrons with any bounded activation function and radial basis function units, we show that when also the size of parameters is bounded, the best approximation property is satisfied, which means that there always exists a parameterization achieving the global ...

Journal: Neurocomputing 2016
Vera Kurková Marcello Sanguineti

Model complexities of shallow (i.e., one-hidden-layer) networks representing highly varying multivariable {−1,1}-valued functions are studied in terms of variational norms tailored to dictionaries of network units. It is shown that bounds on these norms define classes of functions computable by networks with constrained numbers of hidden units and sizes of output weights. Estimates of probabili...
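
For context, the variational norm mentioned here (often called G-variation, where G is the dictionary of network units) is standardly defined as below; this definition is supplied for the reader and is not quoted from the abstract.

```latex
% Standard definition of G-variation (added for context, not from the abstract):
% the smallest scaling c such that f/c lies in the closed convex symmetric hull
% of the dictionary G of network units.
\[
  \|f\|_{G} \;=\; \inf\Bigl\{\, c > 0 \;:\; \tfrac{f}{c} \in
  \operatorname{cl}\,\operatorname{conv}\bigl(G \cup -G\bigr) \Bigr\}
\]
```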

2005
Michal Sindlar Marco Wiering

We study the use of multi-layer perceptrons in applying artificial learning to the recognition of emotional expressions from frontal images of human faces. The perceptrons are trained using per-pixel luma data from the images’ mouth and eye areas, and map the inputs to one of 6 emotions. We compare 3 different methods for processing input information: 1) one network module for all inputs; 2) one...
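
As a purely illustrative sketch of the first setup (one network module for all inputs), the snippet below trains a single MLP on flattened luma features and predicts one of six emotion labels; the crop size, hidden-layer width, and random data are assumptions, not the paper's configuration.

```python
# Illustrative sketch only: a single MLP mapping per-pixel luma features
# (mouth and eye crops flattened into one vector) to one of 6 emotions.
# Shapes, hyperparameters, and data here are assumptions, not the paper's.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_samples, n_pixels = 200, 32 * 32            # hypothetical crop size
X = rng.random((n_samples, n_pixels))         # luma values in [0, 1]
y = rng.integers(0, 6, size=n_samples)        # 6 emotion classes (0..5)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X, y)
print(clf.predict(X[:5]))
```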

2000
Kwang-Ju Lee Byoung-Tak Zhang

A method for evolving behavior-based robot controllers using genetic programming is presented. Due to their hierarchical nature, genetic programs are useful for representing high-level knowledge for robot controllers. One drawback is the difficulty of incorporating sensory inputs. To overcome the gap between symbolic representation and direct sensor values, the elements of the function set in genet...

2016
Hao Shen

Despite the recent success of deep neural networks in various applications, designing and training deep neural networks is still among the greatest challenges in the field. In this work, we address the challenge of designing and training feedforward Multilayer Perceptrons (MLPs) from a smooth optimisation perspective. By characterising the critical point conditions of an MLP-based loss function...

2008
Long Li Jie Yang Yan Liu Wei Wu

A learning algorithm based on a fuzzy δ rule is proposed for a fuzzy perceptron with the same topological structure as conventional linear perceptrons. The inner operations involved in the working process of this fuzzy perceptron are based on the max-min logical operations rather than conventional multiplication and summation etc. The initial values of the network weights are fixed as 1. Each v...
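
To illustrate the max-min arithmetic described above (a sketch of the idea only, not the authors' algorithm), a forward pass that replaces multiplication with fuzzy min and summation with max, with the weights initialised to 1, could look like:

```python
# Sketch of a max-min fuzzy perceptron forward pass; the fuzzy delta-rule
# weight update is omitted. Inputs and weights are assumed to lie in [0, 1].
import numpy as np

def fuzzy_forward(x, w):
    """Return max_i min(w_i, x_i) instead of the usual weighted sum."""
    return np.max(np.minimum(w, x))

x = np.array([0.2, 0.9, 0.5])   # fuzzy-valued inputs
w = np.ones_like(x)             # initial weights fixed at 1, as in the abstract
print(fuzzy_forward(x, w))      # -> 0.9
```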

2007
Nikolai K. Vereshchagin

In the first part of the paper we prove that, relative to a random oracle, the class NP has infinite sets having no infinite Co-NP-subsets (Co-NP-immune sets). In the second part we prove that perceptrons separating Boolean matrices in which each row has a one from matrices in which many rows (say 99% of them) have no ones must have large size or large order. This result partially strengthens one-i...

2010
Frauke Günther

Artificial neural networks are applied in many situations. neuralnet is built to train multi-layer perceptrons in the context of regression analyses, i.e. to approximate functional relationships between covariates and response variables. Thus, neural networks are used as extensions of generalized linear models. neuralnet is a very flexible package. The backpropagation algorithm and three versio...
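
neuralnet itself is an R package, so no R code is reproduced here; the Python sketch below (using scikit-learn, not the neuralnet API) merely illustrates the same idea of approximating a functional relationship between covariates and a response with a small MLP.

```python
# Illustrative only: MLP regression approximating a covariate-response
# relationship, in the spirit of the neuralnet package (this is scikit-learn,
# not neuralnet's API).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(500)   # noisy target

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=1)
model.fit(X, y)
print(model.predict([[0.0], [1.5]]))
```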

2010
Albert Bifet Geoff Holmes Bernhard Pfahringer Eibe Frank

Mining of data streams must balance three evaluation dimensions: accuracy, time and memory. Excellent accuracy on data streams has been obtained with Naive Bayes Hoeffding Trees—Hoeffding Trees with naive Bayes models at the leaf nodes—albeit with increased runtime compared to standard Hoeffding Trees. In this paper, we show that runtime can be reduced by replacing naive Bayes with perceptron c...
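
The change at the leaves can be pictured as replacing the naive Bayes model with an online perceptron that is updated on every instance routed to that leaf; the sketch below is a hypothetical minimal leaf classifier, not the authors' implementation.

```python
# Hypothetical minimal per-leaf online perceptron for binary labels {0, 1}.
# In a Hoeffding Tree each leaf would hold one such model and update it
# incrementally as instances arrive (illustration only, not the paper's code).
import numpy as np

class LeafPerceptron:
    def __init__(self, n_features, lr=0.1):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        return 1 if np.dot(self.w, x) + self.b > 0 else 0

    def update(self, x, y):
        error = y - self.predict(x)          # perceptron-style correction
        self.w += self.lr * error * np.asarray(x, dtype=float)
        self.b += self.lr * error

leaf = LeafPerceptron(n_features=3)
leaf.update([1.0, 0.5, -0.2], 1)
print(leaf.predict([1.0, 0.5, -0.2]))
```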

Journal: Lecture Notes in Computer Science 2021

In this paper we investigate the relationships between a multipreferential semantics for defeasible reasoning in knowledge representation and a deep neural network model. Weighted knowledge bases for description logics are considered under a “concept-wise” multipreference semantics. The semantics is further extended to fuzzy interpretations and exploited to provide a preferential interpretation of Multilayer Perceptrons, some cond...

Chart: number of search results per year