Search results for: perceptrons
Number of results: 1707
Most properties of learning and generalization in linear perceptrons can be derived from the average response function G. We present a method for calculating G using only simple matrix identities and partial differential equations. Using this method, we first rederive the known result for G in the thermodynamic limit of perceptrons of infinite size N, which has previously been calculated using repl...
Significant advances in speech separation have been made by formulating it as a classification problem, where the desired output is the ideal binary mask (IBM). Previous work does not explicitly model the correlation between neighboring time-frequency units, and standard binary classifiers are used. As one of the most important characteristics of the speech signal is its temporal dynamics, the IBM c...
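The ideal binary mask mentioned above labels each time-frequency unit 1 when the local speech-to-noise ratio exceeds a criterion, else 0. A minimal sketch, assuming premixed speech and noise power spectrograms and an illustrative 0 dB criterion (the shapes and threshold are assumptions, not from the abstract):

```python
import numpy as np

def ideal_binary_mask(speech_power, noise_power, lc_db=0.0):
    """Return the IBM: 1 where the local SNR (dB) exceeds the criterion lc_db."""
    snr_db = 10.0 * np.log10(speech_power / np.maximum(noise_power, 1e-12))
    return (snr_db > lc_db).astype(np.float32)

# toy example: 2 frequency bins x 3 time frames of power values
speech = np.array([[4.0, 1.0, 9.0], [0.5, 8.0, 0.1]])
noise = np.array([[1.0, 2.0, 1.0], [2.0, 1.0, 1.0]])
mask = ideal_binary_mask(speech, noise)
```

In practice the IBM is computable only when speech and noise are available separately before mixing, which is why it serves as a training target rather than a runtime quantity.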
The Feedforward Multilayer Perceptrons network is a widely used Artificial Neural Network model, trained with the backpropagation algorithm on real-world data. There are two common ways to construct a Feedforward Multilayer Perceptrons network: either take a large network and prune away the irrelevant nodes, or start from a small network and add new relevant nodes. An Arti...
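A minimal sketch of the feedforward-plus-backpropagation setup these abstracts refer to; the layer sizes, learning rate, and XOR task are illustrative assumptions, not details from any of the papers:

```python
import numpy as np

rng = np.random.default_rng(0)
# XOR: a classic task a single perceptron cannot solve but an MLP can
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: squared-error gradient propagated through the sigmoids
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

loss = float(np.mean((out - y) ** 2))
```

This is plain full-batch gradient descent; the slowness of this search is exactly the inefficiency that the constructive (growing/pruning) approaches in the abstracts try to address.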
Title of Document: APPLYING PERCEPTRONS TO SPECULATION IN COMPUTER ARCHITECTURE Michael David Black, Ph.D., 2007 Directed By: Professor Manoj Franklin, Electrical and Computer Engineering Speculation plays an ever-increasing role in optimizing the execution of programs in computer architecture. Speculative decision-makers are typically required to have high speed and small size, thus limiting t...
This paper investigates the possibility of describing vowels phonetically using an automated method. Models of the phonetic dimensions of the vowel space are built using two multi-layer perceptrons trained on eight cardinal vowels. The paper aims to improve the positioning of vowels in the open-close dimension by experimenting with a model parameter that contro...
Worst-case errors in linear and neural-network approximation are investigated in a more general framework of fixed versus variable-basis approximation. Such errors are compared for balls in certain norms, tailored to sets of variable-basis functions. The tools for estimation of rates of variable-basis approximation are applied to sets of functions either computable by perceptrons with periodic or...
There has been an increasing interest in the applicability of neural networks in disparate domains. In this paper, we describe the use of multi-layered perceptrons, a type of neural network topology, for financial classification problems, with promising results. However, backpropagation, the learning algorithm most often used in multi-layered perceptrons, is inherently an inefficient se...
Seismic facies analysis (SFA) aims to classify similar seismic traces based on amplitude, phase, frequency, and other seismic attributes. SFA has proven useful in interpreting seismic data, allowing significant information on subsurface geological structures to be extracted. While facies analysis has been widely investigated through unsupervised-classification-based studies, there are few cases ass...
We investigate the learnability, under the uniform distribution, of deterministic and probabilistic neural concepts that can be represented as simple combinations of nonoverlapping perceptrons with binary weights. Two perceptrons are said to be nonoverlapping if they do not share any input variables. In the deterministic case, we investigate, within the distribution-specific PAC model, the lear...
We consider two-layered perceptrons consisting of N binary input units, K binary hidden units and one binary output unit, in the limit N ≫ K ≥ 1. We prove that the weights of a regular irreducible network are uniquely determined by its input-output map up to some obvious global symmetries. A network is regular if its K weight vectors from the input layer to the K hidden units are linearly...
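The architecture described in this last abstract can be sketched concretely: N binary inputs feed K sign-threshold hidden units, whose outputs are combined into one binary output. The specific weights and the majority-vote (committee) output rule below are illustrative assumptions:

```python
import numpy as np

def two_layer_perceptron(x, W):
    """x: length-N vector of +/-1 inputs; W: K x N first-layer weight matrix.
    Each hidden unit is a sign threshold; the output is their majority vote."""
    hidden = np.sign(W @ x)            # K binary hidden units
    return int(np.sign(hidden.sum()))  # one binary output unit

# K = 3 hidden units over N = 3 binary inputs (weights chosen for illustration)
W = np.array([[ 1.0,  1.0, -1.0],
              [-1.0,  1.0,  1.0],
              [ 1.0, -1.0,  1.0]])
x = np.array([1.0, 1.0, 1.0])
out = two_layer_perceptron(x, W)
```

The uniqueness result in the abstract says that, for a regular irreducible network of this kind, the rows of W are pinned down by the input-output map alone, up to symmetries such as permuting hidden units.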