Search results for: layer perceptron network
Number of results: 919,520
In this paper, genetic programming is applied to improve the quality of a noisy speech signal. A system combining spectral subtraction and genetic programming is therefore implemented for speech enhancement. In the proposed method, noise is first reduced by spectral subtraction. In the next step, genetic programming trees are trained to further enhance the noisy signal by mapping the sign...
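The first stage this abstract describes can be illustrated on its own; below is a minimal magnitude spectral-subtraction sketch in NumPy (the frame length and the use of the noise signal's own first frame as the noise estimate are illustrative assumptions, and the genetic-programming post-processing stage is omitted):

```python
import numpy as np

def spectral_subtraction(noisy, noise_est, frame_len=256):
    """Frame-wise magnitude spectral subtraction: subtract an estimated
    noise magnitude spectrum from each frame, keeping the noisy phase."""
    n_frames = len(noisy) // frame_len
    # Noise magnitude estimate from one frame of the noise reference.
    noise_mag = np.abs(np.fft.rfft(noise_est[:frame_len]))
    out = np.zeros(n_frames * frame_len)
    for i in range(n_frames):
        frame = noisy[i * frame_len:(i + 1) * frame_len]
        spec = np.fft.rfft(frame)
        # Subtract the noise magnitude, flooring at zero to avoid negative magnitudes.
        mag = np.maximum(np.abs(spec) - noise_mag, 0.0)
        out[i * frame_len:(i + 1) * frame_len] = np.fft.irfft(
            mag * np.exp(1j * np.angle(spec)), n=frame_len)
    return out
```

Real systems typically add overlapping windows and an over-subtraction factor; this sketch shows only the core magnitude-domain step.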
Multilayer perceptrons with hard-limiting (signum) activation functions can form complex decision regions. It is well known that a three-layer perceptron (two hidden layers) can form arbitrary disjoint decision regions and a two-layer perceptron (one hidden layer) can form single convex decision regions. This paper further proves that single hidden layer feedforward neural networks (SLFN's) wit...
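The convex-region claim for a one-hidden-layer signum network can be made concrete: each hidden unit fires on one side of a hyperplane, and an output unit that ANDs them selects the intersection of half-planes, which is convex. A small sketch (the triangular region x > 0, y > 0, x + y < 1 is an illustrative choice):

```python
import numpy as np

# Hidden layer: three signum units, one per bounding half-plane of the triangle.
W = np.array([[1.0, 0.0],      # fires when x > 0
              [0.0, 1.0],      # fires when y > 0
              [-1.0, -1.0]]).T # fires when x + y < 1
b = np.array([0.0, 0.0, 1.0])

def two_layer_perceptron(X):
    """Output unit fires (+1) only when all three hidden units fire,
    i.e. inside the intersection of the half-planes: a convex triangle."""
    h = np.sign(X @ W + b)               # hidden signum activations in {-1, +1}
    return np.sign(h.sum(axis=1) - 2.5)  # AND of three units via thresholding
```

Forming disjoint or non-convex regions requires the extra hidden layer the abstract mentions, which can OR several such convex pieces together.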
Since the seminal works of Engle [7] and Bollerslev [3] about heteroskedastic return series models, many extensions of their (G)ARCH models have been proposed in the literature. In particular, the functional dependence of conditional variances and the shape of the conditional distribution of returns have been varied in several ways (see [1] and [5] for an extensive overview). These two issues h...
As Wireless Sensor Networks are penetrating into the industrial domain, many research opportunities are emerging. One such essential and challenging application is that of node localization. A feed-forward neural network based methodology is adopted in this paper. The Received Signal Strength Indicator (RSSI) values of the anchor node beacons are used. The number of anchor nodes and their confi...
In the eighties, the lack of an efficient algorithm to train multilayer Rosenblatt perceptrons was resolved by sigmoidal neural networks and backpropagation. But should we still try to find an efficient algorithm to train multilayer hard-limit neural networks, a task known to be NP-Complete? In this work we show that this would not be a waste of time, by means of a counter ex...
The Perceptron is an adaptive linear combiner that has its output quantized to one of two possible discrete values, and it is the basic component of multilayer, feedforward neural networks. The least-mean-square (LMS) adaptive algorithm adjusts the internal weights to train the network to perform some desired function, such as pattern recognition. In this paper, we present an analysis of the...
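As a rough sketch of the scheme this abstract describes (the learning rate, epoch count, and toy data below are illustrative assumptions), the LMS (Widrow-Hoff) rule adjusts the weights against the error on the linear-combiner output, while the signum quantizer produces the final decision:

```python
import numpy as np

def lms_train_perceptron(X, y, lr=0.01, epochs=50):
    """Train a single perceptron with the LMS rule.
    The error is measured on the linear output *before* the
    signum quantizer; targets y are in {-1, +1}."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a constant bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for x, t in zip(Xb, y):
            err = t - x @ w   # error on the linear combiner output
            w += lr * err * x # LMS (Widrow-Hoff) weight update
    return w

def predict(w, X):
    """Quantize the linear output to one of two discrete values."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.sign(Xb @ w)
```

Because the error is taken before quantization, LMS drives the weights toward the least-squares solution rather than merely correcting misclassified points as Rosenblatt's rule does.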
This work presents a constructive method to train the multilayer perceptron layer by layer, successively, and to realize the kernel used in the support vector machine. Data in different classes are trained to map to distant points in each layer, which eases the mapping of the next layer. A perfect mapping kernel can be accomplished successively. Those distant mapped points can be dis...
Heat transfer fluids have inherently low thermal conductivity that greatly limits the heat exchange efficiency. While the effectiveness of extending surfaces and redesigning heat exchange equipments to increase the heat transfer rate has reached a limit, many research activities have been carried out attempting to improve the thermal transport properties of the fluids by adding more thermally c...
In this article an attempt is made to study the applicability of a general-purpose, supervised feed-forward neural network with one hidden layer, namely the Radial Basis Function (RBF) neural network. It uses a relatively small number of locally tuned units and is adaptive in nature. RBFs are suitable for pattern recognition and classification. Performance of the RBF neural network was also compar...
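A minimal sketch of the RBF idea (the Gaussian width, placing centers at the training points, and the XOR toy task in the usage below are illustrative assumptions): locally tuned Gaussian units form the single hidden layer, and only the linear output weights are fit, here by least squares:

```python
import numpy as np

def rbf_design(X, centers, gamma=1.0):
    """Gaussian RBF hidden-layer activations: one locally tuned unit
    per center, responding strongly only near that center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def rbf_fit(X, y, centers, gamma=1.0):
    """Fit the linear output weights by least squares on the RBF features."""
    H = rbf_design(X, centers, gamma)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w

def rbf_predict(X, centers, w, gamma=1.0):
    return rbf_design(X, centers, gamma) @ w
```

With centers at the training points the Gaussian design matrix is positive definite, so the fit interpolates exactly; practical RBF networks use fewer centers (e.g. chosen by clustering) to generalize.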