Search results for: decision neural network training

Number of results: 1,402,523

2007
Nagesh Kadaba Kendall E. Nygard Paul L. Juell Lars Kangas

A significant problem associated with the application of the back-propagation learning paradigm for pattern classification is the lack of high accuracy in generalization when the domain is large. In this paper we describe a multiple neural network system, which uses two self-organizing neural networks that work as teaching data filters (feature extractors), producing information that is used to tra...
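
The system described here (two self-organizing networks acting as feature extractors for a back-propagation classifier) can be sketched roughly as follows. This is an illustrative reconstruction only: the numpy SOM, the scikit-learn MLPClassifier, the distance-based encoding, and the synthetic data are all assumptions, not details taken from the paper.

```python
# Sketch: two self-organizing maps (SOMs) act as feature extractors whose
# outputs train a back-propagation classifier. Component choices are illustrative.
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_som(data, grid=8, iters=2000, lr=0.5, sigma=2.0, seed=0):
    """Train a small 2-D SOM; returns prototype vectors of shape (grid*grid, n_features)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=(grid * grid, data.shape[1]))
    coords = np.array([(i, j) for i in range(grid) for j in range(grid)], dtype=float)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))        # best matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)     # squared grid distance to the BMU
        h = np.exp(-d2 / (2 * sigma ** 2)) * lr * (1 - t / iters)
        w += h[:, None] * (x - w)                          # pull the neighbourhood toward x
    return w

def som_features(prototypes, data):
    """Encode each sample as its distances to all SOM prototypes."""
    return np.linalg.norm(data[:, None, :] - prototypes[None, :, :], axis=2)

# Two SOMs trained on different views of the input (here: two halves of the
# feature vector) produce the "teaching data" for the back-propagation classifier.
X = np.random.rand(500, 16)
y = (X[:, :8].sum(axis=1) > X[:, 8:].sum(axis=1)).astype(int)
som_a = train_som(X[:, :8])
som_b = train_som(X[:, 8:])
features = np.hstack([som_features(som_a, X[:, :8]), som_features(som_b, X[:, 8:])])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(features, y)
print("training accuracy:", clf.score(features, y))
```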

Journal: IEEE Transactions on Neural Networks, 2000
Anastasios D. Doulamis Nikolaos D. Doulamis Stefanos D. Kollias

A novel approach is presented in this paper for improving the performance of neural-network classifiers in image recognition, segmentation, or coding applications, based on a retraining procedure at the user level. The procedure includes: 1) a training algorithm for adapting the network weights to the current condition; 2) a maximum a posteriori (MAP) estimation procedure for optimally selectin...

2003
Nikolaos Doulamis Anastasios Doulamis Klimis Ntalianis Stefanos Kollias

A novel approach is presented in this paper for improving the performance of neural network classifiers in image recognition, segmentation or coding applications, based on a retraining procedure at the user level. The procedure includes (a) a training algorithm for adapting the network weights to the current condition, (b) a maximum a posteriori (MAP) estimation procedure for optimally selectin...
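
Both retraining abstracts above adapt a trained network's weights to the current (user-level) conditions while keeping the solution close to the previously trained network. Below is a minimal sketch of that idea, with a quadratic stay-near-the-old-weights penalty standing in for the papers' MAP formulation; the PyTorch model, penalty weight, and toy data are illustrative assumptions.

```python
# Sketch: fine-tune on a few "current condition" samples while penalising
# drift from the previously trained weights (a MAP-style stand-in).
import torch
import torch.nn as nn

def retrain(model, x_new, y_new, lam=1.0, lr=1e-3, steps=200):
    old = {name: p.detach().clone() for name, p in model.named_parameters()}
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(x_new), y_new)
        # Quadratic prior toward the previously trained weights
        loss = loss + lam * sum(((p - old[name]) ** 2).sum()
                                for name, p in model.named_parameters())
        loss.backward()
        opt.step()
    return model

# Usage with a toy classifier and a handful of new samples.
net = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 3))
x_new = torch.randn(20, 8)
y_new = torch.randint(0, 3, (20,))
retrain(net, x_new, y_new)
```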

Journal: Neurocomputing, 2016
Mehdi Sajjadi Mojtaba Seyedhosseini Tolga Tasdizen

Artificial neural networks are powerful pattern classifiers. They form the basis of the highly successful and popular convolutional networks, which offer state-of-the-art performance on several computer vision tasks. However, in many general and non-vision tasks, neural networks are surpassed by methods such as support vector machines and random forests that are also easier to use and faste...

1998
Chengan Guo Anthony Kuh

This paper proposes a novel neural-network method for sequential detection. We first examine the optimal parametric sequential probability ratio test (SPRT) and make a simple equivalent transformation of the SPRT that makes it suitable for neural-network architectures. We then discuss how neural networks can learn the SPRT decision functions from observation data and labels. Conventional superv...
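
For context, the classical SPRT that the proposed network is trained to emulate accumulates the log-likelihood ratio of each new observation and stops as soon as it crosses an upper or lower threshold. Below is a small sketch under assumed Gaussian hypotheses; the distributions, error rates, and thresholds are illustrative, and in the paper a neural network learns the decision functions from labelled observations rather than from known likelihoods.

```python
# Sketch of the classical sequential probability ratio test (SPRT).
import numpy as np
from scipy.stats import norm

def sprt(observations, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    upper = np.log((1 - beta) / alpha)   # accept H1 above this threshold
    lower = np.log(beta / (1 - alpha))   # accept H0 below this threshold
    llr = 0.0
    for t, x in enumerate(observations, start=1):
        llr += norm.logpdf(x, mu1, sigma) - norm.logpdf(x, mu0, sigma)
        if llr >= upper:
            return "H1", t
        if llr <= lower:
            return "H0", t
    return "undecided", len(observations)

samples = np.random.normal(1.0, 1.0, size=50)   # data actually drawn under H1
print(sprt(samples))
```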

2008
J. Bastos Y. Liu

This paper compares the performance of artificial neural networks and boosted decision trees, with and without cascade training, for tagging b-jets in a collider experiment. It is shown, using a Monte Carlo simulation of WH → lνqq̄ events, that boosted decision trees outperform artificial neural networks. Furthermore, cascade training can substantially improve the performance of both boosted dec...
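
A rough sketch of the kind of comparison described, using scikit-learn's gradient-boosted trees and an MLP on synthetic data, plus a naive two-stage "cascade" in which the first stage's score is appended to the inputs of the second. The libraries, data, cascade construction, and hyper-parameters are assumptions for illustration, not the paper's setup.

```python
# Sketch: boosted decision trees vs. a neural network, plus a simple cascade.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=4000, n_features=10, n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

bdt = GradientBoostingClassifier(n_estimators=200).fit(X_tr, y_tr)
net = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=1000).fit(X_tr, y_tr)
print("BDT AUC:", roc_auc_score(y_te, bdt.predict_proba(X_te)[:, 1]))
print("NN  AUC:", roc_auc_score(y_te, net.predict_proba(X_te)[:, 1]))

# Two-stage cascade: augment the inputs with the first stage's score.
stage1 = GradientBoostingClassifier(n_estimators=100).fit(X_tr, y_tr)
X_tr2 = np.hstack([X_tr, stage1.predict_proba(X_tr)[:, 1:]])
X_te2 = np.hstack([X_te, stage1.predict_proba(X_te)[:, 1:]])
cascade = GradientBoostingClassifier(n_estimators=100).fit(X_tr2, y_tr)
print("Cascade AUC:", roc_auc_score(y_te, cascade.predict_proba(X_te2)[:, 1]))
```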

1993
Sam Waugh Anthony Adams

A number of different data sets are used to compare a variety of neural network training algorithms: backpropagation, quickprop, committees of backpropagation-style networks, and Cascade Correlation. The results are further compared with a decision tree technique, C4.5, to assess which types of problems are better suited to the different classes of inductive learning algorithms.

1997
Wolfgang Utschick Josef A. Nossek

An extension of a feedforward neural network is presented. Although it utilizes linear threshold functions and a Boolean function in the second layer, signal processing within the neural network is real-valued. After mapping input vectors onto a discretization of the input space, real-valued features of the internal representation of patterns are extracted. A vector quantizer assigns a class hypothesis to...
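
One way to picture the ingredients named here: linear threshold units discretise the input space into binary codes, their signed margins provide real-valued internal features, and a nearest-centroid vector quantizer assigns the class hypothesis. The sketch below is a loose illustration under those assumptions (random hyperplanes, toy data, one centroid per class), not the authors' construction.

```python
# Sketch: threshold-unit discretisation plus a nearest-centroid vector quantizer.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # toy two-class problem

W = rng.normal(size=(2, 8))                      # 8 random linear threshold units
b = rng.normal(size=8)
margins = X @ W + b                              # real-valued internal features
codes = (margins > 0).astype(int)                # Boolean layer: discretised input space
print("occupied cells of the discretisation:", len(np.unique(codes, axis=0)))

# Vector quantizer: one centroid per class in the real-valued feature space.
centroids = np.stack([margins[y == c].mean(axis=0) for c in (0, 1)])
dist = ((margins[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
pred = dist.argmin(axis=1)
print("training accuracy:", (pred == y).mean())
```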

2010
Yao-Jen Chang Chia-Lu Ho Guo-Tong Fang

This paper proposes a novel adaptive decision feedback equalizer (DFE) based on a self-constructing recurrent fuzzy neural network (SRFNN) for quadrature amplitude modulation systems. Without prior knowledge of the channel characteristics, a novel training scheme containing both self-constructing learning and back-propagation algorithms is derived for the SRFNN. The proposed DFE is compared with s...
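
The decision-feedback-equalizer structure underlying this work filters received samples with feedforward taps, filters past symbol decisions with feedback taps, and adapts both from an error signal. The sketch below uses a plain LMS update and QPSK symbols as stand-ins for the paper's self-constructing recurrent fuzzy network and QAM setting; the channel, noise level, and step size are illustrative.

```python
# Sketch: adaptive decision feedback equalizer with an LMS update (training mode).
import numpy as np

rng = np.random.default_rng(1)
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=5000)   # QPSK symbols
channel = np.array([1.0, 0.4 + 0.2j])                                 # simple ISI channel
received = np.convolve(symbols, channel)[:len(symbols)]
received += 0.05 * (rng.normal(size=len(symbols)) + 1j * rng.normal(size=len(symbols)))

n_ff, n_fb, mu = 4, 2, 0.01
w_ff = np.zeros(n_ff, complex)
w_fb = np.zeros(n_fb, complex)
past_decisions = np.zeros(n_fb, complex)
errors = 0
for k in range(n_ff, len(symbols)):
    x = received[k - n_ff + 1:k + 1][::-1]             # most recent samples first
    y = w_ff @ x - w_fb @ past_decisions               # equalizer output
    decision = np.sign(y.real) + 1j * np.sign(y.imag)  # nearest QPSK symbol
    err = symbols[k] - y                                # training mode: known symbol
    w_ff += mu * err * np.conj(x)                       # LMS updates
    w_fb -= mu * err * np.conj(past_decisions)
    past_decisions = np.roll(past_decisions, 1)
    past_decisions[0] = decision
    errors += decision != symbols[k]
print("symbol errors:", errors, "of", len(symbols) - n_ff)
```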

Journal: Neural Networks, 1994
Yan Qiu Chen David W. Thomas Mark S. Nixon

This paper proposes a novel generating-shrinking algorithm, which builds and then shrinks a three-layer feed-forward neural network to achieve arbitrary classification in n-dimensional Euclidean space. The algorithm offers guaranteed convergence to a 100% correct classification rate on training patterns. Decision regions resulting from the algorithm are analytically described, so the generalisa...
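
In the same generate-then-shrink spirit (though not the paper's actual algorithm), the sketch below adds a prototype unit for each still-misclassified training pattern until the training set is classified perfectly, then prunes any unit whose removal keeps training accuracy at 100%. The Gaussian prototype units and toy data are assumptions for illustration.

```python
# Sketch: generate hidden units until 100% training accuracy, then shrink.
import numpy as np

def predict(X, centres, labels, width=1.0):
    if not centres:
        return np.full(len(X), -1)
    act = np.exp(-np.linalg.norm(X[:, None] - np.array(centres)[None], axis=2) ** 2 / width)
    return np.array(labels)[act.argmax(axis=1)]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)

# Generating phase: add a unit at each still-misclassified pattern.
centres, labels = [], []
while True:
    wrong = np.flatnonzero(predict(X, centres, labels) != y)
    if len(wrong) == 0:
        break
    centres.append(X[wrong[0]])
    labels.append(y[wrong[0]])

# Shrinking phase: drop any unit not needed for 100% training accuracy.
i = 0
while i < len(centres):
    trial_c = centres[:i] + centres[i + 1:]
    trial_l = labels[:i] + labels[i + 1:]
    if trial_c and (predict(X, trial_c, trial_l) == y).all():
        centres, labels = trial_c, trial_l
    else:
        i += 1
print("hidden units after shrinking:", len(centres))
```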
