Search results for: boosting

Number of results: 14818

2009
Mohamad Adnan Al-Alaoui

The relation of the Al-Alaoui pattern recognition algorithm to the boosting and bagging approaches to pattern recognition is delineated. It is shown that the Al-Alaoui algorithm shares with bagging and boosting the concepts of replicating and weighting instances of the training set. Additionally, it is shown that the Al-Alaoui algorithm provides a Mean Square Error (MSE) asymptotic Bayesian appr...

Journal: Research Journal of Applied Sciences, Engineering and Technology, 2013
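
As a rough illustration of the two ideas the abstract above says the Al-Alaoui algorithm shares with bagging and boosting, replicating training instances and weighting them, the Python sketch below shows both in their generic form. It is not an implementation of the Al-Alaoui algorithm; the toy data and the stand-in weak classifier are assumptions made only for the example.

```python
# Generic illustration of the two shared concepts: replicating instances
# (bootstrap resampling, as in bagging) and weighting instances (boosting-style
# reweighting). Not the Al-Alaoui algorithm itself.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X[:, 0] + 0.1 * rng.normal(size=100) > 0).astype(int)

# Bagging: replicate instances by sampling with replacement.
bootstrap_idx = rng.integers(0, len(X), size=len(X))
X_bag, y_bag = X[bootstrap_idx], y[bootstrap_idx]

# Boosting: keep all instances but maintain a per-instance weight,
# increasing the weight of examples the current model gets wrong.
weights = np.full(len(X), 1.0 / len(X))
predictions = (X[:, 0] > 0).astype(int)            # stand-in weak classifier
misclassified = predictions != y
err = np.sum(weights[misclassified])
alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
weights *= np.exp(alpha * np.where(misclassified, 1.0, -1.0))
weights /= weights.sum()                            # renormalize

print("bootstrap sample size:", len(X_bag), "| max instance weight:", round(weights.max(), 4))
```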

2015
Alina Beygelzimer, Elad Hazan, Satyen Kale, Haipeng Luo

We extend the theory of boosting for regression problems to the online learning setting. Generalizing from the batch setting for boosting, the notion of a weak learning algorithm is modeled as an online learning algorithm with linear loss functions that competes with a base class of regression functions, while a strong learning algorithm is an online learning algorithm with smooth convex loss f...
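
One way to picture this setting is the sketch below: each weak learner is itself an online algorithm (here a simple least-mean-squares updater, chosen only for illustration), and the strong learner combines them additively, feeding each weak learner the residual left by the partial sum. This is a generic online residual-fitting loop under squared loss, not the authors' algorithm or its guarantees.

```python
# A generic online, boosting-style residual-fitting loop for regression.
# Each "weak learner" is itself an online algorithm (one LMS gradient step
# per example); the strong prediction is a scaled additive combination.
import numpy as np

class OnlineLMS:
    """Weak online regressor: one gradient step on (w.x - target)^2 per example."""
    def __init__(self, dim, lr=0.05):
        self.w = np.zeros(dim)
        self.lr = lr

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, target):
        self.w -= self.lr * (self.w @ x - target) * x

rng = np.random.default_rng(1)
dim, n_learners, eta = 5, 10, 0.3
learners = [OnlineLMS(dim) for _ in range(n_learners)]
true_w = rng.normal(size=dim)

for t in range(2001):
    x = rng.normal(size=dim)
    y = true_w @ x + 0.1 * rng.normal()

    pred, residual = 0.0, y
    for learner in learners:
        h = learner.predict(x)
        learner.update(x, residual)   # each weak learner chases the current residual
        pred += eta * h               # strong prediction built stagewise
        residual = y - pred

    if t % 500 == 0:
        print(f"round {t}: squared error {(y - pred) ** 2:.4f}")
```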

2007
Xiaofeng Yu

We propose a high-performance cascaded hybrid model for Chinese NER. Firstly, we use Boosting, a standard and theoretically well-founded machine learning method, to combine a set of weak classifiers into a base system. Secondly, we introduce various types of heuristic human knowledge into Markov Logic Networks (MLNs), an effective combination of first-order logic and probabilistic graphi...

2001
Wenxin Jiang

This is a survey of some theoretical results on boosting obtained from an analogous treatment of some regression and classification boosting algorithms. Some related papers include [J99] and [J00a,b,c,d], a set of (mutually overlapping) papers concerning the assumption of weak hypotheses, the behavior of generalization error in the large time limit and during the process of boosting, compar...

2005
Aloísio Carlos de Pina, Gerson Zaverucha

Boosting is one of the most popular methods for constructing ensembles. The objective of this work is to present a boosting algorithm for regression based on the Regressor-Boosting algorithm, in which we propose the use of REC curves to select a good threshold value, so that only residuals greater than that value are counted as errors. The algorithm was empirically evaluated and its...
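
To make the REC-curve idea concrete, the sketch below builds the curve (the fraction of examples whose absolute residual falls within a tolerance, as a function of that tolerance) and reads a threshold off it. The 90%-coverage rule used to pick the threshold and the synthetic predictions are assumptions for illustration, not necessarily the selection criterion evaluated in the paper.

```python
# Build a Regression Error Characteristic (REC) curve and read a residual
# threshold off it; residuals above the threshold are then counted as errors.
import numpy as np

rng = np.random.default_rng(2)
y_true = rng.normal(size=500)
y_pred = y_true + rng.normal(scale=0.3, size=500)   # stand-in model output

abs_residuals = np.sort(np.abs(y_true - y_pred))
# REC curve: x = error tolerance, y = fraction of examples within that tolerance.
tolerances = abs_residuals
accuracy = np.arange(1, len(abs_residuals) + 1) / len(abs_residuals)

# Illustrative rule: smallest tolerance whose coverage reaches 90%.
threshold = tolerances[np.searchsorted(accuracy, 0.9)]
errors = np.abs(y_true - y_pred) > threshold        # only large residuals count as errors
print(f"threshold = {threshold:.3f}; flagged as errors: {errors.sum()} of {len(y_true)}")
```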

2017
A. Mayr

Objectives: Component-wise boosting algorithms have evolved into a popular estimation scheme in biomedical regression settings. The iteration number of these algorithms is the most important tuning parameter to optimize their performance. To date, no fully automated strategy for determining the optimal stopping iteration of boosting algorithms has been proposed. Methods: We propose a fully data...
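
A minimal sketch of component-wise (L2) boosting follows, with the stopping iteration chosen by held-out validation error. The holdout rule and the simulated data are stand-ins for illustration, not the fully automated, data-driven criterion the paper proposes.

```python
# Component-wise L2 boosting: at each iteration, fit a univariate least-squares
# base learner to the residuals for every covariate, update only the best one.
# The stopping iteration is picked here by validation error (illustrative rule).
import numpy as np

rng = np.random.default_rng(3)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, 0.5]
y = X @ beta_true + rng.normal(scale=0.5, size=n)

X_tr, X_val, y_tr, y_val = X[:150], X[150:], y[:150], y[150:]
nu, max_iter = 0.1, 500
beta = np.zeros(p)
best_iter, best_val = 0, np.inf

for m in range(1, max_iter + 1):
    resid = y_tr - X_tr @ beta
    # Univariate least-squares coefficient of each component on the residuals.
    coefs = (X_tr * resid[:, None]).sum(axis=0) / (X_tr ** 2).sum(axis=0)
    rss = ((resid[:, None] - X_tr * coefs) ** 2).sum(axis=0)
    j = int(np.argmin(rss))            # component reducing the RSS the most
    beta[j] += nu * coefs[j]           # small step on that component only

    val_err = np.mean((y_val - X_val @ beta) ** 2)
    if val_err < best_val:
        best_val, best_iter = val_err, m

print(f"selected stopping iteration: {best_iter} (validation MSE {best_val:.3f})")
```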

2010
Ashok Venkatesan, Narayanan C. Krishnan, Sethuraman Panchanathan

Concept drift is a phenomenon typically experienced when data distributions change continuously over a period of time. In this paper we propose a cost-sensitive boosting approach for learning under concept drift. The proposed methodology estimates relevance costs of ‘old’ data samples w.r.t. ‘newer’ samples and integrates them into the boosting process. We evaluate this methodology on usenet...
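
The sketch below shows one generic way such relevance costs can enter a boosting procedure, namely through per-example sample weights. The particular cost estimate (agreement of old labels with a tree fit on the newest window), the synthetic drift, and the use of scikit-learn's AdaBoostClassifier are assumptions for illustration, not the estimator proposed in the paper.

```python
# Fold per-example "relevance costs" for drifting data into boosting via
# initial sample weights (illustrative cost estimate, not the paper's method).
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(4)
# "Old" data drawn from one concept, "new" data from a shifted concept.
X_old = rng.normal(size=(300, 2))
y_old = (X_old[:, 0] > 0).astype(int)
X_new = rng.normal(size=(100, 2))
y_new = (X_new[:, 0] + X_new[:, 1] > 0).astype(int)

# Estimate relevance of each old example w.r.t. the new concept.
recent_model = DecisionTreeClassifier(max_depth=3).fit(X_new, y_new)
agreement = (recent_model.predict(X_old) == y_old).astype(float)
costs_old = 0.2 + 0.8 * agreement      # old points consistent with the new concept weigh more
costs_new = np.ones(len(X_new))

X_all = np.vstack([X_old, X_new])
y_all = np.concatenate([y_old, y_new])
weights = np.concatenate([costs_old, costs_new])
weights /= weights.sum()

# The costs enter boosting through the initial sample weights.
booster = AdaBoostClassifier(n_estimators=50, random_state=0)
booster.fit(X_all, y_all, sample_weight=weights)
print("accuracy on new concept:", round(booster.score(X_new, y_new), 3))
```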

2006
Osamu Watanabe

We discuss algorithmic aspects of boosting techniques, such as Majority Vote Boosting [Fre95], AdaBoost [FS97], and MadaBoost [DW00a]. Considering a situation where we are given a huge number of examples and asked to find some rule explaining these example data, we show some reasonable algorithmic approaches for dealing with such a huge dataset using boosting techniques. Through this example, ...
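
For concreteness, a compact from-scratch sketch of the standard AdaBoost reweighting scheme [FS97] with decision stumps is given below; the toy data and the stump learner are assumptions for the example, and MadaBoost's modified weighting and the paper's large-dataset techniques are not shown.

```python
# Standard AdaBoost with decision stumps, labels in {-1, +1}.
import numpy as np

def fit_stump(X, y, w):
    """Pick the (feature, threshold, sign) stump with lowest weighted error."""
    best = (np.inf, None)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, (j, thr, sign))
    return best

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        err, (j, thr, sign) = fit_stump(X, y, w)
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)      # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def predict(ensemble, X):
    score = sum(a * s * np.where(X[:, j] <= t, 1, -1) for a, j, t, s in ensemble)
    return np.sign(score)

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)  # toy data for the example
model = adaboost(X, y)
print("training accuracy:", np.mean(predict(model, X) == y))
```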

2005
Alexandru Niculescu-Mizil, Rich Caruana

Boosted decision trees typically yield good accuracy, precision, and ROC area. However, because the outputs from boosting are not well-calibrated posterior probabilities, boosting yields poor squared error and cross-entropy. We empirically demonstrate why AdaBoost predicts distorted probabilities and examine three calibration methods for correcting this distortion: Platt Scaling, Isotonic Regre...
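
A sketch of Platt Scaling applied to boosted scores follows: a one-dimensional logistic regression is fit on held-out decision scores to map them to calibrated probabilities. The dataset, the split, and the scikit-learn estimators are assumptions for illustration; Isotonic Regression calibration would replace the logistic fit with an isotonic one (e.g., sklearn.isotonic.IsotonicRegression or CalibratedClassifierCV).

```python
# Platt Scaling of boosted-ensemble scores: fit a logistic regression on the
# booster's 1-D decision scores over held-out data to get calibrated probabilities.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_cal, y_train, y_cal = train_test_split(X, y, test_size=0.3, random_state=0)

booster = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Platt scaling: logistic regression on the decision scores.
scores_cal = booster.decision_function(X_cal).reshape(-1, 1)
platt = LogisticRegression().fit(scores_cal, y_cal)

# Compare raw vs. calibrated probabilities on a few held-out points.
print("raw probabilities: ", booster.predict_proba(X_cal[:5])[:, 1].round(3))
print("Platt-scaled:      ", platt.predict_proba(scores_cal[:5])[:, 1].round(3))
```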
