Search results for: adaboost learning

Number of results: 601,957

2004
Balázs Kégl, Ligen Wang

In this paper we propose to combine two powerful ideas, boosting and manifold learning. On the one hand, we improve ADABOOST by incorporating knowledge of the structure of the data into base classifier design and selection. On the other hand, we use ADABOOST’s efficient learning mechanism to significantly improve supervised and semi-supervised algorithms proposed in the context of manifold lear...
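The "efficient learning mechanism" this abstract refers to is AdaBoost's sample-reweighting loop. Below is a minimal sketch of generic AdaBoost with decision stumps — an illustrative toy, not the authors' manifold-aware variant; all function names are ours:

```python
import numpy as np

def adaboost(X, y, rounds=20):
    """Minimal AdaBoost with decision stumps; labels y must be in {-1, +1}.

    Sketch of the classic reweighting mechanism: misclassified samples
    gain weight, so later stumps focus on the hard examples.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)                  # uniform initial weights
    stumps, alphas = [], []
    for _ in range(rounds):
        best = None
        # Exhaustively pick the stump (feature, threshold, sign)
        # with the smallest weighted error.
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = max(err, 1e-12)                # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this stump
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)       # up-weight the mistakes
        w /= w.sum()
        stumps.append((j, thr, sign))
        alphas.append(alpha)

    def predict(Xq):
        # Final classifier: sign of the alpha-weighted vote of all stumps.
        agg = sum(a * s * np.where(Xq[:, j] <= t, 1, -1)
                  for a, (j, t, s) in zip(alphas, stumps))
        return np.sign(agg)
    return predict
```

A manifold-aware variant in the spirit of the abstract would constrain the base-classifier search above to respect the data's manifold structure, rather than scanning all stumps.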

2011
Oscar Amoros, Sergio Escalera, Anna Puig

In this paper, we propose a two-stage labeling method for large biomedical datasets through a parallel approach on a single GPU. Diagnostic methods, structure volume measurements, and visualization systems are of major importance for surgery planning, intra-operative imaging, and image-guided surgery. In all cases, to provide an automatic and interactive method to label or to tag different struc...

2003
Zafer Barutçuoglu

Combining machine learning models is a means of improving overall accuracy. Various algorithms have been proposed to create aggregate models from other models, and two popular examples for classification are Bagging and AdaBoost. In this paper we examine their adaptation to regression, and benchmark them on synthetic and real-world data. Our experiments reveal that different types of AdaBoost a...
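The key adaptation when moving Bagging from classification to regression is that the ensemble aggregates by averaging rather than voting. A minimal sketch, assuming a trivial linear base learner (the paper benchmarks stronger learners; the function name is ours):

```python
import numpy as np

def bagging_regressor(X, y, n_models=25, seed=0):
    """Bagging for regression: average base models fit on bootstrap resamples.

    Uses a degree-1 polynomial fit as a stand-in base learner, purely
    for illustration of the resample-then-average structure.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)         # bootstrap sample (with replacement)
        coef = np.polyfit(X[idx], y[idx], deg=1)
        models.append(coef)

    def predict(Xq):
        # Regression ensembles aggregate by averaging, not by voting.
        return np.mean([np.polyval(c, Xq) for c in models], axis=0)
    return predict
```

Adapting AdaBoost to regression is less direct, since its reweighting rule is defined in terms of classification error; variants differ mainly in how they map residuals to sample weights.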

Journal: :IJPRAI 2006
Yijun Sun, Sinisa Todorovic, Jian Li

AdaBoost rarely suffers from overfitting in low-noise settings. However, recent studies with highly noisy patterns have clearly shown that overfitting can occur. A natural strategy to alleviate the problem is to penalize the skewness of the data distribution during learning, to prevent the hardest examples from spoiling the decision boundary. In this paper, we pursue such a penalty...

2010
Pannagadatta K. Shivaswamy, Tony Jebara

Concentration inequalities that incorporate variance information (such as Bernstein’s or Bennett’s inequality) are often significantly tighter than counterparts that disregard variance (such as Hoeffding’s inequality). Nevertheless, many state-of-the-art machine learning algorithms for classification problems, such as AdaBoost and support vector machines (SVMs), extensively use Hoeffding’s inequalit...
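To make the comparison concrete, here are the two tail bounds in their standard one-sided forms for i.i.d. $X_1,\dots,X_n \in [0,1]$ with mean $\mu$ and variance $\sigma^2$ (textbook statements, not necessarily the exact forms used in the paper):

```latex
% Hoeffding: ignores the variance entirely
\Pr\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i - \mu \ge t\right]
  \le \exp\!\left(-2nt^2\right)

% Bernstein: exploits the variance \sigma^2
\Pr\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i - \mu \ge t\right]
  \le \exp\!\left(-\frac{nt^2}{2\sigma^2 + \tfrac{2}{3}t}\right)
```

When $\sigma^2$ is small relative to $t$, the Bernstein exponent is much larger in magnitude than Hoeffding’s, which is the tightness gap the abstract alludes to.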

Journal: :Image Vision Comput. 2014
Jingsong Xu, Qiang Wu, Jian Zhang, Zhenmin Tang

Recently, Universum data, which does not belong to any class of the training data, has been used to train better classifiers. In this paper, we introduce a novel boosting algorithm called UadaBoost that can improve the classification performance of AdaBoost with Universum data. UadaBoost chooses a function by minimizing the loss over labeled data and Universum data. The cost function is minim...

Journal: :Pattern Recognition Letters 2008
Chun-Xia Zhang, Jiang-She Zhang

This paper presents a novel ensemble classifier generation technique, RotBoost, which is constructed by combining Rotation Forest and AdaBoost. Experiments conducted on 36 real-world data sets from the UCI repository, with a classification tree adopted as the base learning algorithm, demonstrate that RotBoost can generate ensemble classifiers with significantly lower pr...

Journal: :CoRR 2014
Shai Shalev-Shwartz

We describe and analyze a new boosting algorithm for deep learning called SelfieBoost. Unlike other boosting algorithms, such as AdaBoost, which construct ensembles of classifiers, SelfieBoost boosts the accuracy of a single network. We prove a log(1/ε) convergence rate for SelfieBoost under an “SGD success” assumption which seems to hold in practice.

2005
Shiguang Shan, Peng Yang, Xilin Chen, Wen Gao

This paper proposes the AdaBoost Gabor Fisher Classifier (AGFC) for robust face recognition, in which a chain AdaBoost learning method based on bootstrap re-sampling is proposed and applied to face recognition with impressive performance. Gabor features are recognized as one of the most successful face representations, but they are too high-dimensional for fast extraction and acc...

2010
Florian Baumann, Katharina Ernst, Arne Ehlers, Bodo Rosenhahn

This paper describes a method to reduce the immense training time of the conventional AdaBoost learning algorithm in object detection by shrinking the sampling area. A new algorithm that exploits the geometric and, correspondingly, the symmetric relations of the analyzed object is presented. Symmetry-enhanced AdaBoost (SEAdaboost) can limit the scanning area enormously, depending on the degree of ...
