Search results for: adaboost classifier

Number of results: 45412

Journal: Pattern Recognition Letters 2008
Juan José Rodríguez Diez, Jesús Maudes

Boosting is a set of methods for the construction of classifier ensembles. The distinguishing feature of these methods is that they make it possible to obtain a strong classifier from a combination of weak classifiers. Therefore, it is possible to use boosting methods with very simple base classifiers. One of the simplest classifiers is the decision stump, a decision tree with only one decision node. This...
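The combination described above — boosting over one-node decision trees — can be sketched in a few lines. This is an illustrative minimal implementation, not code from the paper; all names are hypothetical.

```python
import numpy as np

def stump_predict(X, feat, thresh, polarity):
    """Decision stump: a decision tree with one node, predicting in {-1, +1}."""
    return np.where(polarity * X[:, feat] < polarity * thresh, 1.0, -1.0)

def fit_stump(X, y, w):
    """Exhaustively pick the (feature, threshold, polarity) with the lowest
    weighted error under the current example weights w."""
    best = (0, 0.0, 1.0, np.inf)
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for polarity in (1.0, -1.0):
                err = np.sum(w[stump_predict(X, feat, thresh, polarity) != y])
                if err < best[3]:
                    best = (feat, thresh, polarity, err)
    return best

def adaboost(X, y, n_rounds=10):
    """AdaBoost with stumps as the weak learners; labels y must be in {-1, +1}."""
    w = np.full(len(y), 1.0 / len(y))            # start with uniform weights
    ensemble = []
    for _ in range(n_rounds):
        feat, thresh, pol, err = fit_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)     # avoid log(0)
        alpha = 0.5 * np.log((1.0 - err) / err)  # this stump's vote weight
        pred = stump_predict(X, feat, thresh, pol)
        w = w * np.exp(-alpha * y * pred)        # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, feat, thresh, pol))
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all stumps."""
    score = sum(a * stump_predict(X, f, t, p) for a, f, t, p in ensemble)
    return np.sign(score)
```

Even with such a weak base learner, the weighted vote over rounds can drive the training error down, which is the point the abstract makes about using very simple base classifiers.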

Journal: IJPRAI 2006
Yijun Sun, Sinisa Todorovic, Jian Li

AdaBoost rarely suffers from overfitting problems in low-noise data cases. However, recent studies with highly noisy patterns have clearly shown that overfitting can occur. A natural strategy to alleviate the problem is to penalize the skewness of the data distribution in the learning process, to prevent the hardest examples from spoiling the decision boundaries. In this paper, we pursue such a penalty...

Journal: JCP 2014
Rui Li, Changfeng Li

The excessive number of Haar-like features and the complex threshold calculation of the covariance matrix feature are two key issues in AdaBoost face detection. In this paper, an efficient feature named the covariance feature is proposed. The novel method divides the face image into several regions and calculates the covariance feature of any two regions. Then optimal weak classifiers will be picked out by A...

Journal: JSW 2012
Xianmei Wang, Yuyu Liang, Xiujie Zhao, Zhiliang Wang

Facial expression plays an important role in nonverbal social communication, emotion expression and affective recognition. To recognize facial expressions more effectively, researchers try to recognize them through facial action units (AUs). In this paper, in order to identify lip AUs, we adopt the Gabor wavelet transformation as the feature extraction method an...

Journal: Pattern Recognition Letters 2008
Chun-Xia Zhang, Jiang-She Zhang

This paper presents a novel ensemble classifier generation technique, RotBoost, which is constructed by combining Rotation Forest and AdaBoost. The experiments, conducted with 36 real-world data sets available from the UCI repository and with a classification tree adopted as the base learning algorithm, demonstrate that RotBoost can generate ensemble classifiers with significantly lower pr...
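The Rotation Forest half of this combination builds a per-ensemble-member rotation of the feature space before the boosted trees are trained. The sketch below shows only that rotation step, in a simplified form (the actual algorithm also bootstraps samples and subsamples classes); all names are illustrative, not the paper's code.

```python
import numpy as np

def rotation_matrix(X, n_subsets=2, seed=0):
    """Simplified Rotation Forest step: partition the features into random
    subsets, run PCA on each subset, and assemble one block-diagonal rotation
    applied to the full feature space before training the base learner."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    subsets = np.array_split(rng.permutation(d), n_subsets)
    R = np.zeros((d, d))
    for feats in subsets:
        Xs = X[:, feats] - X[:, feats].mean(axis=0)     # center the subset
        cov = np.atleast_2d(np.cov(Xs, rowvar=False))
        _, vecs = np.linalg.eigh(cov)                   # principal axes of subset
        R[np.ix_(feats, feats)] = vecs                  # place the orthogonal block
    return R

# X @ rotation_matrix(X) would then be the training input for an AdaBoost
# run with classification trees as base learners, as in RotBoost.
```

Because each block is an orthogonal PCA basis, the assembled matrix is itself orthogonal, so the rotation changes the axes the tree splits on without distorting distances.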

2004
Cynthia Rudin, Robert E. Schapire, Ingrid Daubechies

We study two boosting algorithms, Coordinate Ascent Boosting and Approximate Coordinate Ascent Boosting, which are explicitly designed to produce maximum margins. To derive these algorithms, we introduce a smooth approximation of the margin that one can maximize in order to produce a maximum margin classifier. Our first algorithm is simply coordinate ascent on this function, involving a line se...
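The "smooth approximation of the margin" mentioned here can be illustrated by the standard soft-min smoothing of a minimum (a generic sketch; the paper's actual smooth margin function may differ in its exact form). For margins $m_i = y_i f(x_i)$ over $\ell$ training examples and a sharpness parameter $\beta > 0$:

```latex
\min_{1 \le i \le \ell} m_i \;\approx\; -\frac{1}{\beta}\ln\sum_{i=1}^{\ell} e^{-\beta m_i},
\qquad
-\frac{1}{\beta}\ln\sum_{i=1}^{\ell} e^{-\beta m_i}
\;\le\; \min_i m_i \;\le\;
-\frac{1}{\beta}\ln\sum_{i=1}^{\ell} e^{-\beta m_i} + \frac{\ln \ell}{\beta}
```

The smoothed function is differentiable, so coordinate ascent can be applied to it directly, and the sandwich bound shows the approximation error shrinks as $\beta$ grows.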

2005
Piyanuch Silapachote, Deepak R. Karuppiah

We propose a classification technique for facial expression recognition using AdaBoost that learns by selecting the relevant global and local appearance features with the most discriminating information. Selectivity reduces the dimensionality of the feature space, which in turn results in a significant speed-up during online classification. We compare our method with another leading margin-based clas...

2003
Jiaming Li, Geoff Poulton, Ying Guo, Rong-yu Qiao

For face recognition, face feature selection is an important step. Better features should result in better performance. This paper describes a robust face recognition algorithm using multiple face region features selected by the AdaBoost algorithm. In conventional face recognition algorithms, the face region is dealt with as a whole. In this paper we show that dividing a face into a number of s...

2006
Etienne Grossmann

We extend the framework of AdaBoost so that it builds a smoothed decision tree rather than a neural network. The proposed method, “Adatree 2”, is derived from the assumption of a probabilistic observation model. It avoids the problem of over-fitting that appears in other tree-growing methods by reweighting the training examples, rather than splitting the training dataset at each node. It differs...

2013
Hasan Fleyeh, Erfan Davami

This paper presents a multi-class AdaBoost based on incorporating an ensemble of binary AdaBoosts organized as a Binary Decision Tree (BDT). It is proved that binary AdaBoost is extremely successful in producing accurate classification but it does not perform very well for multi-class problems. To avoid this performance degradation, the multi-class problem is divided into a number of ...
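The division scheme described here — a tree whose internal nodes each hold a binary classifier that routes a sample toward one group of classes — can be sketched as follows. To keep the sketch short, a nearest-centroid classifier stands in for the per-node binary AdaBoost; all names are illustrative, not the paper's code.

```python
import numpy as np

class CentroidBinary:
    """Stand-in for the per-node binary AdaBoost (a deliberate simplification:
    nearest-centroid keeps this sketch self-contained)."""
    def fit(self, X, y):                 # y in {0, 1}: which class group
        self.c0 = X[y == 0].mean(axis=0)
        self.c1 = X[y == 1].mean(axis=0)
        return self
    def predict_one(self, x):
        return int(np.linalg.norm(x - self.c1) < np.linalg.norm(x - self.c0))

def build_bdt(X, y, classes):
    """Recursively split the class set in half; each internal node trains one
    binary classifier that routes a sample toward its class group."""
    if len(classes) == 1:
        return classes[0]                        # leaf: a single class remains
    left, right = classes[:len(classes) // 2], classes[len(classes) // 2:]
    mask = np.isin(y, classes)
    side = np.isin(y[mask], right).astype(int)   # 0 = left group, 1 = right group
    clf = CentroidBinary().fit(X[mask], side)
    return (clf, build_bdt(X, y, left), build_bdt(X, y, right))

def bdt_predict(node, x):
    """Descend from the root, letting each binary classifier pick a branch."""
    while isinstance(node, tuple):
        clf, lnode, rnode = node
        node = rnode if clf.predict_one(x) else lnode
    return node
```

A K-class problem thus needs only K−1 binary classifiers, and each prediction traverses at most ⌈log₂ K⌉ of them, which is the efficiency argument behind the BDT organization.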

Chart: number of search results per year