Search results for: AdaBoost classifier
Number of results: 7051
AdaBoost rarely suffers from overfitting in low-noise data settings. However, recent studies on highly noisy patterns have clearly shown that overfitting can occur. A natural strategy to alleviate the problem is to penalize skewness of the data distribution during learning, preventing the hardest examples from spoiling the decision boundaries. In this paper, we pursue such a penalty...
Concentration inequalities that incorporate variance information (such as Bernstein’s or Bennett’s inequality) are often significantly tighter than counterparts (such as Hoeffding’s inequality) that disregard variance. Nevertheless, many state-of-the-art machine learning algorithms for classification problems, such as AdaBoost and support vector machines (SVMs), extensively use Hoeffding’s inequalit...
Recently, Universum data, which does not belong to any class of the training data, has been applied to train better classifiers. In this paper, we propose a novel boosting algorithm called UadaBoost that can improve the classification performance of AdaBoost with Universum data. UadaBoost chooses a function by minimizing the loss for labeled data and Universum data. The cost function is minim...
Modulation scheme recognition occupies a crucial position in civil and military applications. In this paper, we present a boosting algorithm as an ensemble framework to achieve higher accuracy than a single classifier. To evaluate the effect of the boosting algorithm, eight common communication signals are to be identified, and five kinds of entropy are extracted as the training vector. Then...
The excessive number of Haar-like features and the complex threshold calculation of the covariance matrix feature are two key issues in AdaBoost face detection. In this paper, an efficient feature named the covariance feature is proposed. The novel method divides the face image into several regions and calculates the covariance feature of any two regions. Then optimal weak classifiers are picked out by A...
Boosted cascade of simple features, by Viola and Jones, is one of the most famous object detection frameworks. However, it suffers from a lengthy training process. This is due to the vast features space and the exhaustive search nature of Adaboost. In this paper we propose GAdaboost: a Genetic Algorithm to accelerate the training procedure through natural feature selection. Specifically, we pro...
SoftBoost is a recently presented boosting algorithm that trades off the size of the achieved classification margin against generalization performance. This paper presents a performance evaluation of the SoftBoost algorithm on the generic object recognition problem. An appearance-based generic object recognition model is used. The evaluation experiments are performed using a difficult object recognition ...
Recently, AdaBoost has been widely used in many computer vision applications and has shown promising results. However, it is also observed that its classification performance is often poor when the size of the training sample set is small. In certain situations, there may be many unlabelled samples available and labelling them is costly and time-consuming. Thus it is desirable to pick a few goo...
Facial expression plays an important role in nonverbal social communication, emotion expression, and affective recognition. To recognize facial expressions more effectively, researchers try to recognize them through facial action units. In this paper, in order to identify lip AUs, we adopt the Gabor wavelet transformation as the feature extraction method an...
AdaBoost [5] is a well-known ensemble learning algorithm that constructs its constituent or base models in sequence. A key step in AdaBoost is constructing a distribution over the training examples to create each base model. This distribution, represented as a vector, is constructed to be orthogonal to the vector of mistakes made by the previous base model in the sequence [6]. The idea is to m...
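The distribution update described in this abstract can be sketched in a minimal AdaBoost implementation with decision stumps. This is an illustrative sketch of the standard algorithm, not the specific method of any paper listed above; after reweighting, the just-chosen stump has weighted error exactly 1/2 under the new distribution, which is the orthogonality property the abstract refers to.

```python
import numpy as np

def adaboost_train(X, y, n_rounds=10):
    """Minimal AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # distribution over training examples
    ensemble = []                    # (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        # exhaustive search for the stump with lowest weighted error
        best, best_err = None, np.inf
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        err = max(best_err, 1e-10)   # clamp to avoid log(0) on perfect stumps
        alpha = 0.5 * np.log((1 - err) / err)
        j, thr, pol = best
        pred = pol * np.where(X[:, j] >= thr, 1, -1)
        # reweight: mistakes gain weight, correct examples lose weight
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((j, thr, pol, alpha))
    return ensemble

def adaboost_predict(ensemble, X):
    """Weighted vote of all stumps."""
    score = np.zeros(X.shape[0])
    for j, thr, pol, alpha in ensemble:
        score += alpha * pol * np.where(X[:, j] >= thr, 1, -1)
    return np.sign(score)

# Usage on a toy 1-D separable problem
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
model = adaboost_train(X, y, n_rounds=5)
print(adaboost_predict(model, X))  # → [-1. -1. -1.  1.  1.  1.]
```

Each round makes the current base model's mistakes "invisible" to the next round's error measure, which is what drives successive stumps to focus on different examples.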