Search results for: adaboost learning

Number of results: 22173

2004
Chu-Song Chen Chang-Ming Tsai Jiun-Hung Chen Chia-Ping Chen

In this paper, we propose a post-classification scheme that is useful for improving the weak-hypothesis combination of AdaBoost. The post-classification scheme allows the weak hypotheses to be combined nonlinearly, and can be shown to perform generally better than the original linear-combination approach in both theory and practice. The post-classification scheme provides a general pe...
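The snippet is truncated before the scheme itself is described, so the sketch below is only a rough, generic illustration of the idea of a nonlinear second stage over weak-hypothesis outputs: it trains an ordinary AdaBoost ensemble, uses each weak learner's prediction as a feature, and fits a small nonlinear post-classifier on top. The dataset, base ensemble, and choice of post-classifier are all assumptions, not the authors' construction.

```python
# Illustrative sketch only: a nonlinear "post-classification" stage fitted on
# weak-hypothesis outputs. Not the authors' exact scheme; the base ensemble,
# post-classifier, and data are assumptions for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Standard AdaBoost: the final decision is a weighted *linear* vote of weak hypotheses.
ada = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)

# Collect the individual weak-hypothesis outputs as a new feature representation.
H_tr = np.column_stack([h.predict(X_tr) for h in ada.estimators_])
H_te = np.column_stack([h.predict(X_te) for h in ada.estimators_])

# Post-classifier: combines the weak hypotheses nonlinearly (here, a depth-3 tree).
post = DecisionTreeClassifier(max_depth=3, random_state=0).fit(H_tr, y_tr)

print("linear vote accuracy :", ada.score(X_te, y_te))
print("nonlinear post-stage :", post.score(H_te, y_te))
```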

Journal: Soft Comput. 2006
José Otero Luciano Sánchez

Recently, Adaboost has been compared to greedy backfitting of extended additive models in logistic regression problems, or "Logitboost". The Adaboost algorithm has been applied to learn fuzzy rules in classification problems, and other backfitting algorithms have been applied to learn fuzzy rules in modeling problems, but, to our knowledge, there are no previous works that extend the Logitboost algorithm to l...
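As generic background on the Logitboost procedure referenced here, the sketch below implements one common textbook form of the two-class LogitBoost update (Newton-step working responses fitted by a weighted regression base learner). It is not the fuzzy-rule extension the paper develops; the base learner, data, and number of rounds are illustrative choices.

```python
# Generic two-class LogitBoost sketch (not the fuzzy-rule extension from the
# abstract). Base learner, data, and number of rounds are illustrative choices.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeRegressor

X, y = make_classification(n_samples=500, n_features=10, random_state=0)  # y in {0, 1}

F = np.zeros(len(y))      # additive model F(x), initialised to 0
p = np.full(len(y), 0.5)  # current probability estimates P(y=1 | x)

for m in range(20):
    # Newton-step working responses and weights (clipped for numerical stability).
    w = np.clip(p * (1 - p), 1e-6, None)
    z = (y - p) / w
    # Weighted least-squares fit of a regression base learner to the responses.
    f = DecisionTreeRegressor(max_depth=2, random_state=m).fit(X, z, sample_weight=w)
    F += 0.5 * f.predict(X)
    p = 1.0 / (1.0 + np.exp(-2.0 * F))

print("training error:", np.mean((p > 0.5).astype(int) != y))
```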

2013
Hiroshi Fujimura Yusuke Shinohara Takashi Masuko

This paper proposes a novel technique to exploit discriminative models with subclasses for speech recognition. Speech recognition using discriminative models has attracted much attention in the past decade. However, most discriminative models are still based on tree-clustering results of HMM states. In contrast, our proposed method, referred to as subclass AdaBoost, jointly selects optimal ...

2013
Si Qing Zhang Run Zheng

In a traditional content-based image retrieval system, for a given query image, the number of relevant images in the database is far smaller than the number of irrelevant ones. The number of negative samples and the number of positive samples is therefore unbalanced, and the two-class classifier...
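The snippet is cut off before the paper's remedy is described; purely as background on the class-imbalance issue it raises, the sketch below shows one common, generic mitigation: class-balanced initial sample weights for a boosted classifier. Everything in it is an assumption for illustration, not the paper's method.

```python
# Generic illustration of class-balanced sample weights for a boosted classifier.
# Background on the imbalance problem only, not the paper's method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Imbalanced two-class problem: few positives (relevant) vs. many negatives.
X, y = make_classification(n_samples=3000, weights=[0.95, 0.05], random_state=0)

# Give each class the same total weight so the minority class is not ignored.
n_pos, n_neg = (y == 1).sum(), (y == 0).sum()
w = np.where(y == 1, 0.5 / n_pos, 0.5 / n_neg)

clf = AdaBoostClassifier(n_estimators=100, random_state=0)
clf.fit(X, y, sample_weight=w)
print("minority-class recall:", (clf.predict(X[y == 1]) == 1).mean())
```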

2010
Jennifer Wortman Vaughan David Lei Daniel Quach Alex Lee

In this lecture we will continue our discussion of the Adaboost algorithm and derive a bound on the generalization error. We saw last time that the training error decreases exponentially with respect to the number of rounds T. However, we also want to see the performance of this algorithm on new test data. Today we will show why the Adaboost algorithm generalizes so well and why it avoids over...
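For reference, the exponential drop in training error mentioned here is the standard AdaBoost bound; the display below states it in the usual notation (weighted error $\epsilon_t$ and edge $\gamma_t = 1/2 - \epsilon_t$ of the $t$-th weak hypothesis), reproduced from the standard analysis rather than from these particular lecture notes.

```latex
% Standard AdaBoost training-error bound: eps_t is the weighted error of the
% t-th weak hypothesis and gamma_t = 1/2 - eps_t its edge over random guessing.
\[
\widehat{\mathrm{err}}(H) \;\le\; \prod_{t=1}^{T} Z_t
  \;=\; \prod_{t=1}^{T} 2\sqrt{\epsilon_t(1-\epsilon_t)}
  \;=\; \prod_{t=1}^{T} \sqrt{1-4\gamma_t^{2}}
  \;\le\; \exp\!\Big(-2\sum_{t=1}^{T}\gamma_t^{2}\Big),
\]
so if every weak hypothesis has edge $\gamma_t \ge \gamma > 0$, the training
error falls below $e^{-2\gamma^{2}T}$, i.e.\ exponentially fast in the number
of rounds $T$.
```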

2006
Vanessa Gómez-Verdejo Aníbal R. Figueiras-Vidal

Modified Real Adaboost ensembles that apply weighted emphasis to erroneous and critical (near the classification boundary) samples have been shown to lead to improved designs, both in performance and in ensemble size. In this paper, we propose to take advantage of the diversity among different weighted combinations to build committees of modified Real Adaboost designs. Experiments show that ...

2011
Jung-Jin Lee Pyoung-Hean Lee Christof Koch Alan Yuille

Detecting text regions in natural scenes is an important part of computer vision. We propose a novel text detection algorithm that extracts six different classes of text features and uses Modest AdaBoost with a multi-scale sequential search. Experiments show that our algorithm can detect text regions with f = 0.70 on the ICDAR 2003 datasets, which include images with text of various fonts, si...

2003
Rodrigo Verschae Javier Ruiz-del-Solar

In this paper, a hybrid face detector is proposed that combines the high processing speed of an Asymmetrical Adaboost Cascade Detector with the high detection rate of a Wavelet Bayesian Detector. This integration is achieved by incorporating the latter detector into the middle stages of the cascade detector. Results of applying the proposed detector to a standard face detection database a...
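The cascade referred to here follows the usual early-rejection pattern; the sketch below illustrates that pattern generically, with a placeholder slot for a slower, more accurate detector inserted in a middle stage. The stage functions, thresholds, and composition are hypothetical, not the paper's detector.

```python
# Generic early-rejection cascade sketch. The stage functions and the idea of
# slotting a different detector into a middle stage are illustrative stand-ins,
# not the hybrid detector described in the paper.
from typing import Callable, Sequence

Stage = Callable[[object], float]  # maps a candidate window to a score

def cascade_detect(window: object,
                   stages: Sequence[Stage],
                   thresholds: Sequence[float]) -> bool:
    """Return True only if the window survives every stage's threshold."""
    for stage, thr in zip(stages, thresholds):
        if stage(window) < thr:
            return False          # early rejection: most windows stop here
    return True                   # accepted as a face candidate

# Hypothetical composition: fast boosted stages with a slower, more accurate
# detector inserted in the middle of the cascade, e.g.
#   stages = fast_boosted_stages[:k] + [bayesian_stage] + fast_boosted_stages[k:]
```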

2008
Roman Juránek

A common approach to pattern recognition and object detection is to use a statistical classifier. A widely used method is AdaBoost or its modifications, which yield outstanding results in certain tasks such as face detection. The aim of this work was to build a real-time system for the detection of dogs for surveillance purposes. The author of this paper thus explored the possibility that the AdaBoost bas...

2004
Balázs Kégl

We have recently proposed an extension of ADABOOST to regression that uses the median of the base regressors as the final regressor. In this paper, we extend theoretical results obtained for ADABOOST to median boosting and to its localized variant. First, we extend recent results on efficient margin maximization to show that the algorithm can converge to the maximum achievable margin within a pres...
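As background on the combination rule in the first sentence, the snippet below computes a weighted median of base-regressor predictions; the weights and predictions are made-up illustrative values, and the precise weighting used in median boosting may differ.

```python
# Weighted median of base-regressor outputs, as a generic illustration of the
# median combination rule mentioned in the abstract. The weights and
# predictions below are made-up values, not taken from the paper.
import numpy as np

def weighted_median(values: np.ndarray, weights: np.ndarray) -> float:
    """Smallest value whose cumulative weight reaches half the total weight."""
    order = np.argsort(values)
    values, weights = values[order], weights[order]
    cumulative = np.cumsum(weights)
    return float(values[np.searchsorted(cumulative, 0.5 * weights.sum())])

# Predictions of five hypothetical base regressors for one input, with their
# confidence weights.
preds = np.array([1.8, 2.1, 2.4, 9.0, 2.2])   # one outlier at 9.0
w = np.array([0.2, 0.3, 0.25, 0.1, 0.15])

print(weighted_median(preds, w))   # robust to the outlying base regressor
```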

[Chart: number of search results per year]
