Search results for: boosting

Number of results: 14818

Journal: British Journal of Sports Medicine, 2004

2005
Alexander Vezhnevets, Vladimir Vezhnevets

Boosting is a technique for combining a set of weak classifiers into one high-performance prediction rule. Boosting has been successfully applied to problems such as object detection, text analysis, and data mining. The most widely used boosting algorithm is AdaBoost, together with its later, more effective variants Gentle AdaBoost and Real AdaBoost. In this article we propose a new boosting algorithm, whi...
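
For context, a minimal sketch of the classic discrete AdaBoost loop the abstract alludes to (iterative reweighting of examples plus an alpha-weighted vote). This is not the new algorithm proposed in the article, and the weak-learner interface and names below are illustrative assumptions.

```python
import numpy as np

def train_adaboost(X, y, fit_weak_learner, T=50):
    """Discrete AdaBoost: y in {-1, +1}; fit_weak_learner(X, y, w) fits a weak
    classifier to weighted data and returns a callable h with h(X) -> {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # example weights, initially uniform
    ensemble = []
    for _ in range(T):
        h = fit_weak_learner(X, y, w)    # weak classifier on the weighted data
        pred = h(X)
        err = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)  # weighted error
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)   # up-weight mistakes, down-weight hits
        w /= w.sum()
        ensemble.append((alpha, h))
    return ensemble

def predict_adaboost(ensemble, X):
    """Sign of the alpha-weighted vote of the weak classifiers."""
    score = sum(alpha * h(X) for alpha, h in ensemble)
    return np.sign(score)
```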

2010
Tae-Kyun Kim, Ignas Budvytis, Roberto Cipolla

This paper presents a novel way to speed up the classification time of a boosting classifier. We make the shallow (flat) network deep (hierarchical) by growing a tree from the decision regions of a given boosting classifier. This provides many short paths for speeding up classification while preserving the reasonably smooth decision regions of the boosting classifier for good generalisation. We express the conve...
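
To make the flat-versus-hierarchical contrast concrete, here is an illustrative sketch (not the paper's tree-growing construction): a flat boosted ensemble of stumps visits every weak learner for every example, whereas a decision tree reaches a label after a single root-to-leaf path. All class and field names are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Stump:
    feature: int
    threshold: float
    alpha: float
    def __call__(self, x) -> float:     # weak decision in {-1, +1}
        return 1.0 if x[self.feature] > self.threshold else -1.0

def flat_boosted_predict(stumps: List[Stump], x) -> int:
    """Flat evaluation: all T weak learners are visited for every example."""
    return 1 if sum(s.alpha * s(x) for s in stumps) >= 0 else -1

@dataclass
class TreeNode:
    feature: Optional[int] = None       # None marks a leaf
    threshold: float = 0.0
    label: int = 0                      # class stored at the leaf
    left: Optional["TreeNode"] = None
    right: Optional["TreeNode"] = None

def tree_predict(node: TreeNode, x) -> int:
    """Hierarchical evaluation: only one (short) root-to-leaf path is visited."""
    while node.feature is not None:
        node = node.right if x[node.feature] > node.threshold else node.left
    return node.label
```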

2015
Riccardo De Bin

Despite the limitations imposed by the proportional hazards assumption, the Cox model is probably the most popular statistical tool used to analyze survival data, thanks to its flexibility and ease of interpretation. For this reason, novel statistical/machine learning techniques are usually adapted to fit it, including boosting, an iterative technique originally developed in the machine learnin...
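
As a rough illustration of how boosting can be adapted to the Cox model, here is a generic component-wise gradient-boosting sketch on the Cox partial log-likelihood (Breslow form, assuming no tied event times). It is not the specific approach discussed in this article, and all names and the data layout (numpy arrays `time`, `event` in {0, 1}, covariate matrix `X`) are assumptions.

```python
import numpy as np

def cox_gradient(time, event, eta):
    """Gradient of the Cox partial log-likelihood with respect to the linear
    predictor eta; naive O(n^2) loops are used for clarity, not speed."""
    exp_eta = np.exp(eta)
    n = len(time)
    grad = np.empty(n)
    for k in range(n):
        # risk-set sums for all event times t_i <= t_k (whose risk set contains k)
        terms = [exp_eta[time >= time[i]].sum()
                 for i in range(n) if event[i] == 1 and time[i] <= time[k]]
        grad[k] = event[k] - exp_eta[k] * sum(1.0 / s for s in terms)
    return grad

def componentwise_cox_boost(X, time, event, n_steps=200, nu=0.1):
    """Component-wise boosting: at each step, regress the gradient on every
    single covariate by least squares and take a small step along the best one."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)          # centring only shifts eta by an irrelevant constant
    beta = np.zeros(p)
    for _ in range(n_steps):
        u = cox_gradient(time, event, Xc @ beta)
        slopes = Xc.T @ u / (Xc ** 2).sum(axis=0)
        sse = ((u[:, None] - Xc * slopes) ** 2).sum(axis=0)
        j = int(np.argmin(sse))      # best-fitting covariate this iteration
        beta[j] += nu * slopes[j]    # shrunken update
    return beta
```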

Journal: Journal of Machine Learning Research, 2004
Saharon Rosset, Ji Zhu, Trevor J. Hastie

In this paper we study boosting methods from a new perspective. We build on recent work by Efron et al. to show that boosting approximately (and in some cases exactly) minimizes its loss criterion with an l1 constraint on the coefficient vector. This helps understand the success of boosting with early stopping as regularized fitting of the loss criterion. For the two most commonly used criteria...
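
In rough terms, the connection referred to here can be stated as follows, using generic notation (beta for the coefficient vector over weak learners h_j and L for the loss criterion); this is a paraphrase of the standard epsilon-boosting picture, not a quotation from the paper.

```latex
% l1-constrained fitting of the loss criterion over the weak-learner dictionary:
\[
\hat{\beta}(c) \;=\; \arg\min_{\beta}\; \sum_{i=1}^{n}
L\!\bigl(y_i,\ \textstyle\sum_j \beta_j h_j(x_i)\bigr)
\quad \text{subject to} \quad \lVert \beta \rVert_{1} \le c .
\]
```

Running boosting for t iterations with step size epsilon yields a coefficient vector with l1 norm at most t times epsilon, so early stopping plays the role of the constraint level c.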

2012
Shang-Tse Chen, Hsuan-Tien Lin, Chi-Jen Lu

We study the task of online boosting — combining online weak learners into an online strong learner. While batch boosting has a sound theoretical foundation, online boosting deserves more study from the theoretical perspective. In this paper, we carefully compare the differences between online and batch boosting, and propose a novel and reasonable assumption for the online weak learner. Based o...
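
For flavour, a sketch of one classic online boosting scheme in the style of Oza and Russell (2001); it is not the algorithm proposed in this paper, and the weak-learner interface (`partial_fit`, `predict` returning {-1, +1}) is an assumed placeholder.

```python
import numpy as np

class OnlineBoosting:
    """Oza-Russell-style online boosting: each arriving example is shown to the
    weak learners in sequence, with a Poisson-weighted number of updates and an
    importance weight that grows whenever earlier learners misclassify it."""

    def __init__(self, weak_learners, rng=None):
        self.learners = weak_learners            # objects with .partial_fit(x, y), .predict(x)
        self.sc = np.zeros(len(weak_learners))   # accumulated weight of correct examples
        self.sw = np.zeros(len(weak_learners))   # accumulated weight of misclassified examples
        self.rng = rng or np.random.default_rng(0)

    def update(self, x, y):
        lam = 1.0
        for m, h in enumerate(self.learners):
            for _ in range(self.rng.poisson(lam)):
                h.partial_fit(x, y)
            if h.predict(x) == y:
                self.sc[m] += lam
                eps = self.sw[m] / (self.sc[m] + self.sw[m])
                lam *= 1.0 / (2.0 * (1.0 - eps))   # shrink weight for later learners
            else:
                self.sw[m] += lam
                eps = self.sw[m] / (self.sc[m] + self.sw[m])
                lam *= 1.0 / (2.0 * eps)           # boost weight for later learners

    def predict(self, x):
        votes = 0.0
        for m, h in enumerate(self.learners):
            eps = self.sw[m] / max(self.sc[m] + self.sw[m], 1e-12)
            eps = min(max(eps, 1e-12), 1 - 1e-12)
            votes += np.log((1.0 - eps) / eps) * h.predict(x)
        return 1 if votes >= 0 else -1
```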

2009
Dong-Sheng Cao, Qing-Song Xu, Yi-Zeng Liang, Liang-Xiao Zhang, Hong-Dong Li

The idea of boosting is deeply rooted in everyday practice, and it shapes how we think about chemical problems and how we build chemical models. Mathematically, boosting is an iterative reweighting procedure that sequentially applies a base learner to reweighted versions of the training data, whose current weights are modified based on how accurat...
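
The reweighting described here takes the following form in standard discrete AdaBoost (generic notation, examples (x_i, y_i) with labels in {-1, +1}); the abstract's own procedure may differ in details.

```latex
% Weight update at iteration t: misclassified examples gain weight, correctly
% classified ones lose weight, and Z_t normalises the weights to sum to one.
\[
w_i^{(t+1)} \;=\; \frac{w_i^{(t)}\,\exp\!\bigl(-\alpha_t\, y_i\, h_t(x_i)\bigr)}{Z_t},
\qquad
\alpha_t = \tfrac{1}{2}\ln\frac{1-\varepsilon_t}{\varepsilon_t},
\qquad
\varepsilon_t = \sum_i w_i^{(t)}\,\mathbf{1}\{h_t(x_i)\neq y_i\}.
\]
```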

2001
Günther Eibl, Karl Peter Pfeiffer

In simulation studies boosting algorithms seem to be susceptible to noise. This article applies AdaBoost.M2 used with decision stumps to the digit recognition example, a simulated data set with attribute noise. Although the final model is both simple enough and sufficiently complex, boosting fails to reach the Bayes error. A detailed analysis shows some characteristics of the boosting trials which influenc...
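
A quick way to reproduce this general setup, boosted decision stumps on data with attribute noise, is sketched below with scikit-learn. Note that scikit-learn implements SAMME rather than AdaBoost.M2, and the synthetic dataset here is only a stand-in for the digit recognition example, not the data used in the article.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Ten-class synthetic data with Gaussian attribute noise added to the features.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=10, n_clusters_per_class=1, random_state=0)
X += np.random.default_rng(0).normal(scale=0.5, size=X.shape)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1),  # decision stumps
                         n_estimators=200, random_state=0)               # `estimator=` needs scikit-learn >= 1.2
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```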

Journal: Electronic Colloquium on Computational Complexity (ECCC), 2007
Satyen Kale

We revisit the connection between boosting algorithms and hard-core set constructions discovered by Klivans and Servedio. We present a boosting algorithm with a certain smoothness property that is necessary for hard-core set constructions: the distributions it generates do not put too much weight on any single example. We then use this boosting algorithm to show the existence of hard-core sets ...
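
One common way to formalise the smoothness property mentioned here, following the usual convention in the smooth-boosting literature (the exact parameterisation used in the paper may differ):

```latex
% A distribution D over m examples is kappa-smooth if it never puts more than
% kappa times the uniform weight on any single example.
\[
D \text{ is } \kappa\text{-smooth}
\quad\Longleftrightarrow\quad
\max_{1 \le i \le m} D(i) \;\le\; \frac{\kappa}{m}.
\]
```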

2003
Peter Bühlmann

We present an extended abstract about boosting. We first describe in section 1 (in a self-contained way) a generic functional gradient descent algorithm, which yields a general representation of boosting. Properties of boosting, or functional gradient descent, are then summarized very briefly in section 2.
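
The generic algorithm referred to can be sketched roughly as follows (a standard functional gradient descent / gradient boosting loop; the loss-gradient and base-learner fitting routines are placeholder assumptions).

```python
import numpy as np

def functional_gradient_descent(X, y, loss_grad, fit_base_learner, M=100, nu=0.1):
    """Generic functional gradient descent (gradient boosting): repeatedly fit a
    base learner to the negative gradient of the loss evaluated at the current
    fit, then take a small step in that direction.

    loss_grad(y, F)        -> elementwise dL/dF        (assumed signature)
    fit_base_learner(X, r) -> callable h with h(X) ~ r (assumed signature)
    """
    F = np.zeros(len(y))                 # F_0: start from the zero function
    learners = []
    for _ in range(M):
        r = -loss_grad(y, F)             # negative gradient ("pseudo-residuals")
        h = fit_base_learner(X, r)       # base learner approximates the descent direction
        F = F + nu * h(X)                # shrunken update of the current fit
        learners.append(h)
    return learners

# Example: squared-error loss turns the loop into residual fitting (L2Boosting).
squared_loss_grad = lambda y, F: F - y   # d/dF [ 0.5 * (y - F)^2 ]
```

With squared-error loss this reduces to repeatedly fitting residuals (L2Boosting); exponential or logistic losses recover AdaBoost-type and LogitBoost-type procedures.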

Chart: number of search results per year