Search results for: boosting

Number of results: 14818

Journal: Speech Communication, 2008
Junichi Yamagishi Hisashi Kawai Takao Kobayashi

2000
Carlos Domingo Osamu Watanabe

We propose a new boosting algorithm that mends some of the problems detected in AdaBoost, so far the most successful boosting algorithm, due to Freund and Schapire [FS97]. These problems are: (1) AdaBoost cannot be used in the boosting-by-filtering framework, and (2) AdaBoost does not seem to be noise resistant. To solve them, we propose a new boosting algorithm, MadaBoost ...
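For context (this is not part of the snippet above, only the standard textbook form), the exponential weight update at the heart of AdaBoost is the usual culprit behind both issues: example weights can grow without bound, which makes the distribution hard to simulate by filtering and lets noisy labels dominate. It is commonly written as

\[
D_{t+1}(i) \;=\; \frac{D_t(i)\,\exp\!\big(-\alpha_t\, y_i\, h_t(x_i)\big)}{Z_t},
\qquad
\alpha_t \;=\; \tfrac{1}{2}\ln\frac{1-\epsilon_t}{\epsilon_t},
\]

where h_t is the round-t weak learner, ε_t its weighted error, and Z_t a normalizer; MadaBoost's modification, roughly speaking, keeps these weights bounded.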

2010
Adam Craig Pocock Paraskevas Yiapanis Jeremy Singer Mikel Luján Gavin Brown

Oza’s Online Boosting algorithm provides a version of AdaBoost which can be trained in an online way for stationary problems. One perspective is that this enables the power of the boosting framework to be applied to datasets which are too large to fit into memory. The online boosting algorithm assumes the data to be independent and identically distributed (i.i.d.) and therefore has...
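As a rough illustration of the kind of online procedure described above (a sketch under stated assumptions, not the published algorithm verbatim), the class below shows each incoming example to every base learner a Poisson-sampled number of times and rescales the example's weight using that learner's running error. The GaussianNB base learner, the fixed class list, and the "train at least once" shortcut are illustrative assumptions.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

class OnlineBoost:
    """Poisson-sampled online boosting sketch, in the spirit of Oza's algorithm.

    Each incoming example is shown to every base learner k ~ Poisson(lam)
    times; lam is then scaled up or down depending on whether that learner
    currently classifies the example correctly.
    """

    def __init__(self, n_models=10, classes=(0, 1), seed=0):
        self.models = [GaussianNB() for _ in range(n_models)]
        self.sc = np.zeros(n_models)    # weighted count of correctly handled examples
        self.sw = np.zeros(n_models)    # weighted count of misclassified examples
        self.classes = np.asarray(classes)
        self.rng = np.random.default_rng(seed)

    def partial_fit(self, x, y):
        lam, x = 1.0, np.asarray(x).reshape(1, -1)
        for m, model in enumerate(self.models):
            k = self.rng.poisson(lam)
            for _ in range(max(k, 1)):      # train at least once so predict below is defined
                model.partial_fit(x, [y], classes=self.classes)
            if model.predict(x)[0] == y:
                self.sc[m] += lam
                eps = self.sw[m] / (self.sc[m] + self.sw[m])
                lam *= 1.0 / (2.0 * (1.0 - eps))   # down-weight examples this learner gets right
            else:
                self.sw[m] += lam
                eps = self.sw[m] / (self.sc[m] + self.sw[m])
                lam *= 1.0 / (2.0 * eps)           # up-weight examples it gets wrong

    def predict(self, x):
        x = np.asarray(x).reshape(1, -1)
        votes = {}
        for m, model in enumerate(self.models):
            total = self.sc[m] + self.sw[m]
            if total == 0:
                continue
            eps = np.clip(self.sw[m] / total, 1e-6, 1 - 1e-6)
            label = model.predict(x)[0]
            votes[label] = votes.get(label, 0.0) + np.log((1 - eps) / eps)
        return max(votes, key=votes.get) if votes else self.classes[0]
```

Because the model only ever touches one example at a time, memory use stays constant regardless of how large the stream is, which is the point the snippet makes.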

Journal: Expert Syst. Appl., 2014
Gang Wang Jian Ma Shanlin Yang

With the recent financial crisis and European debt crisis, corporate bankruptcy prediction has become an increasingly important issue for financial institutions. Many statistical and intelligent methods have been proposed; however, no single method has emerged as the overall best for predicting corporate bankruptcy. Recent studies suggest ensemble learning methods may have potential applicability i...

2007
Lawrence O. Hall Robert E. Banfield Kevin W. Bowyer W. Philip Kegelmeyer

In this paper, we examine ensemble algorithms (Boosting Lite and Ivoting) that provide accuracy approximating that of a single classifier, but require significantly fewer training examples. Such algorithms allow ensemble methods to operate on very large data sets or use very slow learning algorithms. Boosting Lite is compared with Ivoting, standard boosting, and building a single classifier. Comp...

2017
Alan Mosca George D. Magoulas

In this paper we present a new ensemble method, called Boosted Residual Networks, which builds an ensemble of Residual Networks by growing the member network at each round of boosting. The proposed approach combines recent developments in Residual Networks, a method for creating very deep networks by including a shortcut layer between different groups of layers, with the Deep Incremental Boostin...

Journal: Statistics and Computing, 2016
Eugene Dubossarsky Jerome H. Friedman John T. Ormerod Matthew P. Wand

A new data science tool named wavelet-based gradient boosting is proposed and tested. The approach is a special case of componentwise linear least squares gradient boosting, and involves wavelet functions of the original predictors. Wavelet-based gradient boosting takes advantage of the approximate ℓ1 penalization induced by gradient boosting to give appropriate penalized additive fits. The method...
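As a rough sketch of the base procedure named above, componentwise linear least squares gradient boosting regresses the current residual on each predictor column separately, keeps only the single best coefficient, and takes a small shrunken step; in the wavelet variant the columns would be wavelet transforms of the original predictors. The round count and step size below are illustrative assumptions, not values from the cited work.

```python
import numpy as np

def componentwise_l2_boost(X, y, n_rounds=200, step=0.1):
    """Componentwise linear least-squares gradient boosting (sketch).

    Each round regresses the residual on every single column of X,
    keeps the column with the smallest residual sum of squares, and
    updates only that coefficient by a shrunken amount.
    """
    n, p = X.shape
    coef = np.zeros(p)
    intercept = y.mean()
    residual = y - intercept
    for _ in range(n_rounds):
        best_j, best_beta, best_rss = None, 0.0, np.inf
        for j in range(p):
            xj = X[:, j]
            denom = xj @ xj
            if denom == 0.0:
                continue
            beta = (xj @ residual) / denom          # univariate least-squares fit
            rss = np.sum((residual - beta * xj) ** 2)
            if rss < best_rss:
                best_j, best_beta, best_rss = j, beta, rss
        if best_j is None:
            break
        coef[best_j] += step * best_beta            # only one coefficient moves per round
        residual -= step * best_beta * X[:, best_j]
    return intercept, coef
```

Because only one coefficient moves per round and each step is shrunken, many coefficients remain exactly zero after a finite number of rounds, which is the approximate ℓ1-type penalization the snippet refers to.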

2012
Sakrapee Paisitkriangkrai Chunhua Shen Anton van den Hengel

In this document we provide a complete derivation for multi-class boosting with group sparsity and a full explanation of the ADMM algorithm presented in the main paper. 1 Multi-class boosting with group sparsity We first provide the derivation for the multi-class logistic loss with the ℓ1,2-norm. We then show the difference between our boosting with the ℓ1,2-norm and the ℓ1,∞-norm. We then briefly discuss our group s...
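For orientation, one common way to write a multi-class logistic loss with an ℓ1,2 (group-sparse) regularizer is shown below; this is a generic textbook form, and the exact objective and grouping used in the cited derivation may differ:

\[
\min_{W} \;\sum_{i=1}^{n} \log\!\Big(1 + \sum_{r \neq y_i} \exp\big(w_r^\top x_i - w_{y_i}^\top x_i\big)\Big)
\;+\; \lambda \sum_{g=1}^{G} \lVert w_g \rVert_2 ,
\]

where w_g collects the coefficients in group g; the ℓ1,∞ variant replaces the inner ℓ2 norm by the max norm, which tends to tie the coefficients within a group to a common magnitude.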

2009
Zhi-Hua Zhou

Boosting is a family of ensemble methods that produce a strong learner capable of making very accurate predictions by combining rough and moderately inaccurate learners (called base learners or weak learners). In particular, Boosting sequentially trains a series of base learners by using a base learning algorithm, where the training examples wrongly predicted by a base learn...
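To make the sequential re-weighting concrete, here is a minimal sketch of the classical AdaBoost recipe the snippet describes, assuming binary labels in {-1, +1} and depth-1 decision trees as the weak learners (both are illustrative choices, not taken from the cited text):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Minimal AdaBoost sketch for labels y in {-1, +1}.

    Each round fits a depth-1 tree ("stump") under the current example
    weights, then increases the weights of the examples it got wrong.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)                     # start with uniform weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err <= 0 or err >= 0.5:              # stop if perfect or no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this weak learner in the vote
        w *= np.exp(-alpha * y * pred)          # misclassified examples get up-weighted
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def adaboost_predict(X, learners, alphas):
    """Sign of the alpha-weighted vote of the base learners."""
    scores = sum(a * h.predict(X) for h, a in zip(learners, alphas))
    return np.sign(scores)
```

The up-weighting of wrongly predicted examples between rounds is exactly the mechanism the truncated sentence above is describing.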

2014
Tofigh Naghibi Beat Pfister

By exploiting the duality between boosting and online learning, we present a boosting framework which proves to be extremely powerful thanks to employing the vast knowledge available in the online learning area. Using this framework, we develop various algorithms to address multiple practically and theoretically interesting questions including sparse boosting, smooth-distribution boosting, agno...

[Chart: number of search results per year]