Search results for: logitboost

Number of results: 116

Journal: CoRR 2009
Ping Li

We develop abc-logitboost, building on the prior work on abc-boost [10] and robust logitboost [11]. Our extensive experiments on a variety of datasets demonstrate the considerable improvement of abc-logitboost over logitboost and abc-mart.

2010
Ping Li

Logitboost is an influential boosting algorithm for classification. In this paper, we develop robust logitboost to provide an explicit formulation of the tree-split criterion used to build the weak learners (regression trees) for logitboost. This formulation leads to a numerically stable implementation of logitboost. We then propose abc-logitboost for multi-class classification, by combining robust logi...
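
For readers unfamiliar with second-order tree splitting, a minimal sketch of the kind of gain criterion this line of work relies on is given below; the function name, the epsilon smoothing, and the use of NumPy are illustrative choices, not details taken from the paper.

```python
import numpy as np

def split_gain(g, h, mask, eps=1e-12):
    """Second-order gain for splitting a tree node into left/right children.

    g    : per-sample first derivatives of the logistic loss (e.g. y - p)
    h    : per-sample second derivatives (e.g. p * (1 - p))
    mask : boolean array, True for samples routed to the left child
    """
    GL, HL = g[mask].sum(), h[mask].sum()
    GR, HR = g[~mask].sum(), h[~mask].sum()
    parent = (GL + GR) ** 2 / (HL + HR + eps)
    return GL ** 2 / (HL + eps) + GR ** 2 / (HR + eps) - parent
```

The gain compares the parent node's second-order score against the sum of the children's scores; when growing each regression-tree weak learner, the candidate split with the largest gain is taken.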

Journal: CoRR 2010
Ping Li

This empirical study is mainly devoted to comparing four tree-based boosting algorithms: mart, abc-mart, robust logitboost, and abc-logitboost, for multi-class classification on a variety of publicly available datasets. Some of those datasets have been thoroughly tested in prior studies using a broad range of classification algorithms including SVM, neural nets, and deep learning. In terms of t...

Journal: Soft Comput. 2006
José Otero, Luciano Sánchez

Recently, Adaboost has been compared to greedy backfitting of extended additive models in logistic regression problems, also known as "Logitboost". The Adaboost algorithm has been applied to learn fuzzy rules in classification problems, and other backfitting algorithms have been used to learn fuzzy rules in modeling problems, but, to our knowledge, there are no previous works that extend the Logitboost algorithm to l...

Journal: CoRR 2012
Peng Sun, Mark D. Reid, Jie Zhou

This paper presents an improvement to model learning when using multi-class LogitBoost for classification. Motivated by the statistical view, LogitBoost can be seen as additive tree regression. Two important factors in this setting are: 1) coupled classifier output due to a sum-to-zero constraint, and 2) the dense Hessian matrices that arise when computing tree node split gain and node value fi...
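
As a rough illustration of the two factors mentioned above, the sketch below computes, under standard softmax / multi-class logistic assumptions, the per-sample gradient and the dense K x K Hessian of the loss; the function name and the NumPy formulation are illustrative, not the paper's implementation.

```python
import numpy as np

def multiclass_grad_hessian(F, y):
    """Per-sample gradient and dense Hessian of the multi-class logistic loss.

    F : (n, K) array of class scores (only score differences matter; a
        sum-to-zero constraint removes the redundant degree of freedom)
    y : (n,) array of integer class labels in {0, ..., K-1}
    """
    F = F - F.max(axis=1, keepdims=True)        # shift for numerical stability
    P = np.exp(F)
    P /= P.sum(axis=1, keepdims=True)           # softmax probabilities

    n, K = F.shape
    Y = np.zeros((n, K))
    Y[np.arange(n), y] = 1.0

    grad = P - Y                                # (n, K) first derivatives
    # Dense per-sample Hessian: H_i[k, j] = p_ik * (delta_kj - p_ij)
    hess = np.einsum('ik,kj->ikj', P, np.eye(K)) - np.einsum('ik,ij->ikj', P, P)
    return grad, hess
```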

Journal: Biochemical and Biophysical Research Communications 2005
Kai-Yan Feng, Yu-Dong Cai, Kuo-Chen Chou

A novel classifier, the so-called "LogitBoost" classifier, was introduced to predict the structural class of a protein domain from its amino acid sequence. LogitBoost is characterized by a log-likelihood loss function that reduces sensitivity to noise and outliers, and by combining many weak classifiers to build up a very strong and ...
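
For context, the classical binary LogitBoost recipe of Friedman, Hastie and Tibshirani fits a regression weak learner to a Newton working response at every round; the sketch below is a minimal version of that recipe, assuming scikit-learn regression trees as weak learners, with the round count, tree depth, and weight clipping chosen purely for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def logitboost_fit(X, y, n_rounds=100, max_depth=2, clip=1e-5):
    """Binary LogitBoost; y is an array of labels in {0, 1}."""
    y = np.asarray(y, dtype=float)
    F = np.zeros(len(y))                         # additive score, starts at 0
    learners = []
    for _ in range(n_rounds):
        p = 1.0 / (1.0 + np.exp(-2.0 * F))       # current probabilities
        w = np.clip(p * (1.0 - p), clip, None)   # Newton weights
        z = (y - p) / w                          # working response
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, z, sample_weight=w)
        F += 0.5 * tree.predict(X)               # half-step as in the original recipe
        learners.append(tree)
    return learners

def logitboost_predict_proba(learners, X):
    F = 0.5 * sum(t.predict(X) for t in learners)
    return 1.0 / (1.0 + np.exp(-2.0 * F))
```

The weight clipping only guards against division by near-zero Newton weights; the half-step and the 2F parameterization follow the standard statement of the algorithm.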

Journal: Computational Statistics & Data Analysis 2017
Marc Goessling

Multivariate binary distributions can be decomposed into products of univariate conditional distributions. Recently, popular approaches have modeled these conditionals through neural networks with sophisticated weight-sharing structures. It is shown that state-of-the-art performance on several standard benchmark datasets can actually be achieved by training separate probability estimators for ea...
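
A minimal sketch of the chain-rule decomposition p(x) = prod_d p(x_d | x_{<d}) with one independently trained estimator per conditional is given below; the choice of logistic regression and the Laplace-style smoothing are illustrative assumptions, not the estimators used in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_autoregressive(X):
    """Fit p(x) = prod_d p(x_d | x_{<d}) for binary data X of shape (n, D),
    using one independent logistic-regression estimator per conditional."""
    X = np.asarray(X)
    n, D = X.shape
    models = []
    for d in range(D):
        if d == 0 or len(np.unique(X[:, d])) < 2:
            # No predecessors (or a constant column): fall back to a
            # smoothed marginal frequency of x_d = 1.
            models.append(('marginal', (X[:, d].sum() + 1.0) / (n + 2.0)))
        else:
            clf = LogisticRegression(max_iter=1000)
            clf.fit(X[:, :d], X[:, d])
            models.append(('model', clf))
    return models

def log_prob(models, x):
    """Log-probability of a single binary vector x under the fitted chain."""
    x = np.asarray(x)
    lp = 0.0
    for d, (kind, m) in enumerate(models):
        p1 = m if kind == 'marginal' else m.predict_proba(x[:d].reshape(1, -1))[0, 1]
        lp += np.log(p1 if x[d] == 1 else 1.0 - p1)
    return lp
```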

2014
Peng Sun, Tong Zhang, Jie Zhou

LogitBoost, MART and their variants can be viewed as additive tree regression using the logistic loss and boosting-style optimization. We analyze their convergence rates based on a new weak-learnability formulation. We show that the rate is O(1/T) when using gradient descent only, while a linear rate is achieved when using Newton descent. Moreover, introducing Newton descent when growing the trees...
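
One possible way to write down the two regimes described in this abstract is sketched below; the constant C, the parameter gamma, and the exact form of the bounds are placeholders tied to a generic weak-learnability assumption, not the paper's precise statements.

```latex
\[
  L(F_T) - L(F^\ast) \;\le\; \frac{C}{T} \quad \text{(gradient descent only)},
  \qquad
  L(F_T) - L(F^\ast) \;\le\; (1-\gamma)^{T}\,\bigl(L(F_0) - L(F^\ast)\bigr) \quad \text{(Newton descent).}
\]
```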

Journal: Journal of Theoretical Biology 2006
Yu-Dong Cai, Kai-Yan Feng, Wen-Cong Lu, Kuo-Chen Chou

Prediction of protein classification is an important topic in molecular biology. This is because it not only provides useful information about the structure itself, but also greatly stimulates the characterization of many other protein features that may be closely correlated with their biological functions. In this paper, LogitBoost, one of the boosting algorithms d...
