Search results for: and boosting

Number of results: 16,829,190

2005
Alexandru Niculescu-Mizil, Rich Caruana

Boosted decision trees typically yield good accuracy, precision, and ROC area. However, because the outputs from boosting are not well calibrated posterior probabilities, boosting yields poor squared error and cross-entropy. We empirically demonstrate why AdaBoost predicts distorted probabilities and examine three calibration methods for correcting this distortion: Platt Scaling, Isotonic Regre...
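
A minimal sketch of the calibration step this abstract describes, assuming scikit-learn rather than the authors' code: CalibratedClassifierCV with method="sigmoid" corresponds to Platt Scaling and method="isotonic" to isotonic regression, applied on top of an AdaBoost classifier. The dataset and settings below are illustrative.

```python
# Sketch: calibrating AdaBoost probabilities with Platt scaling and
# isotonic regression, then comparing probability-quality metrics.
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import brier_score_loss, log_loss
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

raw = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
platt = CalibratedClassifierCV(AdaBoostClassifier(n_estimators=200, random_state=0),
                               method="sigmoid", cv=5).fit(X_tr, y_tr)
iso = CalibratedClassifierCV(AdaBoostClassifier(n_estimators=200, random_state=0),
                             method="isotonic", cv=5).fit(X_tr, y_tr)

for name, model in [("raw", raw), ("Platt", platt), ("isotonic", iso)]:
    p = model.predict_proba(X_te)[:, 1]
    print(name, "Brier:", brier_score_loss(y_te, p), "log-loss:", log_loss(y_te, p))
```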

2017
A. Mayr

Objectives: Component-wise boosting algorithms have evolved into a popular estimation scheme in biomedical regression settings. The iteration number of these algorithms is the most important tuning parameter to optimize their performance. To date, no fully automated strategy for determining the optimal stopping iteration of boosting algorithms has been proposed. Methods: We propose a fully data...
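
For context, a hedged sketch of how the stopping iteration of a boosting model is commonly tuned (this is not the fully automated strategy the abstract proposes): track validation error across the staged predictions and stop at the minimizer. Names and settings below are illustrative, using scikit-learn's GradientBoostingRegressor.

```python
# Sketch: pick the stopping iteration by minimizing validation error
# over the model's staged predictions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=2000, n_features=30, noise=10.0, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

gbm = GradientBoostingRegressor(n_estimators=1000, learning_rate=0.05,
                                random_state=0).fit(X_tr, y_tr)

# staged_predict yields predictions after 1, 2, ..., n_estimators iterations
val_errors = [mean_squared_error(y_val, pred) for pred in gbm.staged_predict(X_val)]
m_stop = int(np.argmin(val_errors)) + 1
print("selected stopping iteration:", m_stop)
```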

1997
Harris Drucker

In the regression context, boosting and bagging are techniques to build a committee of regressors that may be superior to a single regressor. We use regression trees as fundamental building blocks in bagging committee machines and boosting committee machines. Performance is analyzed on three non-linear functions and the Boston housing database. In all cases, boosting is at least equivalent, and...
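
A rough counterpart of the setup described above, assuming scikit-learn: regression trees as the fundamental building blocks of a bagging committee and a boosting committee (scikit-learn's AdaBoostRegressor follows Drucker's AdaBoost.R2), compared with a single tree. The Friedman #1 synthetic function stands in for the paper's benchmarks; all settings are illustrative.

```python
# Sketch: single regression tree vs. bagging and boosting committees
# of regression trees on a synthetic non-linear function.
from sklearn.datasets import make_friedman1
from sklearn.ensemble import AdaBoostRegressor, BaggingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

X, y = make_friedman1(n_samples=1000, noise=1.0, random_state=0)

models = {
    "single tree": DecisionTreeRegressor(max_depth=4, random_state=0),
    "bagging": BaggingRegressor(DecisionTreeRegressor(max_depth=4),
                                n_estimators=100, random_state=0),
    "boosting": AdaBoostRegressor(DecisionTreeRegressor(max_depth=4),
                                  n_estimators=100, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
    print(f"{name}: MSE = {-scores.mean():.3f}")
```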

2007
Sara Solla

We study Schapire's Boosting Algorithm (SBA) for use in practice. SBA is analyzed in terms of its representation and its search. We show that the SBA representation is a piecewise tiling of the domain and that if the weak learner has low coverage ability, SBA's search may fail to boost or may give a sub-optimal solution. We present a rejection boosting algorithm that trades off exploration and ex...
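
This is not SBA or the rejection variant itself, but a minimal AdaBoost-style reweighting loop that shows the search the abstract analyzes: weight concentrates on examples the weak learner keeps missing, and the loop stalls if the weak learner cannot beat chance on the reweighted distribution (the low-coverage failure mode mentioned above). All names and settings are illustrative.

```python
# Minimal AdaBoost-style reweighting loop (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
y = 2 * y - 1                      # labels in {-1, +1}
w = np.full(len(y), 1.0 / len(y))  # uniform initial distribution

learners, alphas = [], []
for t in range(25):
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = np.sum(w * (pred != y))
    if err >= 0.5:                 # weak learner no better than chance: boosting stalls
        break
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
    w *= np.exp(-alpha * y * pred) # up-weight examples the weak learner missed
    w /= w.sum()
    learners.append(stump)
    alphas.append(alpha)

committee = lambda X: np.sign(sum(a * h.predict(X) for a, h in zip(alphas, learners)))
print("training accuracy:", np.mean(committee(X) == y))
```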

2003
Tong Zhang, Bin Yu

Boosting is one of the most significant advances in machine learning for classification and regression. In its original and computationally flexible version, boosting seeks to empirically minimize a loss function in a greedy fashion. The resulting estimator takes an additive function form and is built iteratively by applying a base estimator (or learner) to updated samples depending on the previ...
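
A short sketch of the greedy, stagewise view described above, under squared-error loss (where the negative gradient of the loss is simply the residual): each iteration fits a base learner to the residuals of the current additive fit and adds a shrunken copy of it. Settings are illustrative.

```python
# Sketch: greedy stagewise (additive) fitting with squared-error loss.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=1000, n_features=10, noise=5.0, random_state=0)

nu, n_steps = 0.1, 200             # shrinkage (step size) and number of greedy steps
F = np.full(len(y), y.mean())      # F_0: constant initial fit
base_learners = []

for m in range(n_steps):
    residuals = y - F              # negative gradient of squared error at the current fit
    h = DecisionTreeRegressor(max_depth=3).fit(X, residuals)
    F += nu * h.predict(X)         # greedy additive update: F_m = F_{m-1} + nu * h_m
    base_learners.append(h)

print("final training MSE:", np.mean((y - F) ** 2))
```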

2014
Kehan Gao, Taghi M. Khoshgoftaar, Amri Napolitano

High dimensionality and class imbalance are two main problems that affect the quality of training datasets in software defect prediction, resulting in inefficient classification models. Feature selection and data sampling are often used to overcome these problems. Feature selection is a process of choosing the most important attributes from the original data set. Data sampling alters the data s...
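
A hedged sketch of the two preprocessing steps named above, not the paper's exact pipeline: univariate feature selection followed by random undersampling of the majority class, using only scikit-learn and NumPy. The data, the choice of k, and the sampling ratio are illustrative.

```python
# Sketch: feature selection (SelectKBest) plus random undersampling
# of the majority class on an imbalanced dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=50, n_informative=8,
                           weights=[0.93, 0.07], random_state=0)  # imbalanced classes

# Feature selection: keep the k attributes with the highest ANOVA F-score
X_sel = SelectKBest(f_classif, k=10).fit_transform(X, y)

# Data sampling: randomly undersample the majority class to the minority size
minority, majority = np.flatnonzero(y == 1), np.flatnonzero(y == 0)
keep = np.concatenate([minority,
                       rng.choice(majority, size=len(minority), replace=False)])
X_bal, y_bal = X_sel[keep], y[keep]
print("class counts after sampling:", np.bincount(y_bal))
```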

The Dynamic Voltage Restorer (DVR) is a commercially available, popular device for eliminating voltage sags and swells in distribution lines. Its basic function is to inject the voltage difference (the difference between the pre-sag and sag voltage) into the power line and maintain the pre-sag voltage condition on the load side. The efficiency of the DVR depends on the performance of the efficiency...
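
A small worked example of the injection rule stated above, with illustrative phasor values that are not taken from the abstract: the DVR supplies the difference between the pre-sag and sagged supply voltage so the load keeps seeing the pre-sag voltage.

```python
# Worked numeric example of the DVR injection rule (illustrative values).
import cmath

v_presag = cmath.rect(230.0, 0.0)                # nominal phase voltage, 230 V at 0 rad
v_sag = cmath.rect(0.7 * 230.0, cmath.pi / 36)   # 30 % sag with a small phase jump

v_inject = v_presag - v_sag                      # series voltage the DVR must supply
v_load = v_sag + v_inject                        # restored load-side voltage

print(f"injected: {abs(v_inject):.1f} V, restored load voltage: {abs(v_load):.1f} V")
```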

2013
Jungeun Kwon, Keunho Choi, Yongmoo Suh

Several rating agencies such as Standard & Poor's (S&P), Moody's, and Fitch Ratings evaluate firms' credit ratings. Since the agencies charge substantial fees and their ratings do not always reflect a firm's default risk in a timely manner, it can be helpful for stakeholders if credit ratings can be predicted before the agencies publish them. However, it is not easy to make an accurate predicti...

1998
Zijian Zheng

Boosting and Bagging, as two representative approaches to learning classifier committees, have demonstrated great success, especially for decision tree learning. They repeatedly build different classifiers using a base learning algorithm by changing the distribution of the training set. Sasc, as a different type of committee learning method, can also significantly reduce the error rate of decision t...
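
A minimal sketch of the two distribution-changing mechanisms described above, assuming scikit-learn: bagging gives each committee member a bootstrap resample of the training set, while boosting keeps the full set but reweights it, up-weighting previously misclassified examples (the reweighting rule here is simplified for illustration).

```python
# Sketch: how bagging and boosting change a committee member's
# training distribution.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, random_state=0)
rng = np.random.default_rng(0)

# Bagging: each committee member is trained on a bootstrap resample of the data
idx = rng.integers(0, len(y), size=len(y))
bagged_member = DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx])

# Boosting: each committee member sees the full set under updated weights;
# examples the previous member misclassified receive more weight
w = np.full(len(y), 1.0 / len(y))
first = DecisionTreeClassifier(max_depth=1, random_state=0).fit(X, y, sample_weight=w)
w[first.predict(X) != y] *= 2.0    # simplified up-weighting of mistakes
w /= w.sum()
boosted_member = DecisionTreeClassifier(max_depth=1, random_state=0).fit(X, y, sample_weight=w)

print("max example weight after reweighting:", w.max())
```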

2017
Anna Veronika Dorogush, Vasily Ershov, Andrey Gulin

In this paper we present CatBoost, a new open-sourced gradient boosting library that successfully handles categorical features and outperforms existing publicly available implementations of gradient boosting in terms of quality on a set of popular publicly available datasets. The library has a GPU implementation of the learning algorithm and a CPU implementation of the scoring algorithm, which are sign...
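
A minimal usage sketch of the library described above, assuming the `catboost` Python package is installed; the data frame and parameters are illustrative. Passing column names via `cat_features` lets CatBoost handle categorical columns natively instead of requiring one-hot encoding.

```python
# Sketch: training a CatBoost classifier with native categorical features.
import pandas as pd
from catboost import CatBoostClassifier

# Toy frame with one numeric and two categorical features (illustrative data)
df = pd.DataFrame({
    "price": [10.0, 25.5, 7.2, 14.9, 30.1, 8.8],
    "color": ["red", "blue", "red", "green", "blue", "green"],
    "store": ["A", "B", "A", "C", "B", "C"],
    "sold":  [1, 0, 1, 0, 0, 1],
})
X, y = df.drop(columns="sold"), df["sold"]

model = CatBoostClassifier(iterations=100, depth=4, verbose=False)
model.fit(X, y, cat_features=["color", "store"])   # categorical columns by name
print(model.predict_proba(X)[:, 1])
```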

[Chart: number of search results per year]