Search results for: bootstrap aggregating

Number of results: 18325

2006
Ricardo Ñanculef, Carlos Valle, Héctor Allende, Claudio Moraga

This paper deals with a learning algorithm which combines two well-known methods to generate ensemble diversity: error negative correlation and resampling. In this algorithm, a set of learners iteratively and synchronously improve their state, considering information about the performance of a fixed number of other learners in the ensemble, to generate a sort of local negative correlation. Resamp...
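Below is a minimal sketch of the general idea described in this abstract, not the authors' algorithm: linear regressors trained on bootstrap resamples are updated synchronously, each penalised so that its error is negatively correlated with the errors of a fixed number of neighbouring ensemble members. The data, the penalty weight, the learning rate, and the neighbourhood size are all illustrative assumptions.

```python
# Local negative correlation plus resampling, sketched with linear regressors.
# All names (K, LAMBDA, N_NEIGHBOURS, ...) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

K, LAMBDA, LR, N_NEIGHBOURS, EPOCHS = 5, 0.5, 0.01, 2, 200
samples = [rng.integers(0, len(X), len(X)) for _ in range(K)]  # one bootstrap resample per member
W = rng.normal(scale=0.1, size=(K, X.shape[1]))

for _ in range(EPOCHS):
    preds = X @ W.T                        # (n, K) member predictions
    f_bar = preds.mean(axis=1, keepdims=True)
    new_W = W.copy()
    for i in range(K):
        idx = samples[i]
        neighbours = [(i + d) % K for d in range(1, N_NEIGHBOURS + 1)]
        # sum of neighbour deviations from the ensemble mean (held fixed during the update)
        nb_dev = (preds[idx][:, neighbours] - f_bar[idx]).sum(axis=1)
        err = preds[idx, i] - y[idx]
        grad = (2 * err + LAMBDA * nb_dev) @ X[idx] / len(idx)
        new_W[i] = W[i] - LR * grad
    W = new_W                              # synchronous update of all members

print("ensemble MSE:", np.mean(((X @ W.T).mean(axis=1) - y) ** 2))
```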

2007
K. Ming Leung

Methods for error estimation, such as holdout, random subsampling, k-fold cross-validation, and the bootstrap, are discussed. Also considered are general techniques, such as bagging and boosting, for increasing model accuracy.
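The following scikit-learn sketch illustrates the techniques listed in this entry: a holdout estimate, 10-fold cross-validation, and bagging and boosting of a decision tree. The dataset and parameter values are illustrative choices, not taken from the source.

```python
# Error estimation (holdout, k-fold CV) and accuracy-improving ensembles (bagging, boosting).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Holdout estimate: a single train/test split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
print("holdout accuracy:", tree.score(X_te, y_te))

# 10-fold cross-validation estimate of the same base model.
print("10-fold CV accuracy:", cross_val_score(
    DecisionTreeClassifier(random_state=0), X, y, cv=10).mean())

# Bagging and boosting as accuracy-improving ensembles over the same base learner.
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
boost = AdaBoostClassifier(n_estimators=50, random_state=0)
print("bagging CV accuracy: ", cross_val_score(bag, X, y, cv=10).mean())
print("boosting CV accuracy:", cross_val_score(boost, X, y, cv=10).mean())
```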

2009
Frank Emmert-Streib, Matthias Dehmer

In this paper we present an algorithm that allows the input variables of Boolean networks to be selected from incomplete data. More precisely, sets of input variables, instead of single variables, are evaluated using mutual information to find the combination that maximizes the mutual information between input and output variables. To account for the incompleteness of the data, bootstrap aggregation is us...
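The sketch below illustrates the general approach under assumed data, not the paper's algorithm: candidate sets of binary input variables are scored by the mutual information between their joint state and the output, and the score is averaged over bootstrap resamples; the XOR-style target illustrates why sets can outperform single variables.

```python
# Bootstrap-aggregated mutual information between input-variable *sets* and an output.
from itertools import combinations

import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(1)
n, p, B = 300, 6, 50
X = rng.integers(0, 2, size=(n, p))            # binary input variables
y = X[:, 0] ^ X[:, 2]                          # output driven jointly by inputs 0 and 2

def joint_state(cols):
    """Encode the joint configuration of a column subset as one integer label."""
    return (cols * (2 ** np.arange(cols.shape[1]))).sum(axis=1)

scores = {}
for subset in combinations(range(p), 2):       # evaluate all input pairs
    mi = 0.0
    for _ in range(B):                         # bootstrap aggregation of the MI estimate
        idx = rng.integers(0, n, n)
        mi += mutual_info_score(joint_state(X[idx][:, list(subset)]), y[idx])
    scores[subset] = mi / B

best = max(scores, key=scores.get)
print("selected input set:", best, "bagged MI:", round(scores[best], 3))
```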

2013
Bee Wah Yap, Khatijahhusna Abd Rani, Hezlin Aryani Abd Rahman, Simon Fong, Zuraida Khairudin, Nik Nik Abdullah

Most classifiers work well when the class distribution in the response variable of the dataset is well balanced. Problems arise when the dataset is imbalanced. This paper applied four methods for handling imbalanced datasets: oversampling, undersampling, bagging, and boosting. The cardiac surgery dataset has a binary response variable (1 = Died, 0 = Alive). The sample size is 4976 cases with 4.2% (Di...
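Here is a short illustrative sketch of two of the four methods mentioned above, random oversampling and random undersampling, applied to a synthetic binary response with roughly the 4.2% minority rate quoted in the abstract; the data are made-up stand-ins, not the cardiac surgery dataset.

```python
# Random oversampling and undersampling of a rare positive class (illustrative data).
import numpy as np

rng = np.random.default_rng(42)
n = 5000
y = (rng.random(n) < 0.042).astype(int)        # ~4.2% minority class, as in the abstract
X = rng.normal(size=(n, 4)) + y[:, None]       # synthetic features; resample them with the same indices, e.g. X[over]

minority, majority = np.flatnonzero(y == 1), np.flatnonzero(y == 0)

# Oversampling: resample the minority class with replacement up to the majority size.
over = np.concatenate([majority, rng.choice(minority, size=len(majority), replace=True)])
# Undersampling: subsample the majority class down to the minority size.
under = np.concatenate([minority, rng.choice(majority, size=len(minority), replace=False)])

for name, idx in [("original", np.arange(n)), ("oversampled", over), ("undersampled", under)]:
    print(f"{name:12s} size={len(idx):5d} class balance:",
          np.round(np.bincount(y[idx]) / len(idx), 3))
```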

2000
Yves Grandvalet

Bagging is a procedure averaging estimators trained on bootstrap samples. Numerous experiments have shown that bagged estimates often yield better results than the original predictor, and several explanations have been given to account for this gain. However, six years after its introduction, bagging is still not fully understood. Most explanations given until now are based on global properties ...
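As a concrete reference for the definition in the first sentence, here is a minimal hand-rolled bagging loop: unpruned regression trees are fit to bootstrap samples and their predictions averaged. The base learner and data are illustrative assumptions.

```python
# Bagging by hand: average estimators trained on bootstrap samples.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + 0.3 * rng.normal(size=300)

B = 50
members = []
for _ in range(B):
    idx = rng.integers(0, len(X), len(X))                       # bootstrap sample, drawn with replacement
    members.append(DecisionTreeRegressor().fit(X[idx], y[idx])) # unpruned, high-variance base learner

X_grid = np.linspace(-3, 3, 200).reshape(-1, 1)
truth = np.sin(X_grid[:, 0])
bagged = np.mean([m.predict(X_grid) for m in members], axis=0)  # bagged estimate: average of members
single = DecisionTreeRegressor().fit(X, y).predict(X_grid)      # original (unbagged) predictor

print("MSE, single tree:", round(float(np.mean((single - truth) ** 2)), 4))
print("MSE, bagged     :", round(float(np.mean((bagged - truth) ** 2)), 4))
```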

2006
Juan José Rodríguez Diez, Jesús Maudes

Grafted trees are trees constructed using two methods: the first creates an initial tree, and the second is used to complete it. In this work, the first classifier is an unpruned tree built from a 10% sample of the training data. Grafting is a method for constructing ensembles of decision trees, where each tree is a grafted tree. Grafting by itself is better than Baggin...

2016
Efstratios Sygkounas, Giuseppe Rizzo, Raphaël Troncy

We performed a thorough replication study of the top-performing systems in the yearly SemEval Twitter Sentiment Analysis task. We highlight and discuss differences between the results officially published for those systems and the ones we are able to compute. Learning from these studies of the systems, we also propose SentiME, an ensemble system composed of five stat...

Journal: Expert Syst. Appl., 2011
Gang Wang, Jian Ma

With the rapid growth and increased competition in the credit industry, corporate credit risk prediction is becoming more important for credit-granting institutions. ...

2012
Roberto Naranjo, Laurent Besacier, Tulio Rojas, Egidio Marsico

Nasa Yuwe is an indigenous language of Colombia (South America) and is, to some extent, an endangered language. Different efforts have been made to revitalize it, the most important being the unification of the Nasa Yuwe alphabet. The Nasa Yuwe vowel system has 32 vowels contrasting in nasalization, length, aspiration, and glottalization, causing great confusion for learners. In or...

2000
Marina Skurichina, Robert P. W. Duin

To improve weak classifiers, bagging and boosting can be used. These techniques are based on combining classifiers. Usually, a simple majority vote or a weighted majority vote is used as the combining rule in bagging and boosting. However, other combining rules, such as the mean, product, and average, are possible. In this paper, we study bagging and boosting in Linear Discriminant Analysis (LDA) and t...
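The sketch below reproduces this setting in spirit only: bagged Linear Discriminant Analysis members are combined by majority vote, by the mean of the posterior probabilities, and by their product. The dataset, ensemble size, and split are illustrative assumptions.

```python
# Bagged LDA with three different combining rules: majority vote, mean, and product.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=10, n_informative=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

B = 25
members = []
for _ in range(B):
    idx = rng.integers(0, len(X_tr), len(X_tr))                 # bootstrap sample
    members.append(LinearDiscriminantAnalysis().fit(X_tr[idx], y_tr[idx]))

probas = np.array([m.predict_proba(X_te) for m in members])     # (B, n, classes)
votes = np.array([m.predict(X_te) for m in members])            # (B, n)

majority = (votes.mean(axis=0) > 0.5).astype(int)               # simple majority vote (binary labels)
mean_rule = probas.mean(axis=0).argmax(axis=1)                  # mean combining rule
prod_rule = probas.prod(axis=0).argmax(axis=1)                  # product combining rule

for name, pred in [("majority", majority), ("mean", mean_rule), ("product", prod_rule)]:
    print(f"{name:8s} accuracy: {(pred == y_te).mean():.3f}")
```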
