Search results for: bagging model

Number of results: 122,039

Journal: Nature, 2001

2010
Albert Bifet, Geoff Holmes, Bernhard Pfahringer

Bagging, boosting and Random Forests are classical ensemble methods used to improve the performance of single classifiers. They obtain superior performance by increasing the accuracy and diversity of the single classifiers. Attempts have been made to reproduce these methods in the more challenging context of evolving data streams. In this paper, we propose a new variant of bagging, called lever...
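The abstract above describes classical bagging before introducing its streaming variant. The sketch below illustrates only the classical idea (bootstrap replicates aggregated by majority vote), not the proposed stream algorithm; scikit-learn, its iris dataset, and the ensemble size are assumptions made purely for illustration.

```python
# Minimal sketch of classical bagging (bootstrap aggregating); this is NOT the
# streaming variant proposed in the paper, only the baseline idea it builds on.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)

n_estimators = 25
trees = []
for _ in range(n_estimators):
    # Draw a bootstrap replicate (sampling with replacement) of the training set.
    idx = rng.integers(0, len(X), size=len(X))
    trees.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

# Aggregate the individual trees by majority vote.
votes = np.stack([t.predict(X) for t in trees])  # shape: (n_estimators, n_samples)
majority = np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)
print("training accuracy of the bagged ensemble:", (majority == y).mean())
```

Diversity comes from the resampling: each tree sees a slightly different replicate, so their errors are partly decorrelated and the vote is more accurate than a typical single tree.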

Journal: Machine Learning, 2004

Journal: Neurocomputing, 2015
Jerzy Blaszczynski, Jerzy Stefanowski

Various approaches to extend bagging ensembles for class-imbalanced data are considered. First, we review known extensions and compare them in a comprehensive experimental study. The results show that integrating bagging with under-sampling is more powerful than with over-sampling. They also allow us to identify Roughly Balanced Bagging as the most accurate extension. Then, we point out that complex...
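A hedged sketch of one such extension follows: under-sampling bagging, where every bootstrap keeps the minority examples and draws an equally sized sample of the majority class. Roughly Balanced Bagging instead draws the majority sample size from a negative binomial distribution; that refinement is omitted here, and the synthetic dataset and tree base learner are illustrative assumptions.

```python
# Sketch of "exactly balanced" under-sampling bagging for an imbalanced problem.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
minority, majority = np.where(y == 1)[0], np.where(y == 0)[0]

ensemble = []
for _ in range(30):
    # Each member sees a balanced bootstrap: all classes sampled to minority size.
    maj_idx = rng.choice(majority, size=len(minority), replace=True)
    min_idx = rng.choice(minority, size=len(minority), replace=True)
    idx = np.concatenate([maj_idx, min_idx])
    ensemble.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

votes = np.mean([t.predict(X) for t in ensemble], axis=0)
pred = (votes >= 0.5).astype(int)
print("minority-class recall on the training data:", (pred[y == 1] == 1).mean())
```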

Journal: Pattern Recognition, 1998
Marina Skurichina, Robert P. W. Duin

Classifiers built on small training sets are usually biased or unstable. Different techniques exist to construct more stable classifiers. It is not clear which ones are good, and whether they really stabilize the classifier or merely improve its performance. In this paper, bagging (bootstrapping and aggregating [1]) is studied for a number of linear classifiers. A measure for the instability of cl...
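The following sketch illustrates the setting, not the paper's exact instability measure: a linear classifier is trained on small bootstrap replicates, the average pairwise disagreement of the replicate predictions serves as a crude instability proxy, and the replicates are aggregated by voting. scikit-learn, its breast-cancer dataset, and the 40-sample training set are assumptions for the sake of a runnable example.

```python
# Bagging a linear classifier trained on a deliberately small sample.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
small = rng.choice(len(X), size=40, replace=False)  # small training set
Xs, ys = X[small], y[small]

preds = []
for _ in range(20):
    idx = rng.integers(0, len(Xs), size=len(Xs))  # bootstrap replicate
    clf = LogisticRegression(max_iter=5000).fit(Xs[idx], ys[idx])
    preds.append(clf.predict(X))
preds = np.stack(preds)

# Average pairwise disagreement between replicate classifiers (0 = perfectly stable).
pairs = [(preds[i] != preds[j]).mean()
         for i in range(len(preds)) for j in range(i + 1, len(preds))]
bagged = (preds.mean(axis=0) >= 0.5).astype(int)  # majority vote
print(f"instability proxy: {np.mean(pairs):.3f}, bagged accuracy: {(bagged == y).mean():.3f}")
```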

Journal: Comput. Sci. Inf. Syst., 2006
Kristína Machova, Miroslav Puszta, Frantisek Barcák, Peter Bednár

In this paper we present a way to improve the precision of classification results. Two different approaches are known: bagging and boosting. This paper describes a set of experiments with bagging and boosting methods. We apply these methods to classification algorithms that generate decision trees. Results of performance tests focused on the use of the bagging and boosting methods ...
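In the same spirit as the experiments described above, the snippet below compares a single decision tree with bagged and boosted trees. The scikit-learn estimators, the wine dataset, and the ensemble sizes are assumptions for illustration, not the paper's experimental setup.

```python
# Compare a single tree, bagging, and boosting with cross-validated accuracy.
from sklearn.datasets import load_wine
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)
models = {
    "single tree": DecisionTreeClassifier(random_state=0),
    "bagging": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0),
    "boosting": AdaBoostClassifier(DecisionTreeClassifier(max_depth=1), n_estimators=50, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```

Note the different base learners: bagging benefits from full, unstable trees, while boosting is typically run over shallow trees (stumps) that are re-weighted sequentially.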

2011
Gang Wang, Jian Ma, ShanLin Yang

Bagging is one of the oldest, simplest, and best known ensemble methods. However, the bootstrap sampling strategy in bagging appears to lead to ensembles of low diversity and accuracy compared with other ensemble methods. In this paper, a new variant of bagging, named IGF-Bagging, is proposed. First, this method obtains bootstrap instances. Then, it employs Information Gain (IG) based feature ...
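The general idea (bootstrap sampling followed by information-gain-based feature selection for each ensemble member) can be sketched as below. This is not the authors' exact IGF-Bagging algorithm: mutual_info_classif is used as a stand-in for information gain, and the synthetic dataset, ensemble size, and number of selected features are illustrative assumptions.

```python
# Bootstrap each member, keep its most informative features, then vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=20, n_informative=5, random_state=0)

members = []  # each member is (selected feature indices, fitted tree)
for _ in range(15):
    idx = rng.integers(0, len(X), size=len(X))        # bootstrap instances
    Xb, yb = X[idx], y[idx]
    ig = mutual_info_classif(Xb, yb, random_state=0)  # information-gain proxy
    top = np.argsort(ig)[-8:]                          # keep the 8 most informative features
    members.append((top, DecisionTreeClassifier().fit(Xb[:, top], yb)))

votes = np.mean([tree.predict(X[:, cols]) for cols, tree in members], axis=0)
print("ensemble training accuracy:", ((votes >= 0.5).astype(int) == y).mean())
```

Selecting features per bootstrap adds a second source of diversity on top of instance resampling, which is the gap in plain bagging that the abstract points to.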

2014
Germano Leão Demolin Leite, Amanda Fialho, José Cola Zanuncio, Ronaldo Reis Júnior, Alves da Costa

Tomato borers, especially Tuta absoluta (Lepidoptera: Gelechiidae), a pest introduced into southern Europe, northern Africa and the Middle East, and diseases can damage tomato (Solanum lycopersicum) fruit. This study tested the economic and technical feasibility of bagging tomato fruit clusters during organic production to protect them against insects and diseases. The experiment was randomized ...

1997
Harris Drucker

In the regression context, boosting and bagging are techniques to build a committee of regressors that may be superior to a single regressor. We use regression trees as fundamental building blocks in bagging committee machines and boosting committee machines. Performance is analyzed on three non-linear functions and the Boston housing database. In all cases, boosting is at least equivalent, and...
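In the same vein, the sketch below compares a single regression tree with bagged and boosted regression trees on a standard non-linear benchmark. The make_friedman1 generator, the scikit-learn ensemble regressors, and the hyperparameters are assumptions made for a runnable illustration and are not the paper's exact setup (the Boston housing data is no longer distributed with scikit-learn).

```python
# Compare single, bagged, and boosted regression trees by cross-validated R^2.
from sklearn.datasets import make_friedman1
from sklearn.ensemble import AdaBoostRegressor, BaggingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

X, y = make_friedman1(n_samples=500, noise=1.0, random_state=0)
models = {
    "single tree": DecisionTreeRegressor(random_state=0),
    "bagging": BaggingRegressor(DecisionTreeRegressor(), n_estimators=100, random_state=0),
    "boosting": AdaBoostRegressor(DecisionTreeRegressor(max_depth=4), n_estimators=100, random_state=0),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 {r2.mean():.3f}")
```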

Journal: Neurocomputing, 2010
André L. V. Coelho, Diego Silveira Costa Nascimento

Bagging is a popular ensemble algorithm based on the idea of data resampling. In this paper, aiming at increasing the incurred levels of ensemble diversity, we present an evolutionary approach for optimally designing Bagging models composed of heterogeneous components. To assess its potential, experiments with well-known learning algorithms and classification datasets are discussed whereby the...
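A heterogeneous bagging ensemble (different base-learner types trained on different bootstrap replicates) can be sketched as below. The evolutionary search for the optimal composition described in the paper is not reproduced here; the component learners, the dataset, and the simple round-robin assignment are illustrative assumptions.

```python
# Heterogeneous bagging: cycle through several base-learner types over bootstrap replicates.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
component_types = [DecisionTreeClassifier, LogisticRegression, KNeighborsClassifier, GaussianNB]

ensemble = []
for i in range(20):
    idx = rng.integers(0, len(X), size=len(X))             # bootstrap replicate
    Estimator = component_types[i % len(component_types)]  # heterogeneous components
    kwargs = {"max_iter": 5000} if Estimator is LogisticRegression else {}
    ensemble.append(Estimator(**kwargs).fit(X[idx], y[idx]))

votes = np.mean([clf.predict(X) for clf in ensemble], axis=0)
print("heterogeneous ensemble training accuracy:", ((votes >= 0.5).astype(int) == y).mean())
```

Mixing learner families is another route to diversity: the members disagree not only because of resampling but also because of their different inductive biases.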

[Chart: number of search results per year]