Search results for: bagging model

Number of results: 122039

2012
Prasanna Kumari

Classification is one of the data mining techniques that analyses a given data set and induces a model for each class based on the features present in the data. Bagging and boosting are heuristic approaches to developing classification models. These techniques generate a diverse ensemble of classifiers by manipulating the training data given to a base learning algorithm. They are very successfu...
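To make the resampling idea in this abstract concrete, here is a minimal bagging sketch (an illustration, not code from the paper): each base classifier is trained on a bootstrap resample of the training set, and the ensemble predicts by majority vote. The decision-tree base learner, ensemble size, and voting rule are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_estimators=25, random_state=0):
    """Fit one decision tree per bootstrap resample of the training data."""
    rng = np.random.default_rng(random_state)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)  # n draws with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    """Combine base classifiers by majority vote (assumes non-negative integer labels)."""
    votes = np.stack([m.predict(X) for m in models])  # shape (n_estimators, n_samples)
    return np.array([np.bincount(col).argmax() for col in votes.T])
```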

2007
Ming-Fang Weng, Chun-Kang Chen, Yi-Hsuan Yang, Rong-En Fan, Yu-Ting Hsieh, Yung-Yu Chuang, Winston H. Hsu, Chih-Jen Lin

In TRECVID 2007 high-level feature (HLF) detection, we extend the well-known LIBSVM and develop a toolkit specifically for HLF detection. The package shortens the learning time and provides a framework for researchers to easily conduct experiments. We efficiently and effectively aggregate detectors trained on past data to achieve better performance. We propose post-processing techniques, conc...

Journal: Journal of Statistical Planning and Inference, 2007

2007
Frédéric Ratle, Devis Tuia

This paper investigates the use of ensembles of predictors in order to improve the performance of spatial prediction methods. Support vector regression (SVR), a popular method from the field of statistical machine learning, is used. Several instances of SVR are combined using different data sampling schemes (bagging and boosting). Bagging shows good performance, and proves to be more computation...
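As a rough illustration of combining several SVR instances with bagging, in the spirit of this abstract (a sketch only; the synthetic data, RBF kernel, C, and sampling ratio are assumptions, not the authors' settings):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 2))  # stand-in for spatial coordinates
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

bagged_svr = BaggingRegressor(
    SVR(kernel="rbf", C=10.0),  # base predictor
    n_estimators=20,            # ensemble size
    max_samples=0.8,            # fraction of the data drawn for each bootstrap sample
    bootstrap=True,
    random_state=0,
).fit(X, y)

print(bagged_svr.predict(X[:5]))
```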

2016
Andreas Buja, Werner Stuetzle

Bagging is a device intended for reducing the prediction error of learning algorithms. In its simplest form, bagging draws bootstrap samples from the training sample, applies the learning algorithm to each bootstrap sample, and then averages the resulting prediction rules. We extend the definition of bagging from statistics to statistical functionals and study the von Mises expansion of bagged ...
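The "simplest form" described here maps directly to a few lines of code; in this sketch the regression-tree base learner and ensemble size are assumptions, and the bagged rule is the average of the individual prediction rules:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def bagged_rule(X, y, n_bootstrap=50, random_state=0):
    """Fit the learner to each bootstrap sample and average the prediction rules."""
    rng = np.random.default_rng(random_state)
    n = len(X)
    fits = []
    for _ in range(n_bootstrap):
        idx = rng.integers(0, n, size=n)  # one bootstrap sample of the training data
        fits.append(DecisionTreeRegressor().fit(X[idx], y[idx]))
    # the bagged predictor is the pointwise average of the individual rules
    return lambda X_new: np.mean([f.predict(X_new) for f in fits], axis=0)
```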

2015
Gianni Franchi, Jesús Angulo

The stochastic watershed is a probabilistic segmentation approach which estimates the probability density of contours of the image from a given gradient. In complex images, the stochastic watershed can enhance insignificant contours. To partially address this drawback, we introduce here a fully unsupervised multi-scale approach including bagging. Re-sampling and bagging is a classical stochasti...

Journal: Pattern Recognition, 2010
Gonzalo Martínez-Muñoz, Alberto Suárez

The performance of m-out-of-n bagging with and without replacement is analyzed in terms of the sampling ratio (m/n). Standard bagging uses resampling with replacement to generate bootstrap samples of the same size as the original training set (m_wr = n). Without-replacement methods typically use half samples (m_wor = n/2). These choices of sampling size are arbitrary and need not be optimal in terms of...
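A sketch of the two sampling schemes compared in this paper, m draws with replacement versus a size-m subsample without replacement (illustrative only; the decision-tree base learner and ensemble size are assumptions):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def m_out_of_n_bagging(X, y, m, with_replacement=True,
                       n_estimators=50, random_state=0):
    """Build an ensemble where each tree sees m of the n training points."""
    rng = np.random.default_rng(random_state)
    n = len(X)
    ensemble = []
    for _ in range(n_estimators):
        if with_replacement:
            idx = rng.integers(0, n, size=m)            # bootstrap: m draws with replacement
        else:
            idx = rng.choice(n, size=m, replace=False)  # subsample: m distinct points
        ensemble.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return ensemble

# Standard bagging corresponds to m = n with replacement; the usual
# without-replacement choice is m = n // 2, but the ratio m/n can be tuned.
```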

2006
Zhuo Zheng

Boosting and bagging are two techniques for improving the performance of learning algorithms. Both techniques have been successfully used in machine learning to improve the performance of classification algorithms such as decision trees and neural networks. In this paper, we focus on the use of feedforward backpropagation neural networks for time series classification problems. We apply boosting ...

2014
Maryam Sabzevari, Gonzalo Martínez-Muñoz, Alberto Suárez

Bagging is a simple and robust classification algorithm in the presence of class label noise. This algorithm builds an ensemble of classifiers from bootstrap samples drawn with replacement, of the same size as the original training set. However, several studies have shown that this choice of sampling size is arbitrary in terms of the generalization performance of the ensemble. In this study we discuss how ...
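One hedged way to explore the sampling-size question raised here is to treat the bootstrap sample size as a tunable ratio, for example through scikit-learn's max_samples parameter; the noisy synthetic data, base learner, and candidate ratios below are assumptions, not the study's protocol.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# flip_y injects class-label noise, the setting in which bagging is reported to be robust
X, y = make_classification(n_samples=500, n_features=20, flip_y=0.2, random_state=0)

for ratio in (0.2, 0.5, 0.8, 1.0):
    clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                            max_samples=ratio, bootstrap=True, random_state=0)
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"sampling ratio {ratio:.1f}: CV accuracy {score:.3f}")
```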

[Chart: number of search results per year]