Search results for: vacuum bagging

Number of results: 49193

2011
Joaquín Torres-Sospedra Carlos Hernández-Espinosa Mercedes Fernández-Redondo

Previous research has shown that Bagging, Boosting, and Cross-Validation Committees can each provide good performance separately. In this paper, Boosting methods are mixed with Bagging and Cross-Validation Committees in order to generate accurate ensembles and take benefit from all these alternatives. In this way, the networks are trained according to the boosting methods but the specific t...

2016
Sung-Hwan Min

Ensemble classification combines individually trained classifiers to obtain more accurate predictions than individual classifiers alone. Ensemble techniques are very useful for improving the generalizability of the classifier. Bagging is the method used most commonly for constructing ensemble classifiers. In bagging, different training data subsets are drawn randomly with replacement from the o...
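The bootstrap step described in this snippet (drawing training subsets randomly with replacement, then combining the resulting classifiers) can be sketched as follows. This is a hedged toy illustration with a 1-nearest-neighbour base learner; none of the names come from the papers listed here:

```python
# Minimal sketch of bagging for classification.
# The base learner and all names are illustrative assumptions.
import random
from collections import Counter

def train_1nn(sample):
    # base classifier: predict the label of the nearest training point
    return lambda x: min(sample, key=lambda p: abs(p[0] - x))[1]

def bagging_classifier(data, n_models=11, seed=0):
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        # bootstrap sample: draw len(data) points WITH replacement
        boot = [rng.choice(data) for _ in range(len(data))]
        models.append(train_1nn(boot))
    def predict(x):
        # aggregate the committee by majority vote
        votes = Counter(m(x) for m in models)
        return votes.most_common(1)[0][0]
    return predict

# toy 1-D data: label 1 iff x > 0.5
data = [(x / 10, int(x > 5)) for x in range(11)]
clf = bagging_classifier(data)
```

Because each bootstrap bag omits roughly a third of the pool on average, the committee members differ even though they share a single base algorithm.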

Journal: JIPS, 2014
Deepak Ghimire Joonwhoan Lee

An extreme learning machine (ELM) is a recently proposed learning algorithm for a single-hidden-layer feedforward neural network. In this paper we studied an ensemble of ELMs built using a bagging algorithm for facial expression recognition (FER). Facial expression analysis is widely used in the behavioral interpretation of emotions, in cognitive science, and in social interactions. This paper presents a me...

2002
Nitesh V. Chawla Thomas E. Moore Kevin W. Bowyer Philip Kegelmeyer

Bagging forms a committee of classifiers by bootstrap aggregation of training sets from a pool of training data. A simple alternative to bagging is to partition the data into disjoint subsets. Experiments with decision tree and neural network classifiers on various datasets show that, given the same size partitions and bags, disjoint partitions result in performance equivalent to, o...
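The disjoint-partition alternative described in this snippet can be sketched alongside ordinary bootstrap bags of the same size. A hedged illustration; the function names are assumptions, not the paper's code:

```python
# Contrast bootstrap bags with equal-size disjoint partitions.
# All names here are illustrative assumptions.
import random

def bootstrap_bags(data, k, seed=0):
    rng = random.Random(seed)
    n = len(data) // k  # same size as each disjoint partition
    # bags may repeat items and may overlap with each other
    return [[rng.choice(data) for _ in range(n)] for _ in range(k)]

def disjoint_partitions(data, k, seed=0):
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    n = len(shuffled) // k
    # partitions never overlap; together they cover the pool
    return [shuffled[i * n:(i + 1) * n] for i in range(k)]

pool = list(range(100))
bags = bootstrap_bags(pool, 4)
parts = disjoint_partitions(pool, 4)
```

Both schemes give each committee member a training set of the same size, which is exactly the comparison the experiments above set up.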

2014
Daniel Gianola Kent A. Weigel Nicole Krämer Alessandra Stella Chris-Carolin Schön

We examined whether or not the predictive ability of genomic best linear unbiased prediction (GBLUP) could be improved via a resampling method used in machine learning: bootstrap aggregating sampling ("bagging"). In theory, bagging can be useful when the predictor has large variance or when the number of markers is much larger than sample size, preventing effective regularization. After present...

1997
Kai Ming Ting Ian H. Witten

In this paper, we investigate the method of stacked generalization in combining models derived from different subsets of a training dataset by a single learning algorithm, as well as different algorithms. The simplest way to combine predictions from competing models is majority vote, and the effect of the sampling regime used to generate training subsets has already been studied in this context, wh...

1998
Zijian Zheng

Classifier committee learning approaches have demonstrated great success in increasing the prediction accuracy of classifier learning, which is a key technique for data mining. These approaches generate several classifiers to form a committee by repeated application of a single base learning algorithm. The committee members vote to decide the final classification. It has been shown that Boosting and ...

Journal: IJPRAI, 2013
Santi Seguí Laura Igual Jordi Vitrià

The problem of training classifiers only with target data arises in many applications where non-target data are too costly, difficult to obtain, or not available at all. Several one-class classification methods have been presented to solve this problem, but most of the methods are highly sensitive to the presence of outliers in the target class. Ensemble methods have therefore been proposed as ...

2010
Ke Wang Tao Chen Raymond Lau

This paper presents the application of the bagging technique for non-linear regression models to obtain more accurate and robust calibration of spectroscopy. Bagging refers to the combination of multiple models obtained by bootstrap re-sampling with replacement into an ensemble model to reduce prediction errors. It is well suited to “non-robust” models, such as the non-linear calibration method...
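For regression, the ensemble described in this snippet combines bootstrap-trained models by averaging their predictions rather than voting. A hedged sketch with a toy nearest-neighbour base model; it illustrates the averaging idea only, not the paper's spectroscopic calibration method:

```python
# Minimal sketch of bagging for regression (predictions are averaged).
# The base model and all names are illustrative assumptions.
import random

def train_1nn_reg(sample):
    # base regressor: return the y of the nearest training x
    return lambda x: min(sample, key=lambda p: abs(p[0] - x))[1]

def bagged_regressor(data, n_models=25, seed=0):
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        # bootstrap re-sampling with replacement
        boot = [rng.choice(data) for _ in range(len(data))]
        models.append(train_1nn_reg(boot))
    # for regression the ensemble AVERAGES its members' predictions
    return lambda x: sum(m(x) for m in models) / len(models)

# toy data on the line y = 2x
data = [(x, 2.0 * x) for x in range(20)]
f = bagged_regressor(data)
```

Averaging many bootstrap models reduces the variance of an unstable ("non-robust") base model, which is why the technique suits the non-linear calibration setting described above.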

2009
Tadeusz Lasota Zbigniew Telec Bogdan Trawinski Krzysztof Trawinski

The reported study investigated to what extent the bagging approach could improve the accuracy of machine learning regression models. Four algorithms implemented in the KEEL tool, including two evolutionary fuzzy systems, decision trees for regression, and a neural network, were used in the experiments. The results showed that some bagging ensembles ensured higher predic...

[Chart: number of search results per year; click the chart to filter results by publication year]