Search results for: vacuum bagging

Number of results: 49,193  

1999
Richard Maclin David Opitz

An ensemble consists of a set of independently trained classifiers (such as neural networks or decision trees) whose predictions are combined when classifying novel instances. Previous research has shown that an ensemble as a whole is often more accurate than any of the single classifiers in the ensemble. Bagging (Breiman 1996a) and Boosting (Freund & Schapire 1996) are two relatively new but ...

2005
Jan Macek Supphanat Kanokphara Anja Geumann

HMMs are the dominant technique used in speech recognition today, since they perform well in overall phone recognition. In this paper, we compare HMM methods with machine learning techniques, such as neural networks, decision trees, and ensemble classifiers with boosting and bagging, on the task of articulatory-acoustic feature classification. The experimental results show that HMM...

2014
Wenge Rong Yifan Nie Yuanxin Ouyang Baolin Peng Zhang Xiong

Sentiment analysis has long been a hot topic for understanding users' statements online. Previously, many machine learning approaches for sentiment analysis, such as simple feature-oriented SVMs or more complicated probabilistic models, have been proposed. Though they have demonstrated capability in polarity detection, there exists one challenge called the curse of dimensionality, due to the high dime...

2000
Yves Grandvalet

Bagging is a procedure that averages estimators trained on bootstrap samples. Numerous experiments have shown that bagged estimates often yield better results than the original predictor, and several explanations have been given to account for this gain. However, six years after its introduction, bagging is still not fully understood. Most explanations given until now are based on global properties ...

2012
Meishan Zhang Wanxiang Che Ting Liu

We describe our method of traditional Phrase Structure Grammar (PSG) parsing in CIPS-Bakeoff2012 Task3. First, bagging is proposed to enhance the baseline performance of PSG parsing. Then we suggest exploiting another TreeBank (CTB7.0) to improve the performance further. Experimental results on the development data set demonstrate that bagging can boost the baseline F1 score from 81.33% to 84.4...

1996
Leo Breiman

Bagging predictors is a method for generating multiple versions of a predictor and using these to get an aggregated predictor. The aggregation averages over the versions when predicting a numerical outcome and does a plurality vote when predicting a class. The multiple versions are formed by making bootstrap replicates of the learning set and using these as new learning sets. Tests on real and ...
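The procedure Breiman describes — bootstrap replicates of the learning set, one base model per replicate, and a plurality vote for classification — can be sketched as follows. The 1-nearest-neighbour base learner and the toy 1-D dataset are illustrative assumptions for this sketch, not part of the original abstract (Breiman's experiments used trees and regression methods):

```python
import random
from collections import Counter

def bootstrap(data, rng):
    """Draw a bootstrap replicate: len(data) draws with replacement."""
    return [rng.choice(data) for _ in data]

def bag(data, fit, n_models=25, seed=0):
    """Fit one base classifier per bootstrap replicate of the learning set."""
    rng = random.Random(seed)
    return [fit(bootstrap(data, rng)) for _ in range(n_models)]

def predict(models, x):
    """Combine the ensemble by plurality vote, as in Breiman (1996)."""
    return Counter(m(x) for m in models).most_common(1)[0][0]

def fit_1nn(train):
    """Illustrative base learner: 1-nearest neighbour on a 1-D feature."""
    return lambda x: min(train, key=lambda p: abs(p[0] - x))[1]

# Toy 1-D dataset: class "a" near 0, class "b" near 10, plus one "b" outlier.
data = [(0.0, "a"), (1.0, "a"), (2.0, "a"),
        (3.0, "b"), (9.0, "b"), (10.0, "b")]
models = bag(data, fit_1nn, n_models=25)
print(predict(models, 0.5))
```

For a numerical outcome, the voting step would instead average the models' predictions, matching the aggregation rule described above.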

2000
Andreas Buja Werner Stuetzle

Bagging is a device intended for reducing the prediction error of learning algorithms. In its simplest form, bagging draws bootstrap samples from the training sample, applies the learning algorithm to each bootstrap sample, and then averages the resulting prediction rules. We study the von Mises expansion of a bagged statistical functional and show that it is related to the Stein-Efron ANOVA ex...
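The "bagged statistical functional" in its simplest form can be written generically: apply the functional to each bootstrap replicate and average the results. The choice of the sample mean as the functional below is an illustrative assumption (for a linear statistic like the mean, bagging approximately reproduces the original estimate):

```python
import random

def bagged(functional, sample, n_reps=200, seed=1):
    """Apply `functional` to n_reps bootstrap replicates of `sample`
    and average the results -- the bagged version of the functional."""
    rng = random.Random(seed)
    n = len(sample)
    reps = [[rng.choice(sample) for _ in range(n)] for _ in range(n_reps)]
    return sum(functional(r) for r in reps) / n_reps

mean = lambda xs: sum(xs) / len(xs)
print(bagged(mean, [1.0, 2.0, 3.0, 4.0, 5.0]))  # close to the plain mean, 3.0
```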

1996
J. Ross Quinlan

Breiman's bagging and Freund and Schapire's boosting are recent methods for improving the predictive power of classifier learning systems. Both form a set of classifiers that are combined by voting, bagging by generating replicated bootstrap samples of the data, and boosting by adjusting the weights of training instances. This paper reports results of applying both techniques to a system that le...

2011
Quan Sun Bernhard Pfahringer

Ensemble selection has recently appeared as a popular ensemble learning method, not only because its implementation is fairly straightforward, but also due to its excellent predictive performance on practical problems. The method has been highlighted in winning solutions of many data mining competitions, such as the Netflix competition, the KDD Cup 2009 and 2010, the UCSD FICO contest 2010, and...

1998
Bruce A. Draper Kyungim Baek

Previous research has shown that aggregated predictors improve the performance of non-parametric function approximation techniques. This paper presents the results of applying aggregated predictors to a computer vision problem, and shows that the method of bagging significantly improves performance. In fact, the results are better than those previously reported on other domains. This paper expla...

Chart of the number of search results per year
