Search results for: bagging model
Number of results: 122,039
Imbalanced class problems appear in many real applications of classification learning. We propose a novel sampling method to improve bagging for data sets with skewed class distributions. In our new sampling method, “Roughly Balanced Bagging” (RB Bagging), the numbers of samples in the largest and smallest classes differ, but they are effectively balanced when averaged over all subsets, wh...
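The negative-binomial draw below follows the usual description of RB Bagging: each bag keeps a full bootstrap of the minority class, while the majority-class sample size is itself random. A minimal sketch in Python, assuming a binary problem; the helper name `roughly_balanced_bags` is chosen for illustration and is not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def roughly_balanced_bags(y, n_bags=10, minority_label=1):
    """Sketch of Roughly Balanced Bagging: bootstrap the minority class at
    its full size; draw the majority sample size from a negative binomial
    so the two classes balance only on average across bags."""
    minority = np.flatnonzero(y == minority_label)
    majority = np.flatnonzero(y != minority_label)
    n_min = len(minority)
    bags = []
    for _ in range(n_bags):
        # With p = 0.5 the expected majority draw equals n_min, so bags
        # are balanced in expectation but not individually.
        n_maj = rng.negative_binomial(n_min, 0.5)
        idx = np.concatenate([
            rng.choice(minority, size=n_min, replace=True),
            rng.choice(majority, size=n_maj, replace=True),
        ])
        bags.append(idx)
    return bags
```

Because the majority size varies around the minority size rather than being clamped to it, each bag remains a genuine random sample, which is the property the abstract emphasizes.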
We investigate machine learning techniques for coping with highly skewed class distributions in two spontaneous speech processing tasks. Both tasks, sentence boundary and disfluency detection, provide important structural information for downstream language processing modules. We examine the effect of data set size, task, sampling method (no sampling, downsampling, oversampling, and ensemble sa...
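For readers unfamiliar with the sampling options this abstract compares, here is a minimal sketch of plain downsampling and oversampling; the function names are illustrative, the majority class is assumed to be the larger one, and the paper's ensemble sampling scheme is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)

def downsample(X, y, majority_label=0):
    # Drop random majority rows until classes match the minority count.
    maj = np.flatnonzero(y == majority_label)
    mino = np.flatnonzero(y != majority_label)
    keep = np.concatenate([rng.choice(maj, size=len(mino), replace=False), mino])
    return X[keep], y[keep]

def oversample(X, y, minority_label=1):
    # Keep all rows and replicate random minority rows (with replacement)
    # until the classes are the same size.
    mino = np.flatnonzero(y == minority_label)
    maj = np.flatnonzero(y != minority_label)
    extra = rng.choice(mino, size=len(maj) - len(mino), replace=True)
    keep = np.concatenate([maj, mino, extra])
    return X[keep], y[keep]
```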
Two learning ensemble methods, Bagging and Boosting, have been applied to decision trees to improve classification accuracy over that of a single decision tree learner. We introduce Bagging and propose a variant of it, Improved Bagging, which generally outperforms the original bagging algorithm. We experiment on 22 datasets from the UCI repository, with emphasis on the ensemble’s accuracy ...
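The Improved Bagging variant is not specified in this snippet, but the baseline it builds on is standard bagging of decision trees. A minimal sketch with scikit-learn, using a UCI-style data set bundled with the library in place of the paper's 22 datasets:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# A single tree versus a bagged ensemble of 50 trees on the same data.
tree = DecisionTreeClassifier(random_state=0)
bagged = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                           random_state=0)

print("single tree :", cross_val_score(tree, X, y, cv=5).mean())
print("bagged trees:", cross_val_score(bagged, X, y, cv=5).mean())
```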
Gradient boosting and bagging applied to regressors can reduce the error due to bias and variance, respectively. Alternatively, Stochastic Gradient Boosting (SGB) and Iterated Bagging (IB) attempt to simultaneously reduce the contributions of both bias and variance to the error. We provide an extensive empirical analysis of these methods, along with two alternate bias-variance reduction approaches, ...
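As a point of reference, Stochastic Gradient Boosting corresponds to fitting each boosting stage on a random subsample of the training rows; in scikit-learn this is exposed through the `subsample` parameter. A minimal sketch (Iterated Bagging has no stock implementation and is not shown):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=10, noise=10.0,
                       random_state=0)

# subsample < 1.0 turns plain gradient boosting into Stochastic Gradient
# Boosting: each stage sees a random half of the training rows.
sgb = GradientBoostingRegressor(subsample=0.5, n_estimators=200,
                                random_state=0)
print("SGB mean R^2:", cross_val_score(sgb, X, y, cv=5).mean())
```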
Bagging has been found to be successful in increasing the predictive performance of unstable classifiers. Bagging draws bootstrap samples from the training sample, applies the classifier to each bootstrap sample, and then averages over all obtained classification rules. The idea of trimmed bagging is to exclude the bootstrapped classification rules that yield the highest error rates, as estimat...
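A minimal sketch of the trimmed-bagging idea as the abstract states it, assuming numpy arrays with non-negative integer class labels; the trim fraction and number of bags are illustrative, not the paper's settings:

```python
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def trimmed_bagging_fit(X, y, base=DecisionTreeClassifier(),
                        n_bags=25, trim=0.2):
    """Fit bootstrap classifiers, score each on its out-of-bag rows, and
    keep only the (1 - trim) fraction with the lowest estimated error."""
    n = len(y)
    fitted, oob_errors = [], []
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)        # bootstrap sample
        oob = np.setdiff1d(np.arange(n), idx)   # rows left out of the bag
        clf = clone(base).fit(X[idx], y[idx])
        err = np.mean(clf.predict(X[oob]) != y[oob]) if len(oob) else 1.0
        fitted.append(clf)
        oob_errors.append(err)
    keep = np.argsort(oob_errors)[: int(np.ceil((1 - trim) * n_bags))]
    return [fitted[i] for i in keep]

def trimmed_bagging_predict(models, X):
    # Majority vote over the retained classifiers (integer labels assumed).
    votes = np.stack([m.predict(X) for m in models]).astype(int)
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)
```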
Bagging and boosting are two general techniques for building predictors based on small samples from a dataset. We show that boosting can be parallelized, and then present performance results for parallelized bagging and boosting using OC1 decision trees and two standard datasets. The main results are that sample sizes limit achievable accuracy, regardless of computational time spent; that paral...
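OC1's oblique decision trees have no stock implementation in common Python libraries, so the sketch below substitutes CART-style trees; it only illustrates the point that bootstrap replicates are independent and can therefore be trained in parallel:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, random_state=0)

# Each bagged model is fit on its own bootstrap sample, independently of
# the others; n_jobs=-1 trains the members across all available cores.
clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                        n_jobs=-1, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```

Boosting, by contrast, is sequential by construction, which is why the abstract treats its parallelization as a result rather than a given.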
background & aim: in many medical studies, one data set is used to construct the model, and to test its performance. this approach is prone to over optimization, and leads to statistics with low chance of external validity. data splitting can be used to create training and test sets but the cost is reduction in power. the aim of this study was to demonstrate the ability of bootstrap aggregating...
Internal egg hatching in Caenorhabditis elegans, "worm bagging," is induced by exposure to bacteria. This study demonstrates that the determination of worm bagging frequency allows for advanced insight into the degree of bacterial pathogenicity and is highly predictive of the survival of worm populations. Therefore, worm bagging frequency can be regarded as a reliable population-wide stress rep...
Two bagging approaches, n/2-out-of-n without replacement (subagging) and n-out-of-n with replacement (bagging), have been applied to the problem of estimating the parameters of a multivariate mixture model. It has been observed, via Monte Carlo simulations and a real data example, that both bagging methods reduce the standard deviation of the maximum likelihood estimator of the mix...
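A toy sketch of the two resampling schemes, using the sample median of a univariate two-component mixture as a stand-in for the multivariate mixture MLE in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

def bagged_estimate(x, estimator, n_reps=200, subsample=False):
    """Aggregate an estimator over resamples: n-out-of-n with replacement
    (bagging) or n/2-out-of-n without replacement (subagging)."""
    n = len(x)
    reps = []
    for _ in range(n_reps):
        if subsample:
            idx = rng.choice(n, size=n // 2, replace=False)  # subagging
        else:
            idx = rng.choice(n, size=n, replace=True)        # bagging
        reps.append(estimator(x[idx]))
    return np.mean(reps)

# Toy stand-in for the mixture MLE: the median of a two-component mixture.
x = np.concatenate([rng.normal(0, 1, 150), rng.normal(4, 1, 50)])
print("bagging  :", bagged_estimate(x, np.median))
print("subagging:", bagged_estimate(x, np.median, subsample=True))
```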