Search results for: bootstrap aggregating
Number of results: 18,325
Domain adaptation plays an important role in multi-domain SMT. Conventional approaches usually resort to statistical classifiers, but they require annotated monolingual data in different domains, which may not be available in some cases. We instead propose a simple but effective bagging-based approach without using any annotated data. Large-scale experiments show that our new method improves tr...
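The snippet does not spell out the bagging procedure itself, so the sketch below illustrates plain bootstrap aggregating for classification; the decision-tree base learner and integer class labels are assumptions for illustration, not part of the paper's SMT setup.

```python
# Minimal bagging sketch: train each base learner on a bootstrap resample of the
# data and combine predictions by majority vote. Assumes X, y are NumPy arrays
# and that class labels are coded as non-negative integers.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_estimators=10, random_state=0):
    rng = np.random.RandomState(random_state)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        idx = rng.randint(0, n, size=n)                 # sample n points with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    votes = np.stack([m.predict(X) for m in models])    # shape (n_estimators, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```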
In recent years, a number of works proposing the combination of multiple classifiers to produce a single classification have been reported in remote sensing literature. The resulting classifier, referred to as an ensemble classifier, is generally found to be more accurate than any of the individual classifiers making up the ensemble. As accuracy is the primary concern, much of the research in t...
Class imbalance problems have been reported to severely hinder the classification performance of many standard learning algorithms and have attracted a great deal of attention from researchers in different fields. Therefore, a number of methods, such as sampling methods, cost-sensitive learning methods, and bagging- and boosting-based ensemble methods, have been proposed to solve these problems...
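As a concrete illustration of the bagging-style remedies mentioned above, one common variant trains each ensemble member on a balanced bootstrap sample; the function name and base learner below are illustrative choices, not those of the cited work.

```python
# Balanced bagging sketch for two-class imbalance: keep a bootstrap of the minority
# class and draw an equally sized bootstrap of the majority class, so each base
# learner sees a balanced training set. Assumes NumPy arrays and binary labels.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def balanced_bagging_fit(X, y, n_estimators=10, random_state=0):
    rng = np.random.RandomState(random_state)
    classes, counts = np.unique(y, return_counts=True)
    minority = classes[counts.argmin()]
    min_idx = np.where(y == minority)[0]
    maj_idx = np.where(y != minority)[0]
    models = []
    for _ in range(n_estimators):
        boot_min = rng.choice(min_idx, size=len(min_idx), replace=True)
        boot_maj = rng.choice(maj_idx, size=len(min_idx), replace=True)  # under-sampled majority
        idx = np.concatenate([boot_min, boot_maj])
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models
```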
Ensemble methods make it possible to improve the accuracy of classification methods. This work considers the application of one of these methods, named Rotation-based, when the classifiers to combine are RBF Networks. For each member of the ensemble, this method transforms the data set using a pseudo-random rotation of the axes. The classifier is then constructed on the rotated data. The re...
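A rough sketch of the rotation idea follows; an RBF-kernel SVM is used here as a stand-in for the RBF Networks mentioned in the abstract, and the rotation is drawn via a QR decomposition (both are assumptions for illustration).

```python
# Rotation-based ensemble sketch: each member is trained on the data after a
# pseudo-random orthogonal rotation of the feature axes, then members vote.
# Assumes non-negative integer class labels.
import numpy as np
from sklearn.svm import SVC

def random_rotation(d, rng):
    Q, _ = np.linalg.qr(rng.normal(size=(d, d)))        # random orthogonal matrix
    return Q

def rotation_ensemble_fit(X, y, n_estimators=10, random_state=0):
    rng = np.random.RandomState(random_state)
    members = []
    for _ in range(n_estimators):
        R = random_rotation(X.shape[1], rng)
        members.append((R, SVC(kernel="rbf").fit(X @ R, y)))
    return members

def rotation_ensemble_predict(members, X):
    votes = np.stack([clf.predict(X @ R) for R, clf in members])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```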
In this paper we propose a novel approach for ensemble construction based on the use of nonlinear projections to achieve both accuracy and diversity of individual classifiers. The proposed approach combines the philosophy of boosting, putting more effort on difficult instances, with the basis of the random subspace method. Our main contribution is that instead of using a random subspace, we con...
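The abstract is cut off before the projection details, so the sketch below shows only the random subspace baseline it builds on (each member trains on a random feature subset); the boosting-style instance weighting and the paper's nonlinear projections are not reproduced here.

```python
# Random subspace baseline: each ensemble member is fit on a random subset of the
# features. This is the starting point the abstract says it modifies.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def random_subspace_fit(X, y, n_estimators=10, subspace_frac=0.5, random_state=0):
    rng = np.random.RandomState(random_state)
    d = X.shape[1]
    k = max(1, int(subspace_frac * d))
    members = []
    for _ in range(n_estimators):
        feats = rng.choice(d, size=k, replace=False)     # random feature subset
        members.append((feats, DecisionTreeClassifier().fit(X[:, feats], y)))
    return members
```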
Nowadays, classifier combination methods receive great attention from machine learning researchers. Classifier combination is a powerful tool for improving the accuracy of classifiers. This approach has become increasingly interesting, especially for real-world problems, which are often characterized by their imbalanced nature. The imbalanced distribution of data leads to poor performance of most of the conventional ...
Theoretical and experimental analyses of bagging indicate that it is primarily a variance reduction technique. This suggests that bagging should be applied to learning algorithms tuned to minimize bias, even at the cost of some increase in variance. We test this idea with Support Vector Machines (SVMs) by employing out-of-bag estimates of bias and variance to tune the SVMs. Experiments indicate...
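A sketch of the out-of-bag machinery such an approach relies on is given below; it only computes an OOB error estimate for bagged SVMs, while the bias/variance estimates and the tuning loop described in the abstract are omitted, and the SVM hyperparameters shown are placeholders.

```python
# Out-of-bag (OOB) estimation for bagged SVMs: each SVM is fit on a bootstrap
# sample, and every training point is scored only by members that did not see it.
# Assumes non-negative integer class labels.
import numpy as np
from sklearn.svm import SVC

def bagged_svm_oob_error(X, y, n_estimators=25, C=1.0, gamma="scale", random_state=0):
    rng = np.random.RandomState(random_state)
    n = len(X)
    votes = [[] for _ in range(n)]
    for _ in range(n_estimators):
        idx = rng.randint(0, n, size=n)                  # bootstrap sample
        oob = np.setdiff1d(np.arange(n), idx)            # points this member never saw
        if len(oob) == 0:
            continue
        clf = SVC(C=C, gamma=gamma).fit(X[idx], y[idx])
        for i, pred in zip(oob, clf.predict(X[oob])):
            votes[i].append(pred)
    # Majority vote of OOB predictions; fall back to the true label for the rare
    # point that appeared in every bootstrap (it then contributes no error).
    oob_pred = np.array([np.bincount(v).argmax() if v else y[i]
                         for i, v in enumerate(votes)])
    return float(np.mean(oob_pred != y))
```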
Ensemble classification methods have been shown to produce more accurate predictions than the base component models (Bauer and Kohavi 1999). Due to their effectiveness, ensemble approaches have been applied in a wide range of domains to improve classification. The expected prediction error of classification models can be decomposed into bias and variance (Friedman 1997). Ensemble methods that i...
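The decomposition referred to here is usually written, in its squared-error form, as below; the classification-specific variants (e.g. Friedman 1997) differ in detail but carry the same structure.

```latex
\mathbb{E}\!\left[(y - \hat{f}(x))^2\right]
  = \underbrace{\left(\mathbb{E}[\hat{f}(x)] - f(x)\right)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\!\left[\left(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\right)^2\right]}_{\text{variance}}
  + \sigma^2_{\text{noise}}
```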
This paper proposes the application of bagging to obtain more robust and accurate predictions using Gaussian process regression models. The training data is re-sampled using the bootstrap method to form several training sets, from which multiple Gaussian process models are developed and combined through weighting to provide predictions. A number of weighting methods for model combination are di...
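A hedged sketch of the procedure described here follows; the kernel and the inverse-variance weighting are illustrative choices, since the abstract only says that several weighting methods are discussed.

```python
# Bagged Gaussian process regression: fit one GP per bootstrap resample and combine
# the predictive means, here with inverse-variance weights (one possible scheme).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def bagged_gp_fit(X, y, n_estimators=10, random_state=0):
    rng = np.random.RandomState(random_state)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        idx = rng.randint(0, n, size=n)                  # bootstrap resample
        kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
        models.append(GaussianProcessRegressor(kernel=kernel).fit(X[idx], y[idx]))
    return models

def bagged_gp_predict(models, X):
    means, stds = zip(*(m.predict(X, return_std=True) for m in models))
    means, variances = np.stack(means), np.stack(stds) ** 2
    w = 1.0 / (variances + 1e-12)                        # inverse-variance weights
    return (w * means).sum(axis=0) / w.sum(axis=0)
```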