Search results for: bagging model
Number of results: 122039
Bagging forms a committee of classifiers by bootstrap aggregation of training sets from a pool of training data. A simple alternative to bagging is to partition the data into disjoint subsets. Experiments on various datasets show that, given the same size partitions and bags, disjoint partitions result in better performance than bootstrap aggregates (bags). Many applications (e.g., protein struc...
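The contrast this abstract draws can be sketched in a few lines; `bootstrap_bags` and `disjoint_partitions` are illustrative names, not from the paper:

```python
import random

def bootstrap_bags(pool, n_bags):
    """Bootstrap aggregation: each bag samples the pool with
    replacement and has the same size as the pool."""
    n = len(pool)
    return [[random.choice(pool) for _ in range(n)] for _ in range(n_bags)]

def disjoint_partitions(pool, n_parts):
    """The alternative: shuffle once, then deal the pool into
    disjoint, near-equal subsets."""
    shuffled = list(pool)
    random.shuffle(shuffled)
    return [shuffled[i::n_parts] for i in range(n_parts)]
```

A partition uses every training example exactly once, while a bag omits roughly 37% of the pool and repeats others; that is the structural difference the experiments compare.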
The incidence of Dupuytren's contracture in a polyvinyl chloride (PVC) manufacturing plant, where a great deal of bagging and packing took place by hand, was higher than in another plant in which there was no bagging or packing. The incidence in the packing plant was double that found in an earlier survey by Early at Crewe Locomotive Works of 4801 individuals, most of whom were manual workers. ...
Aiming to solve the existing problems of 3D model retrieval based on neural networks, this paper proposes a new algorithm based on BP-bagging. Through bagging, the algorithm turns weak classifiers into a strong one. For feature extraction, the algorithm projects the 3D model into six 2D images from six perspective points, then transforms the images into the frequency domain and gets the high dimension f...
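As a loose sketch of the six-projection step only, the following projects a 3D point cloud onto six axis-aligned orthographic views; the paper's actual perspective points and frequency-domain transform are not specified in this excerpt:

```python
def six_view_projections(points):
    """Project a 3D point cloud onto six axis-aligned 2D views
    (an assumed stand-in for the paper's six perspective points)."""
    views = []
    for axis in (0, 1, 2):                      # drop x, y, or z in turn
        keep = [i for i in (0, 1, 2) if i != axis]
        front = [(p[keep[0]], p[keep[1]]) for p in points]
        back = [(-u, v) for u, v in front]      # mirror for the opposite-side view
        views.extend([front, back])
    return views
```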
This paper investigates m-out-of-n bagging, with and without replacement, using genetic neural networks. The study was conducted with a newly developed system in Matlab to generate and test hybrid and multiple models of computational intelligence using different resampling methods. All experiments were conducted with real-world data derived from a cadastral system and regis...
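A minimal sketch of the two resampling variants the study compares, assuming a plain list as the training pool:

```python
import random

def m_out_of_n_sample(pool, m, replacement=True):
    """Draw an m-sized training sample from an n-sized pool,
    with replacement (classic bagging style) or without."""
    if replacement:
        return [random.choice(pool) for _ in range(m)]
    return random.sample(pool, m)  # without replacement: no duplicates
```

With `replacement=False` and `m < n` this is subsampling; with replacement it is m-out-of-n bagging proper.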
This paper proposes an approach to improve statistical word alignment with ensemble methods. Two ensemble methods are investigated: bagging and cross-validation committees. For both methods, weighted and unweighted voting are compared on the word alignment task. In addition, we analyze the effect of different training-set sizes on the bagging method. Experimental results ...
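The weighted vs. unweighted voting comparison can be illustrated as follows; `committee_vote` is a hypothetical helper, not from the paper:

```python
from collections import defaultdict

def committee_vote(predictions, weights=None):
    """Combine committee members' outputs by majority vote,
    optionally weighting each member's vote."""
    if weights is None:
        weights = [1.0] * len(predictions)  # unweighted: each member counts once
    score = defaultdict(float)
    for label, weight in zip(predictions, weights):
        score[label] += weight
    return max(score, key=score.get)
```

For example, `committee_vote(['a', 'b', 'b'], [0.9, 0.3, 0.3])` returns `'a'` even though `'b'` has more raw votes.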
Bagging is a method of obtaining more robust predictions when the model class under consideration is unstable with respect to the data, i.e., small changes in the data can cause the predicted values to change significantly. In this paper, we introduce a Bayesian version of bagging based on the Bayesian bootstrap. The Bayesian bootstrap resolves a theoretical problem with ordinary bagging and of...
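The key mechanical difference can be sketched: the ordinary bootstrap gives each data point an integer multinomial count (possibly zero), while the Bayesian bootstrap gives smooth Dirichlet(1, ..., 1) weights. A minimal sketch, using the standard exponential-gaps construction:

```python
import random

def bayesian_bootstrap_weights(n):
    """One draw of Dirichlet(1, ..., 1) weights: strictly positive and
    summing to 1, unlike ordinary bootstrap counts, which can zero out points."""
    gaps = [random.expovariate(1.0) for _ in range(n)]
    total = sum(gaps)
    return [g / total for g in gaps]

def weighted_mean(values, weights):
    """One posterior draw of the mean under the Bayesian bootstrap."""
    return sum(v * w for v, w in zip(values, weights))
```

Repeating the draw and refitting the model under each weight vector yields the Bayesian analogue of a bag of predictors.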
Bagging, boosting and random subspace methods are well-known resampling ensemble methods that generate and combine a diversity of learners using the same learning algorithm for the base regressor. In this work, we build an ensemble of bagging, boosting and random subspace ensembles, with 8 sub-regressors in each, and use an averaging methodology for the final prediction. We ...
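Given the per-instance predictions of the sub-ensembles, the final averaging step amounts to:

```python
def average_ensembles(per_ensemble_predictions):
    """Average the per-instance predictions of several sub-ensembles
    (e.g., one list each from bagging, boosting, and random subspaces)."""
    k = len(per_ensemble_predictions)
    n = len(per_ensemble_predictions[0])
    return [sum(preds[i] for preds in per_ensemble_predictions) / k
            for i in range(n)]
```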
Combining machine learning models is a means of improving overall accuracy. Various algorithms have been proposed to create aggregate models from other models, and two popular examples for classification are Bagging and AdaBoost. In this paper we examine their adaptation to regression, and benchmark them on synthetic and real-world data. Our experiments reveal that different types of AdaBoost a...
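A minimal sketch of bagging adapted to regression, using an assumed toy base learner (a one-feature least-squares line) rather than anything from the paper:

```python
import random

def fit_line(points):
    """Least-squares slope and intercept for a list of (x, y) pairs:
    the toy base regressor."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    slope = sxy / sxx
    return slope, my - slope * mx

def bagged_regression_predict(points, x, n_bags=25, seed=0):
    """Bagging for regression: average the base learner's predictions
    over bootstrap samples of the training data."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_bags):
        bag = [rng.choice(points) for _ in range(len(points))]
        if len({bx for bx, _ in bag}) < 2:
            continue  # skip degenerate bags where the slope is undefined
        a, b = fit_line(bag)
        preds.append(a * x + b)
    return sum(preds) / len(preds)
```

Averaging over bags mainly reduces the variance of an unstable base learner; boosting-style reweighting for regression differs and is not sketched here.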
This paper describes a set of experiments with bagging, a method that can improve the results of classification algorithms. We apply this method to classification algorithms that generate decision trees. Results of performance tests focused on the use of bagging with binary decision trees are presented. The minimum number of decision trees that enables an improvement of the classi...
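A sketch of bagging over the simplest binary decision tree, a one-split stump; the paper's actual tree learner is not given in this excerpt:

```python
import random

def fit_stump(rows):
    """Fit a depth-1 binary tree: pick the threshold on a single numeric
    feature that minimizes 0/1 training error on (x, label) rows."""
    best = None
    for t in sorted({x for x, _ in rows}):
        for left in (0, 1):
            err = sum((left if x <= t else 1 - left) != y for x, y in rows)
            if best is None or err < best[0]:
                best = (err, t, left)
    _, t, left = best
    return lambda x: left if x <= t else 1 - left

def bagged_stumps(rows, n_trees, seed=0):
    """Train n_trees stumps on bootstrap samples; predict by majority vote."""
    rng = random.Random(seed)
    trees = [fit_stump([rng.choice(rows) for _ in rows]) for _ in range(n_trees)]
    def predict(x):
        votes = sum(tree(x) for tree in trees)
        return 1 if 2 * votes >= n_trees else 0
    return predict
```

Sweeping `n_trees` while measuring test accuracy is one way to probe the minimum committee size the abstract mentions.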