Search results for: boosting
Number of results: 14818. Filter results by year:
Combining multiple global models (e.g. back-propagation based neural networks) is an effective technique for improving classification accuracy. This technique reduces variance by manipulating the distribution of the training data. In many large scale data analysis problems involving heterogeneous databases with attribute instability, standard boosting methods can be improved by coalescing multi...
Many learning tasks for computer vision problems can be described by multiple views or multiple features. These views can be exploited in order to learn from unlabeled data, a.k.a. “multi-view learning”. In these methods, the classifiers usually label a subset of the unlabeled data for each other iteratively and ignore the rest. In this work, we propose a new multi-view boosting algorithm that, unl...
The paper extends the notion of linear programming boosting to handle uneven datasets. Extensive experiments with text classification problem compare the performance of a number of different boosting strategies, concentrating on the problems posed by uneven datasets.
Real AdaBoost is a well-known and well-performing boosting method used to build machine ensembles for classification. Considering that its emphasis function can be decomposed into two factors that pay separate attention to sample errors and to their proximity to the classification border, a generalized emphasis function that combines both components by means of a selectable parameter, λ, is pre...
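The decomposition described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's code: for labels y ∈ {−1, +1}, Real AdaBoost's weight exp(−y·f) factorizes into an error term exp((f−y)²/2) and a boundary-proximity term exp(−f²/2), and a parameter λ blends the two. The exact form of the paper's generalized emphasis function is an assumption here.

```python
import numpy as np

def generalized_emphasis(f, y, lam):
    """Per-sample emphasis weights blending an error term (weighted by lam)
    and a boundary-proximity term (weighted by 1 - lam).
    lam = 0.5 recovers Real AdaBoost's weights up to a constant factor."""
    w = np.exp(lam * (f - y) ** 2 - (1.0 - lam) * f ** 2)
    return w / w.sum()  # normalize to a distribution over samples

f = np.array([0.8, -0.3, 0.1, -0.9])  # current ensemble outputs (stand-in values)
y = np.array([1, 1, -1, -1])          # class labels in {-1, +1}
w = generalized_emphasis(f, y, 0.5)
```

With λ = 0.5 the exponent equals −y·f + 1/2 (since y² = 1), so after normalization the weights coincide with Real AdaBoost's; λ > 0.5 emphasizes erroneous samples, λ < 0.5 emphasizes samples near the decision border.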
Abstract: This paper investigates the trajectory prediction and dispersion for unguided fin stabilized artillery rocket in order to explain the importance of the rocket production accuracy and the benefit of using guided rockets. The total dispersion results mainly from three effects. The first is the dispersion due to rocket production inaccuracy, which includes propellant mass, composition in...
Boosting is known to be sensitive to label noise. We studied two approaches to improving AdaBoost’s robustness against labelling errors. One is to employ a label-noise-robust classifier as a base learner, while the other is to modify the AdaBoost algorithm itself to be more robust. Empirical evaluation shows that a committee of robust classifiers, although it converges faster than non-label-noise-aware Ada...
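The sensitivity this abstract refers to can be demonstrated with a small experiment. This is a generic sketch, not the paper's setup: scikit-learn's AdaBoostClassifier on synthetic data, with a fraction of training labels flipped to simulate labelling errors.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (stand-in for a real benchmark).
X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def accuracy_with_noise(noise_rate):
    """Train AdaBoost after flipping noise_rate of the training labels;
    return test accuracy on the clean test set."""
    y_noisy = y_tr.copy()
    n_flip = int(noise_rate * len(y_noisy))
    flip = np.random.default_rng(0).choice(len(y_noisy), n_flip, replace=False)
    y_noisy[flip] = 1 - y_noisy[flip]
    clf = AdaBoostClassifier(n_estimators=50, random_state=0)
    clf.fit(X_tr, y_noisy)
    return clf.score(X_te, y_te)

clean_acc = accuracy_with_noise(0.0)
noisy_acc = accuracy_with_noise(0.2)
```

Comparing `clean_acc` and `noisy_acc` shows how much the boosted ensemble degrades under label noise; the approaches in the abstract aim to shrink that gap.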
Methods that create several classifiers out of one base classifier, so-called ensemble creation methods, have recently been proposed and successfully applied to many classification problems. One category of such methods is Boosting, with AdaBoost being the best-known procedure in this category. Boosting algorithms were first developed for two-class problems, but then extended to deal w...
Blind quantitative steganalysis aims to reveal details about hidden information without any prior knowledge of the steganography used. Machine learning can be used to estimate properties of the hidden message for blind quantitative steganalysis. We propose a quantitative steganalysis method based on the fusion of different steganalysis features, with an estimator that relies on gradient boosting. Experi...
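A gradient-boosting estimator of the kind this abstract describes can be sketched with scikit-learn. Everything here is a stand-in: random vectors replace the fused steganalysis features, and a synthetic embedding-rate target replaces the real hidden-message property; the paper's actual features and estimator details are not given in the snippet.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))            # stand-in for fused steganalysis feature vectors
rate = X[:, :3].sum(axis=1) * 0.1 + 0.5   # hypothetical embedding-rate target

# Gradient boosting fits regression trees stage-wise, each one to the
# residuals of the current ensemble.
model = GradientBoostingRegressor(n_estimators=100, max_depth=3, random_state=0)
model.fit(X[:150], rate[:150])
pred = model.predict(X[150:])
```

The same pattern (feature extraction, then a boosted regressor predicting a quantitative property such as payload size) is the usual shape of quantitative steganalysis estimators.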
This paper calculates the general form of a 5d metric when fundamental string and momentum charges are added. This is accomplished using the standard method of boosting and T-dualising a solution to Einstein’s equations, where the solution has three Killing vectors and is expressed in a generic form. The thermodynamical properties of the charged solution are derived and the physical implication...