Search results for: boosting

Number of results: 14818

Journal: Journal of High Energy Physics 2022

Abstract: Bondi-Metzner-Sachs (BMS) symmetries, or equivalently Conformal Carroll symmetries, are intrinsically associated to null manifolds and in two dimensions can be obtained as an Inönü-Wigner contraction of the two-dimensional (2d) relativistic conformal algebra. Instead of performing contractions, we demonstrate in this paper how this transmutation of symmetries can be achieved by infinite boosts or degenerate linear tra...
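
For orientation, the Inönü-Wigner contraction referred to here is conventionally written as follows; this is the standard textbook form, not an excerpt from the paper. Starting from two commuting copies of the Witt algebra $\mathcal{L}_n$, $\bar{\mathcal{L}}_n$ with
$$[\mathcal{L}_m,\mathcal{L}_n]=(m-n)\mathcal{L}_{m+n},\qquad [\bar{\mathcal{L}}_m,\bar{\mathcal{L}}_n]=(m-n)\bar{\mathcal{L}}_{m+n},$$
one defines $L_n=\mathcal{L}_n-\bar{\mathcal{L}}_{-n}$ and $M_n=\epsilon\,(\mathcal{L}_n+\bar{\mathcal{L}}_{-n})$ and sends $\epsilon\to 0$, which yields the (uncentered) BMS/Conformal Carroll brackets
$$[L_m,L_n]=(m-n)L_{m+n},\qquad [L_m,M_n]=(m-n)M_{m+n},\qquad [M_m,M_n]=0.$$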

2010
Vitaly Feldman

We consider the problem of boosting the accuracy of weak learning algorithms in the agnostic learning framework of Haussler (1992) and Kearns et al. (1992). Known algorithms for this problem (Ben-David et al., 2001; Gavinsky, 2002; Kalai et al., 2008) follow the same strategy as boosting algorithms in the PAC model: the weak learner is executed on the same target function but over different dis...
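
To make that shared strategy concrete, here is a minimal sketch of a distribution-reweighting boosting loop: the labelled sample stays fixed and only the weights shown to the weak learner change from round to round. It is a generic AdaBoost-style illustration under simple assumptions (a one-dimensional decision stump standing in for the weak learner, synthetic data), not the algorithm of any of the papers cited above.

```python
import numpy as np

class Stump:
    """One-dimensional decision stump: +1 above a threshold, -1 below (or the reverse)."""
    def fit(self, x, y, w):
        # Try every observed threshold and both orientations; keep the least weighted error.
        best = (np.inf, 0.0, 1)
        for t in np.unique(x):
            for sign in (+1, -1):
                pred = sign * np.where(x >= t, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, t, sign)
        self.threshold, self.sign = best[1], best[2]
        return self

    def predict(self, x):
        return self.sign * np.where(x >= self.threshold, 1, -1)

def boost(x, y, rounds=10):
    """AdaBoost-style loop: reweight the fixed sample, rerun the weak learner each round."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # current distribution over the same sample
    hypotheses, alphas = [], []
    for _ in range(rounds):
        h = Stump().fit(x, y, w)
        pred = h.predict(x)
        err = max(np.sum(w[pred != y]), 1e-12)
        if err >= 0.5:                       # weak learner no better than chance here
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)       # emphasise examples the hypothesis got wrong
        w /= w.sum()
        hypotheses.append(h)
        alphas.append(alpha)
    return hypotheses, alphas

def predict(hypotheses, alphas, x):
    return np.sign(sum(a * h.predict(x) for h, a in zip(hypotheses, alphas)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 200)
    y = np.where(x > 0.1, 1, -1)
    hs, al = boost(x, y, rounds=5)
    print("training accuracy:", np.mean(predict(hs, al, x) == y))
```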

Journal: Neural Computation 2000

Journal: Journal of Machine Learning Research 2010
Indraneel Mukherjee Robert E. Schapire

Boosting combines weak classifiers to form highly accurate predictors. Although the case of binary classification is well understood, in the multiclass setting, the “correct” requirements on the weak classifier, or the notion of the most efficient boosting algorithms are missing. In this paper, we create a broad and general framework, within which we make precise and identify the optimal requir...
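
For readers skimming the listing: "combines weak classifiers to form highly accurate predictors" refers, in its simplest reading, to a weighted vote over the weak hypotheses. The snippet below shows only that combination step for the multiclass case; it is a generic weighted plurality vote, not the optimal scheme derived in the paper.

```python
import numpy as np

def weighted_plurality_vote(weak_predictions, alphas):
    """Combine multiclass weak classifiers by a weighted vote.

    weak_predictions: list of per-classifier label arrays, each of shape (n_examples,)
    alphas:           one nonnegative weight per weak classifier
    Returns, for each example, the label receiving the largest total weight."""
    weak_predictions = np.asarray(weak_predictions)          # shape (T, n)
    labels = np.unique(weak_predictions)
    scores = np.zeros((weak_predictions.shape[1], len(labels)))
    for preds, alpha in zip(weak_predictions, alphas):
        for j, label in enumerate(labels):
            scores[:, j] += alpha * (preds == label)
    return labels[np.argmax(scores, axis=1)]

# Toy usage: three weak classifiers over four examples with labels {0, 1, 2}.
preds = [[0, 1, 2, 2],
         [0, 1, 1, 2],
         [1, 1, 2, 0]]
print(weighted_plurality_vote(preds, alphas=[0.5, 0.3, 0.2]))   # -> [0 1 2 2]
```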

2002
Gábor Lugosi Nicolas Vayatis

The probability of error of classification methods based on convex combinations of simple base classifiers by “boosting” algorithms is investigated. We show in this talk that certain regularized boosting algorithms provide Bayes-risk consistent classifiers under the only assumption that the Bayes classifier may be approximated by a convex combination of the base classifiers. Non-asymptotic dist...
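
To fix notation for the statement above (standard conventions, assumed rather than quoted from the paper): with base classifiers $h_1,\dots,h_N$ taking values in $\{-1,+1\}$, a convex combination and the induced classifier have the form
$$f_\lambda(x)=\sum_{j=1}^{N}\lambda_j h_j(x),\qquad \lambda_j\ge 0,\ \sum_j\lambda_j=1,\qquad g_\lambda(x)=\operatorname{sign} f_\lambda(x),$$
and Bayes-risk consistency means that the probability of error $L(g_{\hat\lambda_n})$ of the classifier learned from $n$ samples converges to the Bayes risk $L^{*}=\inf_g \mathbb{P}\{g(X)\neq Y\}$ as $n\to\infty$.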

2003
Rong Zhang Alexander I. Rudnicky

This paper compares the performance of Boosting and non-Boosting training algorithms in large vocabulary continuous speech recognition (LVCSR) using ensembles of acoustic models. Both algorithms demonstrated significant word error rate reduction on the CMU Communicator corpus. However, the two algorithms produced comparable improvements, even though one would expect that the Boosting algorithm, whi...

2002
Osamu Watanabe

We discuss algorithmic aspects of boosting techniques, such as Majority Vote Boosting [Fre95], AdaBoost [FS97], and MadaBoost [DW00a]. Considering a situation where we are given a huge number of examples and asked to find some rule explaining these data, we show some reasonable algorithmic approaches for dealing with such a huge dataset by boosting techniques. Through this example, ...
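
One common way boosting is adapted to a huge dataset, in the boosting-by-filtering spirit of algorithms such as MadaBoost, is to draw a manageable working set from the data stream by rejection sampling against the current boosting weights, rather than reweighting every stored example. The sketch below shows only that filtering step; the helper names (`accept_probability`, `filtered_sample`, the toy stream and weight rule) are made up for illustration and do not reproduce any of the cited algorithms.

```python
import random

def accept_probability(example, weak_hypotheses, alphas):
    """Probability of keeping an example for the next weak-learning round.
    Illustrative rule: keep hard examples (wrong or small-margin) more often."""
    x, y = example
    margin = y * sum(a * h(x) for h, a in zip(weak_hypotheses, alphas))
    return min(1.0, 2.0 ** (-margin))        # decays as the margin grows, capped at 1

def filtered_sample(example_stream, weak_hypotheses, alphas, sample_size):
    """Rejection-sample a working set from a large stream using the boosting weights."""
    working_set = []
    for example in example_stream:
        if random.random() < accept_probability(example, weak_hypotheses, alphas):
            working_set.append(example)
            if len(working_set) == sample_size:
                break
    return working_set

def stream():
    """A stand-in for an effectively unbounded source of labelled examples."""
    while True:
        x = random.uniform(-1, 1)
        yield (x, 1 if x > 0 else -1)

h0 = lambda x: 1 if x > 0.3 else -1          # an imperfect weak hypothesis from an earlier round
subset = filtered_sample(stream(), [h0], [1.0], sample_size=500)
print(len(subset), "examples kept for the next round")
```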

2007
Yonatan Amit Ofer Dekel Yoram Singer

We describe, analyze and experiment with a boosting algorithm for multilabel categorization problems. Our algorithm includes as special cases previously studied boosting algorithms such as AdaBoost.MH. We cast the multilabel problem as multiple binary decision problems, based on a user-defined covering of the set of labels. We prove a lower bound on the progress made by our algorithm on each bo...
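
For context, the simplest instance of such a reduction, the AdaBoost.MH-style special case named above, turns every (example, label) pair into one binary question: is this label relevant to this example? The snippet below performs just that expansion; the covering-based generalisation studied in the paper is not reproduced here.

```python
def to_binary_instances(examples):
    """Expand a multilabel dataset into binary instances, one per (example, label) pair.

    examples: list of (features, set_of_relevant_labels) pairs.
    Returns ((features, label), +1/-1) instances, +1 meaning "label is relevant"."""
    label_set = sorted({l for _, labels in examples for l in labels})
    binary = []
    for features, labels in examples:
        for label in label_set:
            binary.append(((features, label), 1 if label in labels else -1))
    return binary

# Toy usage: two documents with labels drawn from {"sports", "politics", "tech"}.
data = [([0.3, 0.7], {"sports", "tech"}),
        ([0.9, 0.1], {"politics"})]
for instance in to_binary_instances(data):
    print(instance)
```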

2002
Saharon Rosset Eran Segal

Several authors have suggested viewing boosting as a gradient descent search for a good fit in function space. We apply gradient-based boosting methodology to the unsupervised learning problem of density estimation. We show convergence properties of the algorithm and prove that a strength of weak learnability property applies to this problem as well. We illustrate the potential of this approach...
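
The "gradient descent in function space" view is easiest to state in the familiar regression setting: at each stage a base learner is fitted to the negative gradient of the loss at the current fit, which for squared error is simply the residual. The sketch below shows that generic loop with a tiny regression stump as the base learner; it illustrates the general methodology, not the density-estimation algorithm developed in the paper.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a small regression stump to r, the negative gradient of the squared loss."""
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        sse = np.sum((r - pred) ** 2)
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda z, t=t, lv=lv, rv=rv: np.where(z <= t, lv, rv)

def gradient_boost(x, y, rounds=50, step=0.1):
    """Function-space gradient descent: F <- F + step * h, with h fitted to -dL/dF."""
    F = np.zeros_like(y, dtype=float)
    learners = []
    for _ in range(rounds):
        residuals = y - F                     # negative gradient of 0.5*(y - F)^2 in F
        h = fit_stump(x, residuals)
        F += step * h(x)
        learners.append(h)
    return lambda z: sum(step * h(z) for h in learners)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 1, 300)
    y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=300)
    model = gradient_boost(x, y)
    print("training MSE:", np.mean((model(x) - y) ** 2))
```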

1997
Steve Waterhouse Gary Cook

In this paper we investigate a number of ensemble methods for improving the performance of phoneme classification for use in a speech recognition system. We discuss boosting and mixtures of experts, both in isolation and in combination. We present results on an isolated word database. The results show that principled ensemble methods such as boosting and mixtures provide superior performance to ...

Chart: number of search results per year
