Overriding the Experts: A Stacking Method for Combining Marginal Classifiers
Authors
Abstract
The design of an optimal Bayesian classifier for multiple features depends on estimating multidimensional joint probability density functions and therefore requires a design sample size that grows exponentially with the number of dimensions. A method was developed that combines classification decisions from marginal density functions using an additional classifier. Unlike voting methods, this method can select a more appropriate class than those selected by the marginal classifiers, thus "overriding" their decisions. For two classes and two features, the method always achieves a probability of error no worse than that of the best marginal classifier.
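The combination scheme described above can be sketched in a few lines: train one classifier per feature, then fit a second-stage classifier on the pairs of marginal decisions. Everything below (the Gaussian data, the thresholds, the deliberately biased second marginal) is an illustrative assumption, not the paper's actual construction:

```python
import random
from collections import Counter

random.seed(0)

# Synthetic 2-class, 2-feature data (an assumption for this sketch):
# each feature is Gaussian around a class-dependent mean.
def make_data(n):
    data = []
    for _ in range(n):
        y = random.randint(0, 1)
        x1 = random.gauss(float(y), 0.7)
        x2 = random.gauss(float(y), 0.7)
        data.append(((x1, x2), y))
    return data

train, test = make_data(2000), make_data(1000)

# Marginal classifiers, each seeing one feature. The second threshold is
# deliberately miscalibrated so the meta-classifier has something to override.
def marginal1(x1):
    return 1 if x1 > 0.5 else 0   # midpoint of the class means

def marginal2(x2):
    return 1 if x2 > 1.5 else 0   # biased threshold

# Meta-classifier: for each pair of marginal decisions, predict the class
# most frequent in the training data -- it may disagree with both inputs.
cells = {}
for (x1, x2), y in train:
    key = (marginal1(x1), marginal2(x2))
    cells.setdefault(key, Counter())[y] += 1
stacker = {key: c.most_common(1)[0][0] for key, c in cells.items()}

def stacked(x1, x2):
    return stacker[(marginal1(x1), marginal2(x2))]

def error(predict, data):
    return sum(predict(*x) != y for x, y in data) / len(data)

err_stack = error(stacked, test)
err_m1 = error(lambda x1, x2: marginal1(x1), test)
err_m2 = error(lambda x1, x2: marginal2(x2), test)
print(f"stacked {err_stack:.3f}  marginal1 {err_m1:.3f}  marginal2 {err_m2:.3f}")
```

Unlike a majority vote, the second-stage table can learn that a "class 1" vote from the reliable marginal should win even when the biased marginal disagrees, which is the overriding behavior the abstract refers to.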
Similar Papers
Overriding the Experts: A Fusion Method for Combining Marginal Classifiers
The design of an optimal Bayesian classifier for multiple features is dependent on the estimation of multidimensional joint probability density functions and therefore requires a design sample size that increases exponentially with the number of dimensions. A method was developed that combines classification decisions from marginal density functions using an additional classifier. Unlike voting...
Combined binary classifiers with applications to speech recognition
Many applications require classification of examples into one of several classes. A common way of designing such classifiers is to determine the class based on the outputs of several binary classifiers. We consider some of the most popular methods for combining the decisions of the binary classifiers, and improve existing bounds on the error rates of the combined classifier over the training se...
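The scheme this snippet alludes to, deciding a multiclass problem from several binary classifiers, can be sketched with a one-vs-one majority vote. The three class means and the nearest-mean pairwise rules below are assumptions for illustration, not taken from the paper:

```python
from itertools import combinations

# Hypothetical 1-D, 3-class setup: each class is summarized by a mean.
means = {"a": 0.0, "b": 2.0, "c": 4.0}

def pairwise_classifier(c1, c2):
    # Binary rule for one pair of classes: pick the class whose mean is closer.
    def rule(x):
        return c1 if abs(x - means[c1]) < abs(x - means[c2]) else c2
    return rule

# One binary classifier per unordered pair of classes.
binary = {pair: pairwise_classifier(*pair) for pair in combinations(means, 2)}

def combined(x):
    # Combine the pairwise decisions by majority vote.
    votes = [rule(x) for rule in binary.values()]
    return max(set(votes), key=votes.count)

print(combined(0.3), combined(1.9), combined(3.8))
```

With three classes there are three binary classifiers, and the vote of the two classifiers involving the true class usually outweighs the irrelevant third.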
Troika - An improved stacking schema for classification tasks
The idea of ensemble methodology is to build a predictive model by integrating multiple models. It is well-known that ensemble methods can be used for improving prediction performance. Researchers from various disciplines such as statistics, machine learning, pattern recognition, and data mining have considered the use of ensemble methodology. Stacking is a general ensemble method in which a nu...
Is Combining Classifiers Better than Selecting the Best One?
We empirically evaluate several state-of-the-art methods for constructing ensembles of heterogeneous classifiers with stacking and show that they perform (at best) comparably to selecting the best classifier from the ensemble by cross-validation. We then propose a new method for stacking that uses multi-response model trees at the meta-level, and show that it clearly outperforms existing stacki...