Combining Classifier Guided by Semi-Supervision
Authors
Abstract:
This article proposes an algorithm for classifier ensemble combination. The proposed methodology is based on possibilistic aggregation to classify samples. The method optimizes an objective function that combines a context-identification term, a multi-criteria aggregation term, and a learning term. The optimization aims at learning contexts as compact clusters in subspaces of the high-dimensional feature space via unsupervised learning that includes a feature-discrimination component. The unsupervised clustering component assigns a degree of typicality to each data point in order to identify and reduce the effect of noisy or outlying samples. The proposed technique then learns the optimal combination parameters for each context. Experiments on artificial datasets and the standard SONAR dataset demonstrate that the proposed classifier ensemble outperforms the individual classifiers in the ensemble.
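The general scheme described in the abstract (possibilistic clustering of the feature space into contexts with typicality degrees, followed by learning per-context combination weights for the base classifiers) can be sketched roughly as below. This is a minimal illustrative sketch, not the paper's actual objective function or implementation: the function names, the PCM-style typicality formula, and the typicality-weighted least-squares fit of combination weights are assumptions made purely for illustration.

import numpy as np

def possibilistic_typicality(X, centers, eta):
    # PCM-style typicality degrees (assumed form): t_ik = 1 / (1 + ||x_k - c_i||^2 / eta_i)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)  # (n_samples, n_contexts)
    return 1.0 / (1.0 + d2 / eta[None, :])

def fit_contexts(X, n_contexts=3, n_iter=50, seed=0):
    # Toy possibilistic clustering: alternate typicality and center updates.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_contexts, replace=False)]
    eta = np.full(n_contexts, X.var())  # simple fixed scale parameter per context
    for _ in range(n_iter):
        T = possibilistic_typicality(X, centers, eta)          # (n, c)
        centers = (T.T @ X) / T.sum(axis=0, keepdims=True).T   # typicality-weighted means
    return centers, eta

def fit_context_weights(P, y, T):
    # Per-context aggregation weights: least-squares fit of classifier outputs P (n, m)
    # to labels y, with each sample weighted by its typicality in that context.
    weights = []
    for i in range(T.shape[1]):
        W = np.diag(T[:, i])
        w, *_ = np.linalg.lstsq(W @ P, W @ y, rcond=None)
        weights.append(w)
    return np.array(weights)  # (n_contexts, n_classifiers)

def predict(P_new, X_new, centers, eta, weights):
    # Blend each context's linear combination of classifier outputs by typicality.
    T = possibilistic_typicality(X_new, centers, eta)
    T = T / T.sum(axis=1, keepdims=True)
    scores = P_new @ weights.T            # (n, c): per-context combined scores
    return (T * scores).sum(axis=1)       # typicality-weighted final score

# Hypothetical usage with base-classifier outputs P (n x m), features X (n x d), labels y in {0, 1}:
#   centers, eta = fit_contexts(X)
#   weights = fit_context_weights(P, y, possibilistic_typicality(X, centers, eta))
#   y_hat = predict(P, X, centers, eta, weights) > 0.5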
Similar Resources
Guiding InfoGAN with Semi-supervision
In this paper we propose a new semi-supervised GAN architecture (ss-InfoGAN) for image synthesis that leverages information from few labels (as little as 0.22%, max. 10% of the dataset) to learn semantically meaningful and controllable data representations where latent variables correspond to label categories. The architecture builds on Information Maximizing Generative Adversarial Networks (In...
Experiments with Classifier Combining Rules
A large experiment on combining classifiers is reported and discussed. It includes, both, the combination of different classifiers on the same feature set and the combination of classifiers on different feature sets. Various fixed and trained combining rules are used. It is shown that there is no overall winning combining rule and that bad classifiers as well as bad feature sets may contain val...
Extremal Region Detection Guided by Maxima of Gradient Magnitude
A problem of computer vision applications is to detect regions of interest under different imaging conditions. The state-of-the-art maximally stable extremal regions (MSER) detector finds affine covariant regions by applying all possible thresholds on the input image, through three main steps including: 1) making a component tree of extremal regions' evolution (enumeration), 2) obtaining region ...
Scalable Semi-Supervised Classifier Aggregation
We present and empirically evaluate an efficient algorithm that learns to aggregate the predictions of an ensemble of binary classifiers. The algorithm uses the structure of the ensemble predictions on unlabeled data to yield significant performance improvements. It does this without making assumptions on the structure or origin of the ensemble, without parameters, and as scalably as linear lea...
Journal title
Volume 8, Issue 1
Pages 27-50
Publication date: 2017-02-01