Annealed Discriminant Analysis
Authors
Abstract
Motivated by analogies to statistical physics, the deterministic annealing (DA) method has been successfully demonstrated in a variety of applications. In this paper, we explore a new methodology for devising classifiers under the DA framework. The differential cost function is derived subject to a constraint on the randomness of the solution, which is governed by the temperature T. While gradually lowering the temperature, we can always find a good solution that both alleviates the overfitting problem and avoids poor local optima. Our approach is called annealed discriminant analysis (ADA). It is a general approach; in this paper we elaborate two instances of it, a distance-based classifier and an inner product-based classifier. The distance-based classifier is an annealed version of linear discriminant analysis (LDA), while the inner product-based classifier is a generalization of penalized logistic regression (PLR). As such, ADA provides new insights into the workings of these two classification algorithms. Experimental results show substantial performance gains over standard learning methods.
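As a rough illustration of the annealing mechanism described in the abstract, the following Python sketch trains a distance-based (centroid) classifier whose class posteriors follow a Gibbs distribution at temperature T, and gradually lowers T so the soft, high-entropy rule hardens toward a nearest-centroid rule. The squared-distance score, the cross-entropy surrogate cost, the gradient update, and the geometric cooling schedule are illustrative assumptions, not the paper's exact ADA cost function or algorithm.

import numpy as np

def gibbs_posteriors(X, centroids, T):
    # p(c | x) proportional to exp(-||x - mu_c||^2 / T); high T gives near-uniform
    # (maximally random) assignments, low T concentrates on the nearest centroid.
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    logits = -d2 / T
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    P = np.exp(logits)
    return P / P.sum(axis=1, keepdims=True)

def fit_annealed_centroids(X, y, T_init=10.0, T_min=0.05, cooling=0.8,
                           lr=0.1, inner_steps=50):
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    classes = np.unique(y)
    Y = (y[:, None] == classes[None, :]).astype(float)    # one-hot labels, n x K
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    T = T_init
    while T > T_min:                                       # outer annealing loop
        for _ in range(inner_steps):                       # optimize cost at fixed T
            P = gibbs_posteriors(X, centroids, T)
            R = Y - P                                      # n x K residuals
            # Gradient of the cross-entropy between the labels and the Gibbs
            # posteriors with respect to the centroids.
            grad = -(R.T @ X - R.sum(axis=0)[:, None] * centroids) * (2.0 / T)
            centroids -= lr * grad / len(X)
        T *= cooling                                       # lower the temperature
    return classes, centroids

def predict(classes, centroids, X):
    # After annealing, the hardened rule is simply nearest centroid.
    X = np.asarray(X, dtype=float)
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[d2.argmin(axis=1)]

At high T the posteriors are nearly uniform, which smooths the decision boundaries and acts as a regularizer; as T decreases, the rule sharpens, which is the mechanism the abstract credits with both controlling overfitting and avoiding poor local optima.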
Similar resources
A direct approach to sparse discriminant analysis in ultra-high dimensions
Sparse discriminant methods based on independence rules, such as the nearest shrunken centroids classifier (Tibshirani et al., 2002) and features annealed independence rules (Fan & Fan, 2008), have been proposed as computationally attractive tools for feature selection and classification with high-dimensional data. A fundamental drawback of these rules is that they ignore correlations among fea...
Full text
High Dimensional Classification Using Features Annealed Independence Rules (Jan 2007 preprint)
Classification using high-dimensional features arises frequently in many contemporary statistical studies such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classifications is poorly understood. In a seminal paper, Bickel and Levina (2004) show that the Fisher discriminant performs poorly due to diverging spectra and they propose...
Full text
High-dimensional Classification Using Features Annealed Independence Rules
Classification using high-dimensional features arises frequently in many contemporary statistical studies such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classifications is poorly understood. In a seminal paper, Bickel and Levina [Bernoulli 10 (2004) 989–1010] show that the Fisher discriminant performs poorly due to diverging spectra ...
Full text
High Dimensional Classification Using Features Annealed Independence Rules
Classification using high-dimensional features arises frequently in many contemporary statistical studies such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classifications is poorly understood. In a seminal paper, Bickel and Levina (2004) show that the Fisher discriminant performs poorly due to diverging spectra and they propose...
Full text
Face Recognition by Cognitive Discriminant Features
Face recognition is still an active pattern analysis topic. Faces have already been treated as objects or textures, but the human face recognition system takes a different approach. People refer to faces by their most discriminant features, and usually describe them in sentences like "She's snub-nosed", "he's got a long nose", or "he's got round eyes". These...
Full text