Margin Maximizing Discriminant Analysis
Authors
Abstract
We propose a new feature extraction method, Margin Maximizing Discriminant Analysis (MMDA), which extracts features suited to classification tasks. MMDA is based on the principle that an ideal feature should convey the maximum information about the class labels; it should depend only on the geometry of the optimal decision boundary, and not on those parts of the input distribution that play no role in shaping this boundary. Further, distinct feature components should convey unrelated information about the data. Two algorithms for computing the parameters of such a projection are proposed and shown to yield equivalent results. The kernel mapping idea is used to derive non-linear versions. Experiments on several real-world, publicly available data sets demonstrate that the new method yields competitive results.
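The recipe implied by the abstract — take the normal of a maximum-margin decision boundary as a feature direction, then look for further directions carrying unrelated information — can be sketched as follows. This is a minimal illustration only, assuming binary labels and using scikit-learn's LinearSVC as the margin maximizer, with deflation as one simple way to realize the "unrelated components" requirement; it is not the authors' published algorithm, and the kernelized variant is omitted.

    import numpy as np
    from sklearn.svm import LinearSVC

    def mmda_directions(X, y, n_components=2, C=1.0):
        """Sketch of MMDA-style feature extraction: each direction is the
        normal of a maximum-margin hyperplane, and the data are deflated
        (projected onto the orthogonal complement) before the next
        direction is sought, so successive components carry unrelated
        information."""
        X_work = np.asarray(X, dtype=float).copy()
        directions = []
        for _ in range(n_components):
            w = LinearSVC(C=C).fit(X_work, y).coef_.ravel()
            norm = np.linalg.norm(w)
            if norm < 1e-12:                 # no informative direction left
                break
            w = w / norm
            directions.append(w)
            # Deflate: remove the component along w from every sample.
            X_work -= np.outer(X_work @ w, w)
        return np.array(directions)          # shape (k, n_features)

    # Usage: project the data onto the extracted margin directions.
    # W = mmda_directions(X_train, y_train, n_components=2)
    # X_train_features = X_train @ W.T

Because the deflated data have no component along the directions already found, the next SVM solution is automatically orthogonal to them; the paper itself proposes two formulations that are shown to yield equivalent results, of which this deflation loop is merely an illustrative stand-in.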
Similar papers
Adaptive Quasiconformal Kernel Fisher Discriminant Analysis via Weighted Maximum Margin Criterion
Kernel Fisher discriminant analysis (KFD) is an effective method for extracting nonlinear discriminant features of input data using the kernel trick. However, conventional KFD algorithms suffer from the kernel selection problem as well as the singularity problem. In order to overcome these limitations, a novel nonlinear feature extraction method called adaptive quasiconformal kernel Fisher discriminant ana...
Risk Minimization and Minimum Description for Linear Discriminant Functions
Statistical learning theory provides a formal criterion for learning a concept from examples. This theory directly addresses the tradeoff between empirical fit and generalization. In practice, this leads to the structural risk minimization principle, where one minimizes a bound on the overall risk functional. For learning linear discriminant functions, this bound is impacted by the minimum of two ter...
Multiple Kernel Learning in Fisher Discriminant Analysis for Face Recognition
Recent applications and developments based on support vector machines (SVMs) have shown that using multiple kernels instead of a single one can enhance classifier performance. However, there are few reports on the performance of the kernel-based Fisher discriminant analysis (kernel-based FDA) method with multiple kernels. This paper proposes a multiple kernel construction ...
Maxi-Min discriminant analysis via online learning
Linear Discriminant Analysis (LDA) is an important dimensionality reduction algorithm, but its performance is often limited on multi-class data. This limitation arises because LDA maximizes the average divergence among classes, so similar classes with small divergence tend to be merged in the subspace. To address this problem, we propose a novel dimensionality re...
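The averaging effect described in this snippet is easy to reproduce. The following is my own synthetic illustration, not taken from the cited paper: one class lies far from the other two, so it dominates the average between-class scatter, and the single most discriminative LDA direction nearly merges the two nearby classes even though they are well separated along another axis.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    X = np.vstack([
        rng.normal([0.0, 0.0], 0.5, size=(100, 2)),    # class 0
        rng.normal([0.0, 3.0], 0.5, size=(100, 2)),    # class 1: separable from 0 along y
        rng.normal([20.0, 1.5], 0.5, size=(100, 2)),   # class 2: far away along x
    ])
    y = np.repeat([0, 1, 2], 100)

    # A 1-D LDA subspace maximizes the AVERAGE between-class divergence, so
    # the distant class 2 dominates the objective: the chosen direction is
    # close to the x-axis, and classes 0 and 1 collapse together.
    Z = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)
    for c in range(3):
        print(f"class {c}: mean={Z[y == c].mean():+.2f}, std={Z[y == c].std():.2f}")

Printing the per-class projection statistics shows classes 0 and 1 landing at nearly the same location, which is the behavior the Maxi-Min formulation is designed to avoid.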
Margin Maximizing Loss Functions
Margin maximizing properties play an important role in the analysis of classification models, such as boosting and support vector machines. Margin maximization is theoretically interesting because it facilitates generalization error analysis, and practically interesting because it presents a clear geometric interpretation of the models being built. We formulate and prove a sufficient condition fo...