Feature Evaluation using Quadratic Mutual Information
Abstract
Methods of feature evaluation are developed and discussed based on information-theoretic learning (ITL). Mutual information has been shown in the literature to be a more robust and precise criterion for evaluating a feature set. In this paper, we propose to use quadratic mutual information (QMI) for feature evaluation. The concept of information potential gives the evaluation functions a clearer physical meaning. Moreover, evaluation of feature sets in high-dimensional spaces can also be implemented efficiently. Experimental results are compared with classifier performance.

1 Quadratic mutual information

The uncertainty of a message can be measured by Shannon's entropy:

$$H_S(X) = -\sum_{k=1}^{N} p_k \log p_k \qquad (1)$$

where $\sum_{k=1}^{N} p_k = 1$ and $p_k > 0$. When using Renyi's entropy, the differential version is shown as [4]:

$$H_{R\alpha}(X) = \frac{1}{1-\alpha} \log \int f_X^{\alpha}(x)\, dx \qquad (2)$$
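The "information potential" mentioned in the abstract is, in the ITL literature, the Parzen-window estimate of $\int f_X^2(x)\,dx$, from which Renyi's quadratic entropy follows as $H_2(X) = -\log V(X)$. A minimal sketch in Python (function names and the 1-D Gaussian-kernel setup are illustrative, not taken from the paper):

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen estimate of V(X) = integral of f_X(x)^2 dx for 1-D
    samples x, using a Gaussian kernel of width sigma.
    The convolution of two Gaussians of variance sigma^2 is a
    Gaussian of variance 2*sigma^2, so V reduces to a pairwise sum."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = x[:, None] - x[None, :]                      # pairwise differences
    g = np.exp(-d**2 / (4 * sigma**2)) / np.sqrt(4 * np.pi * sigma**2)
    return g.sum() / n**2

def renyi_quadratic_entropy(x, sigma=1.0):
    """H_2(X) = -log V(X): tightly clustered samples give a large
    potential and hence a small entropy."""
    return -np.log(information_potential(x, sigma))
```

As a sanity check, samples drawn from a narrow distribution should yield a lower quadratic entropy than samples from a wide one.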
Similar references
Quadratic Mutual Information Feature Selection
We propose a novel feature selection method based on quadratic mutual information which has its roots in Cauchy–Schwarz divergence and Renyi entropy. The method uses the direct estimation of quadratic mutual information from data samples using Gaussian kernel functions, and can detect second order non-linear relations. Its main advantages are: (i) unified analysis of discrete and continuous dat...
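The Cauchy–Schwarz form of QMI described above can be estimated directly from samples when the target is a discrete class label. A sketch of such an estimator, assuming Torkkola-style Parzen estimates with Gaussian kernels (the helper names and the 1-D feature setup are illustrative):

```python
import numpy as np

def _gram(x, sigma):
    """Pairwise Gaussian kernel matrix for 1-D samples x."""
    d = x[:, None] - x[None, :]
    return np.exp(-d**2 / (4 * sigma**2)) / np.sqrt(4 * np.pi * sigma**2)

def cs_qmi(x, y, sigma=1.0):
    """Cauchy-Schwarz QMI between a continuous feature x and discrete
    labels y: log(V_J * V_M / V_C^2), which is >= 0 and equals 0 when
    x and y are independent (Cauchy-Schwarz inequality)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y)
    n = len(x)
    K = _gram(x, sigma)
    classes, counts = np.unique(y, return_counts=True)
    p = counts / n
    # V_J: potential of the joint density (within-class pairs only).
    v_j = sum(K[np.ix_(y == c, y == c)].sum() for c in classes) / n**2
    # V_M: potential of the product of the marginals.
    v_m = (p**2).sum() * K.sum() / n**2
    # V_C: cross-potential between joint and product of marginals.
    v_c = sum(pc * K[y == c, :].sum() for pc, c in zip(p, classes)) / n**2
    return np.log(v_j * v_m / v_c**2)
```

A feature whose distribution differs strongly across classes should score higher than pure noise, which is what makes this usable as a feature-selection criterion.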
An improvement direction for filter selection techniques using information theory measures and quadratic optimization
Filter selection techniques are known for their simplicity and efficiency. However, this kind of method does not take feature inter-redundancy into consideration. Consequently, the un-removed redundant features remain in the final classification model, giving lower generalization performance. In this paper we propose to use a mathematical optimization method that reduces inter-feature redun...
Feature Selection with Non-Parametric Mutual Information for Adaboost Learning
This paper describes a feature selection method based on quadratic mutual information. We describe the formulation needed to estimate the mutual information from the data. This paper is motivated by the high time cost of the training process when using classical boosting algorithms. The method allows reusing part of the training time spent in the first training process to speed up posterio...
Feature Selection Using Multi Objective Genetic Algorithm with Support Vector Machine
Different approaches have been proposed for feature selection to obtain a suitable feature subset from among all features. These methods search the feature space for feature subsets that satisfy some criteria or optimize several objective functions. The objective functions are divided into two main groups: filter and wrapper methods. In filter methods, feature subsets are selected according to some measu...
Feature Extraction by Non-Parametric Mutual Information Maximization
We present a method for learning discriminative feature transforms using as criterion the mutual information between class labels and transformed features. Instead of a commonly used mutual information measure based on Kullback-Leibler divergence, we use a quadratic divergence measure, which allows us to make an efficient non-parametric implementation and requires no prior assumptions about cla...