A class-dependent weighted dissimilarity measure for nearest neighbor classification problems
Authors
Abstract
A class-dependent weighted (CDW) dissimilarity measure in vector spaces is proposed to improve the performance of the nearest neighbor classifier. In order to optimize the required weights, an approach based on Fractional Programming is presented. Experiments with several standard benchmark data sets show the effectiveness of the proposed technique.
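As a rough illustration of the idea (not the paper's actual method), a 1-NN rule with a class-dependent weighted (CDW) Euclidean dissimilarity can be sketched as follows. The per-class weight vectors are assumed to be given, since the fractional-programming step that optimizes them is not detailed in the abstract.

```python
import math

def cdw_dissimilarity(x, prototype, weights):
    # weighted Euclidean dissimilarity; the weight vector belongs to
    # the class of the prototype (hence "class-dependent")
    return math.sqrt(sum((w * (a - b)) ** 2
                         for w, a, b in zip(weights, x, prototype)))

def nn_classify(x, prototypes, labels, class_weights):
    # 1-NN: return the label of the prototype at minimum
    # class-dependent weighted dissimilarity from x
    best = min(range(len(prototypes)),
               key=lambda i: cdw_dissimilarity(
                   x, prototypes[i], class_weights[labels[i]]))
    return labels[best]

# toy usage with uniform (unit) weights for both classes
prototypes = [[0.0, 0.0], [1.0, 1.0]]
labels = [0, 1]
class_weights = {0: [1.0, 1.0], 1: [1.0, 1.0]}
print(nn_classify([0.2, 0.1], prototypes, labels, class_weights))  # 0
```

With unit weights this reduces to the ordinary Euclidean 1-NN rule; the point of the CDW measure is that each class can stretch or shrink individual feature axes independently.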
Similar articles
A Nearest Neighbor Weighted Measure In Classification Problems
A weighted dissimilarity measure in vector spaces is proposed to optimize the performance of the nearest neighbor classifier. An approach to finding the required weights based on gradient descent is presented. Experiments with both synthetic and real data show the effectiveness of the proposed technique.
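A minimal version of such a gradient-based weight search might look like the following. The leave-one-out ratio objective (nearest same-class distance over nearest different-class distance) and the numerical gradient are assumptions made here for illustration; the cited work's actual criterion is not reproduced.

```python
import math

def wdist(w, a, b):
    # weighted Euclidean distance with per-feature weights w
    return math.sqrt(sum((wi * (ai - bi)) ** 2
                         for wi, ai, bi in zip(w, a, b)))

def criterion(w, X, y):
    # leave-one-out ratio summed over all samples: nearest same-class
    # distance over nearest different-class distance (smaller is better);
    # an assumed stand-in for the paper's objective
    total = 0.0
    for i in range(len(X)):
        same = min(wdist(w, X[i], X[j])
                   for j in range(len(X)) if j != i and y[j] == y[i])
        diff = min(wdist(w, X[i], X[j])
                   for j in range(len(X)) if y[j] != y[i])
        total += same / (diff + 1e-12)
    return total

def learn_weights(X, y, steps=50, lr=0.1, eps=1e-4):
    # plain gradient descent on the criterion, using a central-difference
    # numerical gradient; weights are clamped to stay positive
    w = [1.0] * len(X[0])
    for _ in range(steps):
        grad = []
        for d in range(len(w)):
            wp = w[:]; wp[d] += eps
            wm = w[:]; wm[d] -= eps
            grad.append((criterion(wp, X, y) - criterion(wm, X, y)) / (2 * eps))
        w = [max(wi - lr * gi, 1e-3) for wi, gi in zip(w, grad)]
    return w

# toy data: feature 0 separates the classes, feature 1 is noise
X = [[0.0, 0.0], [0.0, 1.0], [0.0, 0.5],
     [1.0, 0.0], [1.0, 1.0], [1.0, 0.5]]
y = [0, 0, 0, 1, 1, 1]
w = learn_weights(X, y)
# the informative feature ends up with the larger weight
```

Descending this objective shrinks the weight of features that vary within classes and grows the weight of features that separate them, which is the intuition behind learning a weighted dissimilarity for NN classification.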
Nearest Neighbor Classification with Improved Weighted Dissimilarity Measure
The usefulness and the efficiency of the k nearest neighbor classification procedure are well known. A simpler method uses only the single nearest prototype, i.e. k = 1, which is the method applied in this paper. One way to obtain a proper result is to use weighted dissimilarities implemented with a distance function on the prototype space. To improve the classification a...
Comparison of Log-linear Models and Weighted Dissimilarity Measures
We compare two successful discriminative classification algorithms on three databases from the UCI and STATLOG repositories. The two approaches are the log-linear model for the class posterior probabilities and class-dependent weighted dissimilarity measures for nearest neighbor classifiers. The experiments show that the maximum entropy based log-linear classifier performs better for the equiva...
On the Use of Diagonal and Class-Dependent Weighted Distances for the Probabilistic k-Nearest Neighbor
A probabilistic k-nn (PKnn) method was introduced in [13] from the Bayesian point of view. That work showed that posterior inference over the parameter k can be performed in a relatively straightforward manner using Markov Chain Monte Carlo (MCMC) methods. The method was extended by Everson and Fieldsend [14] to deal with metric learning. In this work we propose two different dissimilarities f...
Asymptotic derivation of the finite-sample risk of the k nearest neighbor classifier (Technical Report UVM-CS-1998-0101)
The finite-sample risk of the k nearest neighbor classifier that uses a weighted Lp metric as a measure of class similarity is examined. For a family of classification problems with smooth distributions in Rn, an asymptotic expansion for the risk is obtained in decreasing fractional powers of the reference sample size. An analysis of the leading expansion coefficients reveals that the optimal we...
Journal: Pattern Recognition Letters
Volume 21, Issue -
Pages -
Publication date 2000