Some improvements on NN based classifiers in metric spaces
Authors
Abstract
The nearest neighbour (NN) and k-nearest neighbour (k-NN) classification rules have been widely used in pattern recognition due to their simplicity and good behaviour. Exhaustive nearest-neighbour search may become impractical when facing large training sets, high-dimensional data, or expensive dissimilarity measures (distances). In recent years, many fast NN search algorithms have been developed to overcome these problems; many of them traverse a data structure (usually a tree), testing several candidate prototypes until the nearest neighbour is found. When these algorithms are extended to find the k nearest neighbours, classification time grows with k. In this paper we propose a new classification rule that reuses the prototypes selected by these algorithms during a 1-NN search as candidates for the nearest neighbour. To illustrate the behaviour of this rule, several fast and widely known NN search algorithms have been extended with it, obtaining classification results similar to those of a k-NN (k > 1) classifier without the extra computational overhead. In addition, previous work on approximate NN search in vector spaces has been extended to algorithms suitable for general metric spaces, and has been combined with the new classification rule.
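The idea can be sketched in code. The following is a minimal, hypothetical illustration (not the authors' implementation): a vantage-point tree, one of many tree structures usable for NN search in general metric spaces, is searched for the single nearest neighbour with the usual branch-and-bound pruning, and every prototype whose distance is actually computed along the way is recorded; the proposed rule then classifies by majority vote over those candidates instead of using only the single nearest neighbour. All names here are illustrative.

```python
import math
from collections import Counter

def dist(a, b):
    # Any metric works; Euclidean distance is used here for illustration.
    return math.dist(a, b)

class VPNode:
    """A simple vantage-point tree over (point, label) prototypes."""
    def __init__(self, prototypes):
        self.point, self.label = prototypes[0]   # first prototype as vantage point
        rest = prototypes[1:]
        self.inner = self.outer = None
        self.radius = 0.0
        if rest:
            dists = [dist(self.point, p) for p, _ in rest]
            self.radius = sorted(dists)[len(dists) // 2]  # median split radius
            inner = [pl for pl, d in zip(rest, dists) if d <= self.radius]
            outer = [pl for pl, d in zip(rest, dists) if d > self.radius]
            if inner:
                self.inner = VPNode(inner)
            if outer:
                self.outer = VPNode(outer)

def nn_search(node, q, best, candidates):
    """Standard 1-NN branch-and-bound search; logs every tested prototype."""
    if node is None:
        return
    d = dist(q, node.point)
    candidates.append(node.label)        # record this prototype as a candidate
    if d < best[0]:
        best[0], best[1] = d, node.label
    if d <= node.radius:
        nn_search(node.inner, q, best, candidates)
        if d + best[0] >= node.radius:   # outer ball may still hold the NN
            nn_search(node.outer, q, best, candidates)
    else:
        nn_search(node.outer, q, best, candidates)
        if d - best[0] <= node.radius:   # inner ball may still hold the NN
            nn_search(node.inner, q, best, candidates)

def classify(tree, q):
    """Proposed rule: vote among all candidates tested during the 1-NN search."""
    best = [float('inf'), None]
    candidates = []
    nn_search(tree, q, best, candidates)
    return Counter(candidates).most_common(1)[0][0]
```

The key point is that the extra candidates come for free: they are exactly the prototypes the 1-NN search had to compare anyway, so voting over them adds k-NN-like robustness without additional distance computations.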
Similar papers
Remarks on some recent M. Borcut's results in partially ordered metric spaces
In this paper, some recent results established by Marin Borcut [M. Borcut, Tripled fixed point theorems for monotone mappings in partially ordered metric spaces, Carpathian J. Math. 28, 2 (2012), 207--214] and [M. Borcut, Tripled coincidence theorems for monotone mappings in partially ordered metric spaces, Creat. Math. Inform. 21, 2 (2012), 135--142] are generalized and improved, with much sho...
Non-Euclidean or Non-metric Measures Can Be Informative
Statistical learning algorithms often rely on the Euclidean distance. In practice, non-Euclidean or non-metric dissimilarity measures may arise when contours, spectra or shapes are compared by edit distances or as a consequence of robust object matching [1,2]. It is an open issue whether such measures are advantageous for statistical learning or whether they should be constrained to obey the me...
Rational Geraghty Contractive Mappings and Fixed Point Theorems in Ordered $b_2$-metric Spaces
In 2014, Zead Mustafa introduced $b_2$-metric spaces as a generalization of both $2$-metric and $b$-metric spaces. New fixed point results for the classes of rational Geraghty contractive mappings of types I, II and III in the setup of $b_2$-metric spaces are investigated. Then, we prove some fixed point theorems under various contractive conditions in partially ordered $b_2$-metric spaces...
A Gradient-Based Metric Learning Algorithm for k-NN Classifiers
The Nearest Neighbor (NN) classification/regression techniques, besides their simplicity, are amongst the most widely applied and well studied techniques for pattern recognition in machine learning. A drawback, however, is the assumption of the availability of a suitable metric to measure distances to the k nearest neighbors. It has been shown that k-NN classifiers with a suitable distance metr...
$C$-class and $F(\psi,\varphi)$-contractions on $M$-metric spaces
Partial metric spaces were introduced by Matthews in 1994 as a part of the study of denotational semantics of data flow networks. In 2014 Asadi et al. [New Extension of $p$-Metric Spaces with Some fixed point Results on $M$-metric Spaces, J. Ineq. Appl. 2014 (2014): 18] extended partial metric spaces to $M$-metric spaces. In this work, we introduce the class of $F(\psi,\varphi)$-contrac...