Learning Nearest-Neighbor Classifiers with Hyperkernels

Authors

  • Hua Ouyang
  • Alexander Gray
Abstract

We consider improving the performance of k-Nearest Neighbor classifiers. A regularized kNN is proposed that learns an optimal dissimilarity function to replace the Euclidean metric. The learning process employs hyperkernels and shares a regularization framework similar to that of support vector machines (SVM). Its performance is shown to be consistently better than kNN, and is competitive with SVM.
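
To make the setting concrete, here is a minimal sketch of the baseline the paper improves on: standard kNN classification with a pluggable dissimilarity function. The `dissim` hook and all names are illustrative, not from the paper; by default it uses the Euclidean metric that the learned dissimilarity is meant to replace.

```python
import numpy as np

def knn_predict(X_train, y_train, x_query, k=3, dissim=None):
    """Classify x_query by majority vote among its k nearest training
    points under a given dissimilarity function. Defaults to the
    Euclidean metric; a learned dissimilarity would be passed in
    through `dissim` instead."""
    if dissim is None:
        dissim = lambda a, b: np.linalg.norm(a - b)
    # Dissimilarity from the query to every training point.
    d = np.array([dissim(x, x_query) for x in X_train])
    # Indices of the k smallest dissimilarities.
    nearest = np.argsort(d)[:k]
    # Majority vote among the neighbors' labels.
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]
```

Swapping in a different `dissim` changes the induced neighborhoods without touching the voting rule, which is the degree of freedom the proposed method optimizes.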

Similar articles

A Classification Method for E-mail Spam Using a Hybrid Approach for Feature Selection Optimization

Spam is unwanted email that is harmful to communications around the world. It is a growing problem for personal email, so detecting it is essential. Machine learning is well suited to this problem: its adaptive nature allows it to learn the patterns required for classification, and it shows good results. Nonetheless, in spam detection, there are a large num...

Multiple Views in Ensembles of Nearest Neighbor Classifiers

Multi-view classification is a machine learning methodology in which patterns or objects of interest are represented by a set of different views (sets of features) rather than the union of all views. In this paper, multiple views are employed in ensembles of nearest neighbor classifiers, where they demonstrate promising results in classifying a challenging data set of protein folds. In particular, u...

Prototype reduction techniques: A comparison among different approaches

The two main drawbacks of nearest neighbor based classifiers are high CPU cost when the number of samples in the training set is large, and performance that is extremely sensitive to outliers. Several attempts to overcome these drawbacks have been proposed in the pattern recognition field, aimed at selecting/generating an adequate subset of prototypes from the training set. The problem addressed in th...

Adaptive Nearest Neighbor Classifier Based on Supervised Ellipsoid Clustering

The nearest neighbor classifier is a widely used, effective method for multi-class problems. However, it suffers from the curse of dimensionality in high-dimensional spaces. To address this problem, many adaptive nearest neighbor classifiers have been proposed. In this paper, a locally adaptive nearest neighbor classification method based on a supervised learning style, which works well for the ...

ACO Based Feature Subset Selection for Multiple k-Nearest Neighbor Classifiers

The k-nearest neighbor (k-NN) algorithm is one of the most popular classification methods in various fields of pattern recognition and data mining. In k-nearest neighbor classification, a new query instance is classified by a majority vote among its k nearest neighbors. Recently, researchers have begun paying attention to combining a set of individual k-NN classifiers, each us...
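
The combination scheme described here — individual k-NN classifiers, each restricted to its own feature subset, combined by majority vote — can be sketched as follows. The feature subsets are assumed given (e.g., as an ACO search would select them); all names are illustrative.

```python
import numpy as np
from collections import Counter

def knn_vote(X, y, q, k):
    """Label of q by majority vote among its k Euclidean nearest neighbors."""
    d = np.linalg.norm(X - q, axis=1)
    idx = np.argsort(d)[:k]
    return Counter(y[idx]).most_common(1)[0][0]

def ensemble_knn(X, y, q, subsets, k=3):
    """Each base k-NN classifier sees only its feature subset;
    the ensemble prediction is the majority vote of their outputs."""
    votes = [knn_vote(X[:, list(s)], y, q[list(s)], k) for s in subsets]
    return Counter(votes).most_common(1)[0][0]
```

Each subset induces a different neighborhood structure, so the base classifiers can disagree; the ensemble vote smooths over subsets that happen to include uninformative features.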


Journal title:

Volume   Issue

Pages  -

Publication date: 2008