Nearest Neighbor Classification with Locally Weighted Distance for Imbalanced Data

Authors
Abstract


Similar articles

Improving nearest neighbor classification with cam weighted distance

Nearest neighbor (NN) classification assumes locally constant class conditional probabilities, and suffers from bias in high dimensions with a small sample set. In this paper, we propose a novel cam weighted distance to ameliorate the curse of dimensionality. Different from the existing neighborhood-based methods which only analyze a small space emanating from the query sample, the proposed nea...
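The excerpt does not spell out how the cam weighting is computed, so the sketch below only shows the baseline it modifies: the plain nearest-neighbor rule with an unweighted Euclidean distance, which is where the locally-constant-probability assumption enters. Function and variable names are illustrative, not from the paper.

    import numpy as np

    def nn_classify(X_train, y_train, query):
        # Plain 1-NN: the query takes the label of its single closest
        # training sample under an ordinary Euclidean distance. This rule
        # implicitly assumes the class-conditional probabilities are
        # locally constant around the query.
        dists = np.linalg.norm(X_train - query, axis=1)
        return y_train[np.argmin(dists)]

    # Toy usage with made-up data
    X = np.array([[0.0, 0.0], [1.0, 1.0], [0.9, 1.1], [5.0, 5.0]])
    y = np.array([0, 1, 1, 0])
    print(nn_classify(X, y, np.array([0.8, 0.9])))  # -> 1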


A Hybrid Weighted Nearest Neighbor Approach to Mine Imbalanced Data

Classification of imbalanced data has drawn significant attention from research community in last decade. As the distribution of data into various classes affects the performances of traditional classifiers, the imbalanced data needs special treatment. Modification in learning approaches is one of the solutions to deal with such cases. In this paper a hybrid nearest neighbor learning approach i...


Improving k Nearest Neighbor with Exemplar Generalization for Imbalanced Classification

A k nearest neighbor (kNN) classifier classifies a query instance to the most frequent class of its k nearest neighbors in the training instance space. For imbalanced class distribution, a query instance is often overwhelmed by majority class instances in its neighborhood and likely to be classified to the majority class. We propose to identify exemplar minority class training instances and gen...
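The exemplar-generalization step itself is cut off in this excerpt, but the failure mode it targets can be reproduced with plain majority-vote kNN. The sketch below uses made-up data in which a query lying right next to two minority-class points is still out-voted once k exceeds the number of nearby minority instances; all names and numbers are illustrative only.

    import numpy as np
    from collections import Counter

    def knn_classify(X_train, y_train, query, k=5):
        # Standard kNN: majority vote among the k nearest training samples.
        dists = np.linalg.norm(X_train - query, axis=1)
        nearest = np.argsort(dists)[:k]
        return Counter(y_train[nearest]).most_common(1)[0][0]

    # Imbalanced toy data: 8 majority-class points, 2 minority-class points.
    X = np.array([[i, 0.0] for i in range(8)] + [[0.2, 0.1], [0.3, 0.1]])
    y = np.array([0] * 8 + [1, 1])
    # The query sits between the two minority points, yet with k=5 the
    # three nearest majority points out-vote them, so class 0 is returned.
    print(knn_classify(X, y, np.array([0.25, 0.1]), k=5))  # -> 0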


Locally Adaptive Metric Nearest-Neighbor Classification

Nearest-neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest-neighbor rule. We propose a locally adaptive nearest-neighbor classification method to try to minimize bias. We use a Chi-s...


Nearest Neighbor Classification with Improved Weighted Dissimilarity Measure

The usefulness and the efficiency of the k nearest neighbor classification procedure are well known. A less sophisticated method consists in using only the first nearby prototype. This means k=1 and it is the method applied in this paper. One way to get a proper result is to use weighted dissimilarities implemented with a distance function of the prototype space. To improve the classification a...
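The paper's concrete weighting scheme is not included in this excerpt, so the following is only a generic illustration of the idea it starts from: a 1-NN rule whose dissimilarity is a feature-weighted Euclidean distance over the prototype space, with each dimension scaled by its own weight. It is not the paper's specific measure, and all names and values are made up.

    import numpy as np

    def weighted_1nn(prototypes, labels, query, weights):
        # 1-NN with a feature-weighted Euclidean dissimilarity: a larger
        # weight makes differences along that dimension count more.
        diffs = prototypes - query
        dists = np.sqrt(np.sum(weights * diffs ** 2, axis=1))
        return labels[np.argmin(dists)]

    # Made-up prototypes with two features; the second feature is treated
    # as noisy, so it is given a small weight.
    P = np.array([[1.0, 9.0], [2.0, 1.0], [8.0, 5.0]])
    L = np.array(["a", "b", "c"])
    w = np.array([1.0, 0.1])
    print(weighted_1nn(P, L, np.array([1.5, 2.0]), w))  # -> "b"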



Journal

Journal title: International Journal of Computer and Communication Engineering

Year: 2014

ISSN: 2010-3743

DOI: 10.7763/ijcce.2014.v3.296