Search results for: nearest neighbor classification
Number of results: 524866
Classification of spatial data streams is crucial, since the training dataset changes often. Building a new classifier each time can be very costly with most techniques. In this situation, k-nearest neighbor (KNN) classification is a very good choice, since no residual classifier needs to be built ahead of time. KNN is extremely simple to implement and lends itself to a wide variety of variatio...
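As the abstract notes, k-NN needs no classifier built ahead of time and is extremely simple to implement: prediction is just a majority vote among the k closest training samples. A minimal sketch (plain NumPy, Euclidean distance; function and variable names are illustrative, not from the paper):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_query, k=3):
    """Classify x_query by majority vote among its k nearest training points."""
    # Euclidean distance from the query to every training sample
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k closest samples
    nearest = np.argsort(dists)[:k]
    # Majority vote over their labels
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Toy example: two well-separated clusters
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [1.0, 1.0], [0.9, 1.1], [1.1, 0.9]])
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X, y, np.array([0.15, 0.15]), k=3))  # → 0
```

Because no model is fit in advance, updating the "classifier" when a data stream changes amounts to updating the training set, which is exactly the property the abstract highlights for spatial data streams.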
In this paper we introduce a new embedding technique to linearly project labeled data samples into a new space where the performance of a Nearest Neighbor classifier is improved. The approach is based on considering a large set of simple discriminant projections and finding the subset with higher classification performance. In order to implement the feature selection process we propose the use ...
In this part, you will implement k-Nearest Neighbor (k-NN) algorithm on the 8scenes category dataset of Oliva and Torralba [1]. You are given a total of 800 labeled training images (containing 100 images for each class) and 1888 unlabeled testing images. Figure 1 shows some sample images from the data set. Your task is to analyze the performance of k-NN algorithm in classifying photographs into...
In molecular databases, structural classification is a basic task that can be successfully approached by nearest neighbor methods. The underlying similarity models consider spatial properties such as shape and extension as well as thematic attributes. We introduce 3D shape histograms as an intuitive and powerful approach to model similarity for solid objects such as molecules. Errors of measure...
The K-nearest neighbor (KNN) decision rule has been a ubiquitous classification tool with good scalability. Past experience has shown that the optimal choice of K depends upon the data, making it laborious to tune the parameter for different applications. We introduce a new metric that measures the informativeness of objects to be classified. When applied as a query-based distance metric to mea...
Nearest-neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest-neighbor rule. We propose a locally adaptive nearest-neighbor classification method to try to minimize bias. We use a Chi-s...
Nearest neighbor (NN) classification assumes locally constant class conditional probabilities, and suffers from bias in high dimensions with a small sample set. In this paper, we propose a novel cam weighted distance to ameliorate the curse of dimensionality. Different from the existing neighborhood-based methods which only analyze a small space emanating from the query sample, the proposed nea...
Nearest neighbor algorithms can be implemented on content-based image retrieval (CBIR) and classification problems for extracting similar images. In k-nearest neighbor (k-NN), the winning class is based on the k nearest neighbors determined by comparing the query image against all training samples. In this paper, a new nearest neighbor search (NNS) algorithm is proposed using a two-step process...
In this paper, classification efficiency of the conventional K-nearest neighbor algorithm is enhanced by exploiting fuzzy-rough uncertainty. The simplicity and nonparametric characteristics of the conventional K-nearest neighbor algorithm remain intact in the proposed algorithm. Unlike the conventional one, the proposed algorithm does not need to know the optimal value of K. Moreover, the gener...
Robert Tibshirani Department of Statistics University of Toronto tibs@utstat.toronto.edu Nearest neighbor classification expects the class conditional probabilities to be locally constant, and suffers from bias in high dimensions. We propose a locally adaptive form of nearest neighbor classification to try to finesse this curse of dimensionality. We use a local linear discriminant analysis to e...
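Several of the results above adapt how neighbors influence the decision. As a much simpler hedged illustration of that idea (not the local linear discriminant analysis of this abstract), distance-weighted voting down-weights far neighbors instead of counting all k equally; all names here are illustrative:

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x_query, k=5, eps=1e-9):
    """Distance-weighted k-NN: closer neighbors carry larger votes (weight 1/d)."""
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = {}
    for i in nearest:
        # eps guards against division by zero when the query coincides
        # with a training point
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + 1.0 / (dists[i] + eps)
    # Return the class with the largest total weight
    return max(votes, key=votes.get)

# Toy example: the single close neighbor outvotes two distant ones
X = np.array([[0.0, 0.0], [2.0, 2.0], [2.1, 2.0]])
y = np.array([0, 1, 1])
print(weighted_knn_predict(X, y, np.array([0.1, 0.1]), k=3))  # → 0
```

This keeps the nonparametric character of k-NN while softening the dependence on the exact choice of k, a concern raised by several of the abstracts above.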