Search results for: nearest neighbors
Number of results: 43,351
In this paper, we propose a novel scheme for approximate nearest neighbor (ANN) retrieval based on dictionary learning and sparse coding. Our key innovation is to build compact codes, dubbed SpANN codes, using the active set of sparse coded data. These codes are then used to index an inverted file table for fast retrieval. The active sets are often found to be sensitive to small differences amo...
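The abstract above describes indexing data by the active set (the nonzero atom indices) of its sparse code. A minimal sketch of that idea, with dictionary learning and proper sparse coding (e.g., OMP) replaced by a crude top-k correlation stand-in — all names and parameters here are illustrative, not the SpANN construction itself:

```python
# Hedged sketch: inverted-file indexing keyed on the active set of a
# sparse code. Sparse coding is approximated by keeping the k atoms
# most correlated with the input (a stand-in for OMP / learned codes).
import numpy as np
from collections import defaultdict

def sparse_code(x, D, k=3):
    """Active set of x: indices of the k dictionary atoms with the
    largest absolute correlation to x."""
    corr = D.T @ x
    return frozenset(np.argsort(-np.abs(corr))[:k].tolist())

def build_inverted_file(X, D, k=3):
    """Map each active atom to the data points whose codes use it."""
    table = defaultdict(list)
    for i, x in enumerate(X):
        for atom in sparse_code(x, D, k):
            table[atom].append(i)
    return table

def query(q, D, table, k=3):
    """Candidate set: all points sharing at least one active atom."""
    cands = set()
    for atom in sparse_code(q, D, k):
        cands.update(table.get(atom, []))
    return cands

rng = np.random.default_rng(0)
D = rng.normal(size=(8, 16))      # 16-atom dictionary in R^8
D /= np.linalg.norm(D, axis=0)    # unit-norm atoms
X = rng.normal(size=(100, 8))     # database points
table = build_inverted_file(X, D)
cands = query(X[0], D, table)     # a point always retrieves itself
```

The candidate set would then be re-ranked by exact distance; the abstract's point about active sets being sensitive to small perturbations is exactly why the real scheme needs more care than this sketch.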
Spectral clustering is a subspace clustering method that suits data of any shape and converges to a globally optimal solution. By combining the concepts of shared nearest neighbors and geodesic distance with spectral clustering, a self-adaptive spectral clustering based on geodesic distance and shared nearest neighbors was proposed. Experiments show that the improved spectral clusteri...
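The shared-nearest-neighbor (SNN) idea mentioned above can be sketched directly: two points are similar in proportion to the overlap of their k-nearest-neighbor lists. This is a minimal illustration of the SNN similarity only, not the paper's full self-adaptive clustering:

```python
# Minimal sketch of shared-nearest-neighbor (SNN) similarity:
# S[i, j] = |NN_k(i) ∩ NN_k(j)|, the number of neighbors two
# points have in common.
import numpy as np

def knn_lists(X, k):
    """Indices of each point's k nearest neighbors (self excluded)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def snn_similarity(X, k=3):
    """Pairwise SNN similarity matrix (integer overlap counts)."""
    nn = knn_lists(X, k)
    n = len(X)
    S = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(n):
            S[i, j] = len(set(nn[i]) & set(nn[j]))
    return S

X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],   # one tight cluster
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])  # a second cluster
S = snn_similarity(X, k=2)
# within-cluster pairs share more neighbors than cross-cluster pairs
```

In an SNN-based spectral method, a matrix like `S` (possibly combined with geodesic distance) would replace the usual Gaussian affinity before the eigen-decomposition step.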
The simple k nearest neighbor method is often very competitive, especially among classification methods. When the number of predictors is large, the nearest neighbors are likely to be quite distant from the target point. Furthermore they tend to all be on one side of the target point. These are consequences of high dimensional geometry. This paper introduces a modification of nearest neighbors that ...
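The high-dimensional effect described above is easy to see numerically: as the dimension grows, the nearest neighbor of a query ends up nearly as far away as the average point. A quick illustration with uniform data (the setup and sample sizes are arbitrary):

```python
# Concentration of distances in high dimensions: the ratio of the
# nearest-neighbor distance to the mean distance approaches 1 as d grows.
import numpy as np

rng = np.random.default_rng(0)
ratios = {}
for d in (2, 100):
    X = rng.uniform(size=(1000, d))        # 1000 uniform points in [0,1]^d
    q = rng.uniform(size=d)                # a random query point
    dists = np.linalg.norm(X - q, axis=1)
    ratios[d] = dists.min() / dists.mean()
    print(f"d={d}: nearest/mean distance = {ratios[d]:.2f}")
```

In low dimensions the nearest neighbor is dramatically closer than the average point; in 100 dimensions the ratio is close to 1, which is why plain k-NN degrades and motivates modifications like the one in this abstract.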
An important part of Pattern Recognition deals with the problem of classifying data into a finite number of categories. In the usual setting of "supervised learning", examples are given that consist of pairs (X_i, Y_i), i ≤ n, where X_i is the d-dimensional covariate vector and Y_i is the corresponding "category" in some finite set C. In the examples, Y_i is known! Based on these ...
A fundamental question of machine learning is how to compare examples. If an algorithm could perfectly determine whether two examples were semantically similar or dissimilar, most subsequent machine learning tasks would become trivial (i.e., the 1-nearest-neighbor classifier will achieve perfect results). A common choice for a dissimilarity measurement is an uninformed norm, like the Euclidean d...
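The point above — that 1-NN is trivial once the dissimilarity is right — can be made concrete with a pluggable metric. Here the "learned" metric is a hypothetical fixed diagonal weighting that zeroes out a noisy dimension; it stands in for whatever metric-learning procedure the abstract goes on to discuss:

```python
# 1-nearest-neighbor classification with a pluggable dissimilarity.
# The weighted metric below is an illustrative assumption: we pretend
# dimension 1 is pure noise and down-weight it to zero.
import numpy as np

def one_nn(X_train, y_train, x, dist):
    """Label of the training point nearest to x under `dist`."""
    d = [dist(xt, x) for xt in X_train]
    return y_train[int(np.argmin(d))]

euclidean = lambda a, b: np.linalg.norm(a - b)
W = np.array([1.0, 0.0])                      # hypothetical learned weights
weighted = lambda a, b: np.linalg.norm(W * (a - b))

X_train = np.array([[0.0, 9.0], [1.0, 0.0]])
y_train = ["a", "b"]
x = np.array([0.2, 0.5])
pred_euclid = one_nn(X_train, y_train, x, euclidean)   # misled by dim 1
pred_weighted = one_nn(X_train, y_train, x, weighted)  # uses dim 0 only
```

Under the Euclidean norm the noisy second coordinate dominates and the query is matched to "b"; under the weighted metric the informative first coordinate wins and the query is matched to "a".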
The nearest neighbor (NN) classifiers, especially the k-NN algorithm, are among the simplest and yet most efficient classification rules and are widely used in practice. We introduce three adaptation rules that can be used in iterative training of a k-NN classifier. This is a novel approach both from the statistical pattern recognition and the supervised neural network learning points of view. The s...
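The abstract's three adaptation rules are not spelled out here, but one classic iterative rule for a prototype-based 1-NN classifier, in the spirit of learning vector quantization (LVQ), is illustrative: pull the nearest prototype toward a training point when the labels agree, and push it away otherwise. A hedged sketch:

```python
# LVQ1-style adaptation step for a prototype-based 1-NN classifier.
# This is an illustrative rule, not necessarily one of the paper's three.
import numpy as np

def lvq_step(protos, labels, x, y, lr=0.1):
    """Move the prototype nearest to x toward x if its label matches y,
    away from x otherwise."""
    i = int(np.argmin(np.linalg.norm(protos - x, axis=1)))
    sign = 1.0 if labels[i] == y else -1.0
    protos[i] += sign * lr * (x - protos[i])
    return protos

protos = np.array([[0.0, 0.0], [1.0, 1.0]])
labels = ["a", "b"]
protos = lvq_step(protos, labels, np.array([0.2, 0.0]), "a")
# prototype 0 (label "a") moved toward the matching training point
```

Iterating such updates over the training set adapts the reference set of the k-NN classifier, which is what connects this line of work to supervised neural-network learning.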
Given a set P of N points in a d-dimensional space, along with a query point q, it is often desirable to find k points of P that are with high probability close to q. This is the Approximate k-Nearest Neighbors (AkNN) problem. We present two algorithms for AkNN. Both require O(Nd) preprocessing time. The first algorithm has a query time cost that is O(d + log N), while the second has a query time cost that...
We consider tradeoffs between the query and update complexities for the (approximate) nearest neighbor problem on the sphere, extending the spherical filters recently introduced by [Becker–Ducas–Gama–Laarhoven, SODA'16] to sparse regimes and generalizing the scheme and analysis to account for different tradeoffs. In a nutshell, for the sparse regime the tradeoff between the query complexity nq...
While classic machine learning paradigms assume training and test data are generated from the same process, domain adaptation addresses the more realistic setting in which the learner has large quantities of labeled data from some source task but limited or no labeled data from the target task it is attempting to learn. In this work, we give the first formal analysis showing that using active l...
Complex networks, such as biological, social, and communication networks, often entail uncertainty, and thus, can be modeled as probabilistic graphs. Similar to the problem of similarity search in standard graphs, a fundamental problem for probabilistic graphs is to efficiently answer k-nearest neighbor queries (k-NN), which is the problem of computing the k closest nodes to some specific node....
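A naive baseline for the k-NN query on probabilistic graphs described above is Monte Carlo sampling over possible worlds: sample each edge according to its probability, run Dijkstra in each sampled world, and rank nodes by their average distance from the query node. This sketch is a brute-force illustration of the problem, not the paper's efficient algorithm; the edge list and penalty constant are assumptions:

```python
# Monte Carlo k-NN on a probabilistic graph: average shortest-path
# distance over sampled possible worlds, with a large penalty when a
# node is unreachable in a sample.
import heapq, random

def dijkstra(adj, src):
    """Shortest-path distances from src in one sampled world."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def knn_prob_graph(edges, src, k, samples=500, seed=0):
    """edges: (u, v, weight, prob) tuples. Returns the k nodes with the
    smallest average distance from src over sampled worlds."""
    rng = random.Random(seed)
    nodes = {u for u, v, _, _ in edges} | {v for _, v, _, _ in edges}
    total = {v: 0.0 for v in nodes if v != src}
    INF = 1e9  # penalty charged when a node is unreachable in a sample
    for _ in range(samples):
        adj = {}
        for u, v, w, p in edges:
            if rng.random() < p:       # edge survives in this world
                adj.setdefault(u, []).append((v, w))
                adj.setdefault(v, []).append((u, w))
        dist = dijkstra(adj, src)
        for v in total:
            total[v] += dist.get(v, INF)
    return sorted(total, key=total.get)[:k]

# toy graph: "a" is connected to "q" by a reliable edge, "b" by a flaky one
edges = [("q", "a", 1.0, 0.95), ("q", "b", 1.0, 0.05), ("a", "b", 1.0, 0.9)]
nearest = knn_prob_graph(edges, "q", k=1)
```

The exponential number of possible worlds is exactly why this brute-force approach does not scale and efficient k-NN algorithms for probabilistic graphs are needed.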