Search results for: knn averaging model

Number of results: 124,493

Journal: Journal of Parallel and Distributed Computing, 2007
Erion Plaku, Lydia E. Kavraki

High-dimensional problems arising from robot motion planning, biology, data mining, and geographic information systems often require the computation of k nearest neighbor (knn) graphs. The knn graph of a data set is obtained by connecting each point to its k closest points. As the research in the above-mentioned fields progressively addresses problems of unprecedented complexity, the demand for...
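
The abstract above does not give an algorithm, but the definition it states ("connecting each point to its k closest points") can be sketched directly. The following is a minimal brute-force construction in NumPy; the function name `knn_graph` and the toy data are illustrative assumptions, not from the paper.

```python
import numpy as np

def knn_graph(points, k):
    """Return, for each point, the indices of its k closest points (brute force, O(n^2 d))."""
    # Pairwise squared Euclidean distances between all points.
    diff = points[:, None, :] - points[None, :, :]
    dist2 = (diff ** 2).sum(axis=-1)
    np.fill_diagonal(dist2, np.inf)   # a point is not its own neighbor
    # Indices of the k smallest distances per row form the adjacency lists.
    return np.argsort(dist2, axis=1)[:, :k]

# Toy example: 5 points on a line, two well-separated groups.
pts = np.array([[0.0], [1.0], [2.0], [10.0], [11.0]])
graph = knn_graph(pts, k=2)
```

The quadratic scan is exactly the cost that motivates the faster kNN-graph methods the paper studies; it is only practical for small n.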

2006
Anantaporn Srisawat, Tanasanee Phienthrakul, Boonserm Kijsirikul

This paper proposes SV-kNNC, a new algorithm for k-Nearest Neighbor (kNN) classification. The algorithm consists of three steps. First, Support Vector Machines (SVMs) are applied to select the important training data. Then, k-means clustering is used to assign a weight to each training instance. Finally, unseen examples are classified by kNN. Fourteen datasets from the UCI repository were used to evaluate ...
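
The three steps above can be sketched as a pipeline. The abstract does not specify the weighting scheme or the vote rule, so the cluster-agreement weighting and the distance-weighted vote below are simplified stand-ins, and the names `sv_knnc_fit` / `sv_knnc_predict` are assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.cluster import KMeans

def sv_knnc_fit(X, y, n_clusters=2):
    # Step 1: keep only the SVM's support vectors as the training set.
    svm = SVC(kernel="linear").fit(X, y)
    Xs, ys = X[svm.support_], y[svm.support_]
    # Step 2: weight each retained instance via k-means clustering.
    # Here: down-weight instances whose label disagrees with their
    # cluster's majority label (a simplified stand-in for the paper's scheme).
    km = KMeans(n_clusters=min(n_clusters, len(Xs)), n_init=10, random_state=0).fit(Xs)
    w = np.ones(len(Xs))
    for c in range(km.n_clusters):
        members = km.labels_ == c
        if members.any():
            majority = np.bincount(ys[members]).argmax()
            w[members & (ys != majority)] = 0.5
    return Xs, ys, w

def sv_knnc_predict(Xs, ys, w, x, k=3):
    # Step 3: weighted kNN vote over the retained instances
    # (distance-weighted to break ties).
    d = np.linalg.norm(Xs - x, axis=1)
    nn = np.argsort(d)[:k]
    votes = np.bincount(ys[nn], weights=w[nn] / (d[nn] + 1e-9))
    return int(votes.argmax())
```

Because only support vectors are kept, the query-time scan in step 3 runs over far fewer points than plain kNN would.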

2003
Gongde Guo, Hui Wang, David A. Bell, Yaxin Bi, Kieran Greer

The k-Nearest-Neighbours (kNN) algorithm is a simple but effective method for classification. The major drawbacks of kNN are (1) its low efficiency: as a lazy learning method, it is prohibitively slow for many applications, such as dynamic web mining over a large repository, and (2) its dependence on the selection of a “good value” for k. In this paper, we propose a novel kNN-type method for classificatio...
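
Both drawbacks named above are visible in a minimal kNN classifier (a generic sketch, not the paper's proposed method): all the work happens at query time, and the answer can flip with k.

```python
import numpy as np

def knn_classify(X_train, y_train, x, k):
    # "Lazy" learning: nothing is precomputed, so every training point
    # is scanned for every query -- the efficiency drawback noted above.
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[counts.argmax()]

# The choice of k changes the prediction: points 0,1 are class 0,
# points 2,3,4 are class 1, and the query sits near the boundary.
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])
y = np.array([0, 0, 1, 1, 1])
q = np.array([1.2])
```

With k=1 the query is labeled by its single nearest point (class 0); with k=5 the global majority (class 1) wins, illustrating the sensitivity to the "good value" of k.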

2015
Barbara Malič, Jurij Koruza, Jitka Hreščak, Janez Bernard, Ke Wang, John G. Fisher, Andreja Benčan

The potassium sodium niobate, K0.5Na0.5NbO₃, solid solution (KNN) is considered one of the most promising, environmentally friendly, lead-free candidates to replace highly efficient lead-based piezoelectrics. Since the first reports of KNN, it has been recognized that obtaining phase-pure material with a high density and a uniform, fine-grained microstructure is a major challenge. For this rea...

Journal: PVLDB, 2015
Yongjoo Park, Michael J. Cafarella, Barzan Mozafari

Approximate kNN (k-nearest neighbor) techniques using binary hash functions are among the most commonly used approaches for overcoming the prohibitive cost of performing exact kNN queries. However, the success of these techniques largely depends on the hash functions’ ability to distinguish kNN items; that is, the kNN items retrieved based on data items’ hashcodes should include as many true...
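
As a concrete instance of the idea above, a classic binary hashing scheme is random-hyperplane hashing: each bit is the sign of a projection, and candidates are ranked by Hamming distance between hashcodes instead of exact distance. This is a generic sketch of that family, not the paper's specific technique; the names `hash_codes` and `approx_knn` are assumptions.

```python
import numpy as np

def hash_codes(X, planes):
    # One bit per hyperplane: the sign of the projection onto its normal.
    return (X @ planes.T > 0).astype(np.uint8)

def approx_knn(X, q, k, planes):
    codes = hash_codes(X, planes)
    qcode = hash_codes(q[None, :], planes)[0]
    # Rank by Hamming distance between hashcodes, not exact distance --
    # this is exactly where the hash functions' ability to "distinguish
    # kNN items" determines the quality of the answer.
    hamming = (codes != qcode).sum(axis=1)
    return np.argsort(hamming, kind="stable")[:k]

# Tiny deterministic example: 4 fixed hyperplane normals in 2D.
planes = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0]])
X = np.array([[2.0, 1.0], [-2.0, 1.0], [-1.0, -2.0]])
q = np.array([2.0, 2.0])
```

Comparing hashcodes costs a few bit operations per item, which is the source of the speedup over exact kNN; the price is that items with similar codes are only probably, not certainly, the true nearest neighbors.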

2002
Guiwen Hou, Jingyue Zhang, Jiahong Zhou

In this project we present a heuristic learning process that trains an ANN (artificial neural network) and KNN (k-nearest neighbor) using the optimal number of steps, obtained from A*, from randomly generated states to the goal. After training the ANN and KNN, a mixture of experts is discussed, and empirical data are collected to demonstrate the feasibility and accuracy of the combination of ANN and KNN i...

2012
Shivendra Tiwari, Saroj Kaushik

Interpretability, usability, and boundary points detection of the clustering results are of fundamental importance. The cluster boundary detection is important for many real world applications, such as geo-spatial data analysis and point-based computer graphics. In this paper, we have proposed an efficient solution for finding boundary points in multi-dimensional datasets that uses the BORDER [...

1993
Dietrich Wettschereck, Thomas G. Dietterich

Four versions of a k-nearest neighbor algorithm with locally adaptive k are introduced and compared to the basic k-nearest neighbor algorithm (kNN). Locally adaptive kNN algorithms choose the value of k used to classify a query by consulting the results of cross-validation computations in the local neighborhood of the query. Local kNN methods are shown to perform similarly to kNN i...
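
The selection rule described above can be sketched as follows: score each candidate k by leave-one-out accuracy over the training points nearest the query, then classify the query with the best-scoring k. This is a simplified sketch of the idea, not one of the paper's four specific versions; `adaptive_knn`, the candidate list, and the neighborhood size `m` are assumptions.

```python
import numpy as np

def knn_vote(Xtr, ytr, x, k):
    nn = np.argsort(np.linalg.norm(Xtr - x, axis=1))[:k]
    labels, counts = np.unique(ytr[nn], return_counts=True)
    return labels[counts.argmax()]

def adaptive_knn(Xtr, ytr, q, candidate_ks=(1, 3, 5), m=10):
    # Local cross-validation: evaluate each candidate k by leave-one-out
    # accuracy over the m training points nearest the query.
    local = np.argsort(np.linalg.norm(Xtr - q, axis=1))[:m]
    best_k, best_score = candidate_ks[0], -1
    for k in candidate_ks:
        hits = 0
        for i in local:
            mask = np.arange(len(Xtr)) != i   # leave point i out
            if knn_vote(Xtr[mask], ytr[mask], Xtr[i], k) == ytr[i]:
                hits += 1
        if hits > best_score:
            best_k, best_score = k, hits
    # Classify the query with the locally best k.
    return knn_vote(Xtr, ytr, q, best_k)
```

The extra cross-validation work per query is the price for adapting k to how noisy or sparse the data is near that particular query.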

Journal: CoRR, 2012
Yanshan Shi

The k-nearest neighbors (KNN) method is used in many supervised learning classification problems. The Potential Energy (PE) method has also been developed for classification problems, based on a physical metaphor. The energy potentials used in the experiments are the Yukawa potential and the Gaussian potential. In this paper, I use both an applet and a MATLAB program with real-life benchmark data to analyze the performan...
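
The abstract does not give the PE method's formulas, but the physical metaphor suggests a simple reading for the Gaussian case: each training point exerts a potential exp(-d²/2σ²) at the query, and the query is assigned to the class with the strongest total potential. The sketch below assumes exactly that form; the function name and σ are illustrative.

```python
import numpy as np

def gaussian_potential_classify(Xtr, ytr, x, sigma=1.0):
    # Each training point contributes a Gaussian potential at x;
    # the query goes to the class exerting the strongest total pull.
    d2 = ((Xtr - x) ** 2).sum(axis=1)
    pot = np.exp(-d2 / (2 * sigma ** 2))
    classes = np.unique(ytr)
    totals = [pot[ytr == c].sum() for c in classes]
    return classes[int(np.argmax(totals))]
```

Unlike kNN, every training point contributes to the decision, with the kernel width σ playing a role analogous to k in smoothing the boundary.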

2009
Yoan Miché, Amaury Lendasse

The Optimally Pruned Extreme Learning Machine (OP-ELM) and Optimally Pruned K-Nearest Neighbors (OP-KNN) algorithms use a similar methodology based on random initialization (OP-ELM) or KNN initialization (OP-KNN) of a feedforward neural network, followed by a ranking of the neurons; the ranking is used to determine the best combination to retain. This is achieved by Leave-One-Out (LOO) crossvalidat...

[Chart: number of search results per publication year]