Search results for: weighted knn
Number of results: 105149
With the recent development of mobile computing devices and the ubiquitous deployment of access points (APs) in Wireless Local Area Networks (WLANs), WLAN-based indoor localization systems (WILSs) are attracting mounting attention and becoming increasingly prevalent, as they do not require additional infrastructure. As to the localization methods in WILSs, for the approaches used to localizati...
K-Nearest Neighbor (KNN) is a highly efficient classification algorithm due to its key features: it is very easy to use, requires low training time, is robust to noisy training data, and is easy to implement. However, it also has some shortcomings, such as high computational complexity, large memory requirements for large training datasets, the curse of dimensionality, and equal weights given to all attributes. Many ...
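The "equal weights" shortcoming mentioned above is what weighted kNN addresses: instead of each of the k neighbors casting one vote, votes are scaled so that closer neighbors count more. A minimal sketch of distance-weighted voting (an illustrative implementation, not taken from any of the listed papers):

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, k=3, eps=1e-9):
    """Classify x by its k nearest training points, weighting each
    neighbor's vote by inverse distance so closer points count more."""
    d = np.linalg.norm(X_train - x, axis=1)   # distances to all training points
    idx = np.argsort(d)[:k]                   # indices of the k nearest
    votes = {}
    for i in idx:
        # eps avoids division by zero when x coincides with a training point
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + 1.0 / (d[i] + eps)
    return max(votes, key=votes.get)
```

With uniform weights a distant neighbor can flip the majority; inverse-distance weighting makes the prediction dominated by the closest points.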
We present the work of the Democritus University of Thrace (DUTH) team in TREC’s 2016 Contextual Suggestion Track. The goal of the Contextual Suggestion Track is to build a system capable of proposing venues which a user might be interested to visit, using any available contextual and personal information. First, we enrich the TREC-provided dataset by collecting more information on venues from ...
Continuously monitoring kNN queries in a highly dynamic environment has become a necessity for many recent location-based applications. In this paper, we study the problem of continuous kNN queries on a dataset with an in-memory grid index. We first present a novel data access method, CircularTrip. Then, an efficient CircularTrip-based continuous kNN algorithm is developed. Compared with the ex...
Several machine learning algorithms have been applied to the problem of static hand posture recognition. K-nearest neighbor (KNN) performs very well in flexible posture recognition, but the speed and memory requirements of the algorithm make it difficult to use in real-time applications. In this paper we propose an approach to speed up KNN without changing its behavior. We use the mixture of ga...
Random KNN (RKNN) is a novel generalization of traditional nearest-neighbor modeling. Random KNN consists of an ensemble of base k-nearest neighbor models, each constructed from a random subset of the input variables. A collection of r such base classifiers is combined to build the final Random KNN classifier. Since the base classifiers can be computed independently of one another, the overall ...
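The Random KNN construction described above can be sketched directly: each base classifier is an ordinary kNN model restricted to a random subset of the input variables, and the final label is a majority vote over the r base predictions. A minimal sketch under those assumptions (feature-subset size m and the helper names are illustrative, not from the paper):

```python
import numpy as np

def knn_predict(X, y, x, k):
    """Plain majority-vote kNN on the full feature set passed in."""
    idx = np.argsort(np.linalg.norm(X - x, axis=1))[:k]
    vals, counts = np.unique(y[idx], return_counts=True)
    return vals[np.argmax(counts)]

def random_knn_predict(X, y, x, r=10, m=2, k=3, seed=0):
    """Ensemble of r base kNN classifiers, each built on a random
    subset of m input variables; final label by majority vote."""
    rng = np.random.default_rng(seed)
    votes = []
    for _ in range(r):
        feats = rng.choice(X.shape[1], size=m, replace=False)
        votes.append(knn_predict(X[:, feats], y, x[feats], k))
    vals, counts = np.unique(votes, return_counts=True)
    return vals[np.argmax(counts)]
```

As the abstract notes, the r base classifiers are independent of one another, so in practice the loop parallelizes trivially.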
Abstract Kernelized Gram matrix $W$ constructed from data points $\{x_i\}_{i=1}^N$ as $W_{ij}= k_0( \frac{ \| x_i - x_j \|^2}{\sigma^2} )$ is widely used in graph-based geometric analysis and unsupervised learning. An important question is how to choose the kernel bandwidth $\sigma$; a common practice called self-tuning adaptively sets $\sigma_i$ at each point $x_i$ by the $k$-nearest neighbor (kNN) dista...
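The self-tuned bandwidth choice mentioned in this abstract can be sketched concretely: set $\sigma_i$ to the distance from $x_i$ to its $k$-th nearest neighbor, and use the product $\sigma_i \sigma_j$ in the kernel denominator. A minimal sketch with a Gaussian kernel $k_0(u) = e^{-u}$ (an illustrative choice; the paper's kernel and normalization may differ):

```python
import numpy as np

def self_tuned_gram(X, k=3):
    """Gram matrix W_ij = exp(-||x_i - x_j||^2 / (sigma_i * sigma_j)),
    where sigma_i is the distance from x_i to its k-th nearest
    neighbor -- the 'self-tuning' bandwidth choice."""
    # Pairwise Euclidean distance matrix
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    # Column 0 of the sorted distances is the point itself (distance 0),
    # so column k is the k-th nearest neighbor distance.
    sigma = np.sort(D, axis=1)[:, k]
    return np.exp(-D**2 / np.outer(sigma, sigma))
```

Because each row uses its own local scale, dense and sparse regions of the data get comparably scaled affinities, which is the motivation behind self-tuning.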
Our final solution (RMSE=0.8712) consists of blending 107 individual results. Since many of these results are close variants, we first describe the main approaches behind them. Then, we will move to describing each individual result. The core components of the solution are published in our ICDM'2007 paper [1] (or, KDD-Cup'2007 paper [2]), and also in the earlier KDD'2007 paper [3]. We assume th...
Support vector machine (SVM) is one of the most powerful supervised learning algorithms in gene expression analysis. Samples intermixed in another class or lying in the overlapped boundary region may make the decision boundary too complex and may harm the precision of SVM. In the present paper, a hybrid of k-nearest neighbor (KNN) classifiers and SVM (HKNNSVM) is proposed to deal w...
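A common way to realize the idea in this abstract is a kNN-editing step: before training the SVM, drop training samples whose label disagrees with the majority of their k nearest neighbors, removing the intermixed points that complicate the boundary. A minimal sketch of that pruning step alone (illustrative; whether HKNNSVM prunes exactly this way is an assumption, and the SVM fit that would follow is omitted):

```python
import numpy as np

def knn_prune(X, y, k=3):
    """Drop training samples whose label disagrees with the majority
    label of their k nearest neighbors -- a kNN-editing step intended
    to simplify the decision boundary before fitting an SVM."""
    keep = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(d)[1:k + 1]        # skip the point itself
        vals, counts = np.unique(y[idx], return_counts=True)
        if vals[np.argmax(counts)] == y[i]:
            keep.append(i)
    return X[keep], y[keep]
```

The cleaned set (X, y) would then be passed to a standard SVM trainer; with the intermixed samples removed, the fitted boundary tends to be simpler and fewer support vectors are needed.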