Search results for: distance metric learning
Number of results: 886,297
Applying document clustering techniques to cluster e-mails is an interesting application. Techniques such as k-means and EM can be used to achieve this. However, the selection of a good distance metric is the key issue involved. People often tweak the chosen distance metric by hand to obtain desirable clusters, which almost certainly does not yield a generic solution. Hence it w...
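As a minimal sketch of the setup this abstract describes, the following clusters a handful of invented e-mail snippets with k-means over TF-IDF vectors using scikit-learn; the texts and cluster count are illustrative assumptions, not the paper's data, and the (Euclidean) metric k-means uses here is exactly the kind of default the abstract argues should be tuned or learned.

```python
# Sketch: k-means clustering of short documents (e.g. e-mail bodies)
# over TF-IDF features. Sample texts are invented for illustration.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

emails = [
    "meeting schedule for project review",
    "project review meeting moved to friday",
    "discount offer buy now limited time",
    "limited time offer huge discount",
]

X = TfidfVectorizer().fit_transform(emails)          # sparse TF-IDF matrix
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # meeting-like and offer-like e-mails fall into separate clusters
```

Swapping the implicit Euclidean metric for a learned one is precisely the gap distance metric learning aims to fill.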
We first investigate the combined effect of data complexity, the curse of dimensionality, and the definition of the Euclidean distance on the distance measure between points. Then, based on the concepts underlying manifold learning algorithms and the minimum volume ellipsoid metric, we design an algorithm that learns a local metric on the lower-dimensional manifold on which the data lies. Experi...
In this paper, we introduce the concept of a generalized $c$-distance in ordered cone $b$-metric spaces and, by using this concept, we prove some fixed point theorems in ordered cone $b$-metric spaces. Our results generalize the corresponding results obtained by Y. J. Cho, R. Saadati, and Shenghua Wang (Y. J. Cho, R. Saadati, Shenghua Wang, common fixed point theorems on generalized distance in ordere...
In this paper, we give some results on the common fixed point of self-mappings defined on complete $b$-metric spaces. Our results generalize the Kannan and Chatterjea fixed point theorems on complete $b$-metric spaces. In particular, we show that two self-mappings satisfying a contraction-type inequality have a unique common fixed point. We also give some examples to illustrate the given results.
The aim of this paper is to establish random coincidence point results for weakly increasing random operators in the setting of ordered metric spaces by using generalized altering distance functions. Our results present random versions and extensions of some well-known results in the current literature.
Learning a proper distance metric is crucial for many computer vision and image classification applications. Neighborhood Components Analysis (NCA) is an effective distance metric learning method which maximizes the kNN leave-one-out score on the training data by considering visual similarity between images. However, using only visual similarity to learn image distances could not satisfactorily...
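NCA as described in this abstract is available in scikit-learn; the following hedged sketch learns an NCA transformation and classifies with kNN in the learned space, using the bundled digits dataset as a stand-in for image features (the dataset, `n_components`, and `n_neighbors` are illustrative assumptions, not the paper's setup).

```python
# Sketch: Neighborhood Components Analysis (NCA) followed by kNN
# classification in the learned metric space, via scikit-learn.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier, NeighborhoodComponentsAnalysis
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# NCA learns a linear map that improves kNN leave-one-out accuracy;
# kNN then classifies in the transformed space.
clf = make_pipeline(
    NeighborhoodComponentsAnalysis(n_components=32, random_state=0),
    KNeighborsClassifier(n_neighbors=3),
)
clf.fit(X_tr, y_tr)
score = clf.score(X_te, y_te)
print(round(score, 3))
```

This uses only visual (pixel) features, which is exactly the limitation the abstract goes on to address.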
In this paper, we shall introduce the fuzzy w-distance, then prove a common fixed point theorem with respect to the fuzzy w-distance for two mappings under the condition of weak compatibility in complete fuzzy metric spaces.
In this paper we formulate multiple kernel learning (MKL) as a distance metric learning (DML) problem. More specifically, we learn a linear combination of a set of base kernels by optimising two objective functions that are commonly used in distance metric learning. We first propose a global version of such an MKL via DML scheme, then a localised version. We argue that the localised version not...
Our submissions (ML1, ML2, ML3) to the Audio Music Similarity (AMS) task are based upon learning an optimal distance metric over vector quantized MFCC histograms. ML1 is optimized to predict similarity derived from a collaborative filter; ML2 is optimized to predict genre similarity; ML3 is an unsupervised baseline which uses a native distance metric. This abstract details the system architectu...
We propose a general information-theoretic approach to semi-supervised metric learning called SERAPH (SEmi-supervised metRic leArning Paradigm with Hypersparsity) that does not rely on the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize its entropy on labeled data and minimize its entropy on unlabeled data following entropy regularization. For met...
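The Mahalanobis parameterization mentioned in this abstract is the workhorse of most metric-learning methods: $d_M(x, y) = \sqrt{(x - y)^\top M (x - y)}$ for a positive semi-definite matrix $M$. The sketch below computes it directly with NumPy; the matrix $M$ here is a hand-picked example, not a matrix learned by SERAPH or any other method.

```python
# Sketch: the Mahalanobis distance underlying most metric-learning methods.
import numpy as np

def mahalanobis(x, y, M):
    """Distance sqrt((x - y)^T M (x - y)) under a PSD matrix M.

    M = I recovers the Euclidean distance; learning M reshapes the metric.
    """
    d = x - y
    return float(np.sqrt(d @ M @ d))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
M = np.array([[2.0, 0.0],
              [0.5, 0.5]][:1] + [[0.0, 0.5]])  # axis-weighted PSD metric
M = np.array([[2.0, 0.0], [0.0, 0.5]])
print(mahalanobis(x, y, M))  # sqrt(2*1 + 0.5*1) = sqrt(2.5) ≈ 1.581
```

With `M` diagonal as above, the metric simply reweights axes; a full PSD matrix additionally rotates the space.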