Search results for: kernel trick
Number of results: 52,726
Infinite Layer Networks (ILN) have recently been proposed as an architecture that mimics neural networks while enjoying some of the advantages of kernel methods. ILN are networks that integrate over infinitely many nodes within a single hidden layer. It has been demonstrated by several authors that the problem of learning ILN can be reduced to the kernel trick, implying that whenever a certain ...
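A minimal sketch of the reduction described above, assuming ReLU hidden units and standard Gaussian weights (both illustrative choices, not taken from the abstract): the kernel induced by integrating over infinitely many hidden units can be approximated by averaging over a large but finite sample of random units.

import numpy as np

def iln_kernel_mc(X, Z, n_units=10000, seed=0):
    """Monte Carlo approximation of the kernel induced by an infinite
    hidden layer: k(x, z) ~ E_w[relu(w.x) * relu(w.z)], with w ~ N(0, I)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_units, X.shape[1]))  # sampled hidden-unit weights
    HX = np.maximum(W @ X.T, 0.0)                   # activations for X: (n_units, n_X)
    HZ = np.maximum(W @ Z.T, 0.0)                   # activations for Z: (n_units, n_Z)
    return HX.T @ HZ / n_units                      # average over the sampled units

# Usage: the resulting Gram matrix can be fed to any kernel method.
X = np.random.randn(5, 3)
K = iln_kernel_mc(X, X)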
Support vector machines (SVM), kernel principal component analysis (KPCA), and kernel Fisher discriminant analysis (KFD) are examples of successful kernel-based learning methods. By adding a regularizer and the kernel trick to a fuzzy counterpart of Gaussian mixture density models (GMM), this paper proposes a clustering algorithm in an extended high-dimensional feature space. Unlike t...
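As a rough illustration of how the kernel trick enters a fuzzy clustering objective, the sketch below computes squared feature-space distances between points and membership-weighted prototypes using only kernel values. It follows a generic kernel fuzzy c-means style computation, not necessarily the paper's algorithm; the RBF kernel and the fuzzifier m are illustrative choices.

import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def feature_space_distances(K, U, m=2.0):
    """Squared distances ||phi(x_i) - v_k||^2, where each prototype v_k is a
    fuzzy-membership-weighted mean in feature space, computed from kernel
    values only (the kernel trick)."""
    W = U**m
    W = W / W.sum(axis=0, keepdims=True)          # (n, c) normalized weights
    diag = np.diag(K)[:, None]                    # K_ii terms
    cross = K @ W                                 # sum_j w_jk * K_ij
    prot = np.einsum('jk,jl,lk->k', W, K, W)      # sum_{j,l} w_jk * K_jl * w_lk
    return diag - 2.0 * cross + prot[None, :]

# Usage with random fuzzy memberships U of shape (n, c).
n, c = 6, 2
X = np.random.randn(n, 2)
U = np.random.dirichlet(np.ones(c), size=n)
D2 = feature_space_distances(rbf_kernel(X), U)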
The kernel trick, commonly used in machine learning and computer vision, enables learning of non-linear decision functions without having to explicitly map the original data to a high-dimensional space. However, at test time, it requires evaluating the kernel with each one of the support vectors, which is time-consuming. In this paper, we propose a novel approach for learning non-linear SVM c...
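The test-time cost mentioned here comes from the form of the kernel SVM decision function, sketched below with illustrative names: each prediction requires one kernel evaluation per support vector.

import numpy as np

def rbf(x, z, gamma=0.5):
    return np.exp(-gamma * np.sum((x - z)**2))

def svm_decision(x, support_vecs, dual_coefs, bias, kernel=rbf):
    """Kernel SVM decision value f(x) = sum_i alpha_i * y_i * k(s_i, x) + b.
    The cost is one kernel evaluation per support vector for every test
    point, which is the bottleneck this abstract refers to."""
    return sum(a * kernel(s, x) for a, s in zip(dual_coefs, support_vecs)) + bias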
By combining the batch algorithm with the kernel trick, an improved kernel blind source separation (IKBSS) method is presented. IKBSS offers both better performance and lower computational complexity than the original kernel blind source separation (KBSS).
Kernel methods have been shown to be effective for many machine learning tasks such as classification and regression. In particular, support vector machines with the Gaussian kernel have proved to be powerful classification tools. The standard way to apply kernel methods is to use the kernel trick, where the inner product of the vectors in the feature space is computed via the kernel function. ...
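A minimal check of the identity behind the kernel trick, using a degree-2 polynomial kernel whose feature map can be written out explicitly (the Gaussian kernel mentioned above corresponds to an infinite-dimensional feature space, so its map cannot be enumerated the same way):

import numpy as np

def poly2_kernel(x, z):
    """k(x, z) = (x.z)^2 for 2-D inputs."""
    return float(np.dot(x, z))**2

def poly2_features(x):
    """Explicit feature map phi with <phi(x), phi(z)> = (x.z)^2."""
    x1, x2 = x
    return np.array([x1*x1, x2*x2, np.sqrt(2.0)*x1*x2])

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])
# The kernel value equals the inner product in the explicit feature space.
assert np.isclose(poly2_kernel(x, z), poly2_features(x) @ poly2_features(z))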
This paper focuses on the problem of kernelizing an existing supervised Mahalanobis distance learner. The paper includes the following contributions. Firstly, three popular learners, namely “neighborhood component analysis”, “large margin nearest neighbors”, and “discriminant neighborhood embedding”, which do not have kernel versions, are kernelized in order to improve their classification per...
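One common way to kernelize a Mahalanobis distance learner, sketched below with illustrative names, is to restrict the learned linear map to the span of the training points, so that distances depend on the data only through kernel evaluations; the paper's exact construction may differ.

import numpy as np

def kernel_vec(x, X_train, kernel):
    """k_x = [k(x_1, x), ..., k(x_n, x)]."""
    return np.array([kernel(xi, x) for xi in X_train])

def kernel_mahalanobis(x, y, X_train, B, kernel):
    """Squared Mahalanobis distance in feature space when the linear map is
    restricted to the span of the training points:
    d^2(x, y) = || B (k_x - k_y) ||^2, with B a learnable (p, n) matrix."""
    diff = kernel_vec(x, X_train, kernel) - kernel_vec(y, X_train, kernel)
    return float(np.sum((B @ diff)**2))

# Usage with an illustrative RBF kernel and a random low-rank transform B.
X_train = np.random.randn(10, 3)
B = np.random.randn(4, 10)
rbf = lambda a, b: np.exp(-0.5 * np.sum((a - b)**2))
d2 = kernel_mahalanobis(X_train[0], X_train[1], X_train, B, rbf)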
We extend the herding algorithm to continuous spaces by using the kernel trick. The resulting “kernel herding” algorithm is an infinite-memory deterministic process that learns to approximate a PDF with a collection of samples. We show that kernel herding decreases the error of expectations of functions in the Hilbert space at a rate O(1/T), which is much faster than the usual O(1/√T) for iid ...
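A greedy form of kernel herding over a finite candidate pool, where the empirical distribution of the pool stands in for the target p; the RBF kernel and the pool setup are illustrative choices, not taken from the abstract. Each step picks the candidate that best matches the mean embedding of p while staying far, in feature space, from the samples already chosen.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def kernel_herding(pool, n_samples, gamma=1.0):
    """Greedy kernel herding: at step t, select
    argmax_x [ E_{x'~p} k(x, x') - (1/(t+1)) * sum_{s<=t} k(x, x_s) ]."""
    K = rbf_kernel(pool, pool, gamma)
    mean_embedding = K.mean(axis=1)          # empirical E_{x'~p} k(x, x') per candidate
    chosen = []
    for t in range(n_samples):
        repulsion = K[:, chosen].sum(axis=1) / (t + 1) if chosen else 0.0
        chosen.append(int(np.argmax(mean_embedding - repulsion)))
    return pool[chosen]

# Usage: draw 20 herded samples from a pool of 500 candidates.
pool = np.random.randn(500, 2)
samples = kernel_herding(pool, 20)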