Search results for: kernel trick

Number of results: 52726

Journal: CoRR 2016
Amir Globerson Roi Livni

Infinite–Layer Networks (ILN) have recently been proposed as an architecture that mimics neural networks while enjoying some of the advantages of kernel methods. ILN are networks that integrate over infinitely many nodes within a single hidden layer. It has been demonstrated by several authors that the problem of learning ILN can be reduced to the kernel trick, implying that whenever a certain ...
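
The reduction the abstract mentions can be illustrated with random Fourier features: a single hidden layer of cos units with Gaussian-sampled weights converges, as the number of nodes grows, to the RBF kernel. A minimal NumPy sketch (my illustration, not the paper's construction; the dimensions, gamma, and the cos parameterization are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, gamma = 5, 20000, 0.5  # input dim, number of hidden nodes, kernel width (all assumed)

# Random hidden layer: weights sampled once, never trained.
W = rng.normal(scale=np.sqrt(2 * gamma), size=(D, d))
b = rng.uniform(0, 2 * np.pi, size=D)

def hidden(x):
    """Finite-width approximation of the infinite hidden layer."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.normal(size=d), rng.normal(size=d)
finite = hidden(x) @ hidden(y)                     # inner product of activations
infinite = np.exp(-gamma * np.sum((x - y) ** 2))   # RBF kernel = the D -> infinity limit
print(finite, infinite)                            # close for large D
```

As D grows, the inner product of hidden activations concentrates on the kernel value, which is why learning over the infinite layer can be carried out with the kernel trick instead of explicit nodes.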

Journal: JACIII 2004
Hidetomo Ichihashi Katsuhiro Honda

Support vector machines (SVM), kernel principal component analysis (KPCA), and kernel Fisher discriminant analysis (KFD), are examples of successful kernel-based learning methods. By the addition of a regularizer and the kernel trick to a fuzzy counterpart of Gaussian mixture density models (GMM), this paper proposes a clustering algorithm in an extended high dimensional feature space. Unlike t...
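
The fuzzy-GMM algorithm itself is not given in the excerpt, but the mechanism it relies on, clustering in a high-dimensional feature space accessed only through the Gram matrix, can be shown with plain kernel k-means (an illustrative stand-in, not the paper's method; the RBF kernel and toy data are assumptions):

```python
import numpy as np

def kernel_kmeans(K, n_clusters, n_iter=50, seed=0):
    """Cluster using only the Gram matrix K; phi(x) is never formed."""
    n = K.shape[0]
    labels = np.random.default_rng(seed).integers(n_clusters, size=n)
    for _ in range(n_iter):
        dist = np.full((n, n_clusters), np.inf)
        for c in range(n_clusters):
            mask = labels == c
            m = mask.sum()
            if m == 0:
                continue
            # ||phi(x_i) - mean_c||^2 expanded into kernel evaluations only
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, mask].sum(axis=1) / m
                          + K[np.ix_(mask, mask)].sum() / m ** 2)
        new = dist.argmin(axis=1)
        if np.array_equal(new, labels):
            break
        labels = new
    return labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(size=(30, 2)), rng.normal(size=(30, 2)) + 4.0])
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
print(kernel_kmeans(np.exp(-0.5 * sq), n_clusters=2))
```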

2017
Roi Livni Daniel Carmon Amir Globerson

Infinite Layer Networks (ILN) have been proposed as an architecture that mimics neural networks while enjoying some of the advantages of kernel methods. ILN are networks that integrate over infinitely many nodes within a single hidden layer. It has been demonstrated by several authors that the problem of learning ILN can be reduced to the kernel trick, implying that whenever a certain integral ...

2013
Gaurav Sharma Frédéric Jurie

The kernel trick – commonly used in machine learning and computer vision – enables learning of non-linear decision functions without having to explicitly map the original data to a high dimensional space. However, at test time, it requires evaluating the kernel with each one of the support vectors, which is time consuming. In this paper, we propose a novel approach for learning non-linear SVM c...
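
The test-time bottleneck the abstract describes is visible directly in the kernel SVM decision function. A minimal sketch (hypothetical coefficients and data, not the paper's method):

```python
import numpy as np

def rbf(x, y, gamma=0.5):
    return np.exp(-gamma * np.sum((x - y) ** 2))

def decision(x, support_vecs, dual_coefs, bias, gamma=0.5):
    """f(x) = sum_i alpha_i y_i K(s_i, x) + b: one kernel evaluation per
    support vector, so each prediction costs O(m * d) for m support vectors."""
    return sum(a * rbf(s, x, gamma) for a, s in zip(dual_coefs, support_vecs)) + bias

rng = np.random.default_rng(0)
S = rng.normal(size=(200, 10))   # 200 support vectors (hypothetical)
alpha = rng.normal(size=200)     # alpha_i * y_i (hypothetical)
print(decision(rng.normal(size=10), S, alpha, bias=0.1))
```

Every retained support vector makes prediction slower, which is the cost an explicitly parameterized non-linear classifier is designed to avoid.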

Journal: Neurocomputing 2005
Zhan-Li Sun De-Shuang Huang Chun-Hou Zheng Li Shang

By combining the batch algorithm with the kernel trick, this paper presents an improved kernel blind source separation (IKBSS) method. IKBSS achieves both better performance and lower computational complexity than the original kernel blind source separation (KBSS).

2017
Mehran Kafai Kave Eshghi

Kernel methods have been shown to be effective for many machine learning tasks such as classification and regression. In particular, support vector machines with the Gaussian kernel have proved to be powerful classification tools. The standard way to apply kernel methods is to use the kernel trick, where the inner product of the vectors in the feature space is computed via the kernel function. ...
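
The "inner product via the kernel function" can be verified exactly for kernels with a known finite feature map. A minimal sketch (my example, not from the paper) using the degree-2 polynomial kernel:

```python
import numpy as np

def phi(x):
    """Explicit feature map for K(x, y) = (x . y)^2 on 2-D inputs:
    phi(x) = (x1^2, x2^2, sqrt(2) x1 x2)."""
    return np.array([x[0] ** 2, x[1] ** 2, np.sqrt(2) * x[0] * x[1]])

x, y = np.array([1.0, 2.0]), np.array([3.0, 4.0])
print((x @ y) ** 2)      # kernel evaluation: 121.0
print(phi(x) @ phi(y))   # same value via the explicit map: 121.0
```

For the Gaussian kernel the implicit feature space is infinite-dimensional, so the explicit map is unavailable and the kernel trick is the only practical route.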

2009
Ratthachat Chatpatanasiri Pasakorn Tangchanachaianan Boonserm Kijsirikul

This paper focuses on the problem of kernelizing an existing supervised Mahalanobis distance learner. The paper makes the following contributions. First, three popular learners, namely "neighborhood component analysis", "large margin nearest neighbors" and "discriminant neighborhood embedding", which do not have kernel versions, are kernelized in order to improve their classification per...
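
A common route to such kernelizations is to restrict the learned linear map to the span of the training data, so that distances depend only on kernel evaluations. A minimal sketch (illustrative, not the paper's derivation; the RBF kernel and the random matrix A standing in for a learned transform are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))   # training set (hypothetical)
gamma = 0.5

def k_vec(x):
    """Kernel coordinates of x against the training set."""
    return np.exp(-gamma * ((X - x) ** 2).sum(axis=1))

A = rng.normal(size=(10, len(X)))   # a Mahalanobis learner would fit A; random here

def kernel_dist(x, y):
    """||A k(x) - A k(y)||: a Mahalanobis distance in feature space,
    computed without ever mapping x or y explicitly."""
    diff = A @ (k_vec(x) - k_vec(y))
    return float(np.sqrt(diff @ diff))

print(kernel_dist(rng.normal(size=3), rng.normal(size=3)))
```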

2010
Yutian Chen Max Welling Alexander J. Smola

We extend the herding algorithm to continuous spaces by using the kernel trick. The resulting "kernel herding" algorithm is an infinite-memory deterministic process that learns to approximate a PDF with a collection of samples. We show that kernel herding decreases the error of expectations of functions in the Hilbert space at a rate O(1/T), which is much faster than the usual O(1/√T) for iid ...
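
The greedy rule behind kernel herding can be run over a finite candidate pool with an empirical target distribution. A minimal sketch (a simplification of the paper's algorithm; the pool, RBF kernel, and sample count are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
pool = rng.normal(size=(500, 2))                   # candidates; also the target p
sq = ((pool[:, None, :] - pool[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)                              # RBF Gram matrix
mu_p = K.mean(axis=1)                              # kernel mean embedding of p

chosen, k_sum = [], np.zeros(len(pool))            # k_sum[x] = sum_s k(x, x_s)
for t in range(20):
    # pick x maximizing mu_p(x) - (1 / (t + 1)) * sum_s k(x, x_s)
    i = int((mu_p - k_sum / (t + 1)).argmax())
    chosen.append(i)
    k_sum += K[:, i]

print(pool[chosen][:5])   # "super-samples" whose feature-space mean tracks mu_p
```

The deterministic, negatively autocorrelated choices are what buy the O(1/T) rate over the O(1/√T) of iid sampling.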

Journal: International Journal of Computer Applications 2015
