Search results for: reproducing kernel hilbert space method
Number of results: 2,079,705
A new method for performing a kernel principal component analysis is proposed. By kernelizing the generalized Hebbian algorithm, one can iteratively estimate the principal components in a reproducing kernel Hilbert space with only linear order memory complexity. The derivation of the method and preliminary applications in image hyperresolution are presented. In addition, we discuss the extensio...
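The abstract above kernelizes the generalized Hebbian algorithm (GHA) to estimate kernel principal components iteratively. A minimal sketch of that idea, not the authors' exact algorithm: each RKHS component is stored as an expansion w_j = Σ_i A[j, i] φ(x_i), and the GHA update is applied to the coefficient matrix A. The function names, RBF kernel, and learning rate here are assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_hebbian(X, n_components=2, lr=0.05, epochs=100, gamma=1.0):
    """Sketch of a kernelized GHA: components live in the RKHS as
    expansions over the data, so only the coefficient matrix A is
    updated. (For clarity the full Gram matrix is precomputed here;
    memory-efficient variants evaluate kernel rows on the fly.)"""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # double-centre the Gram matrix (centring in feature space)
    K = K - K.mean(0) - K.mean(1)[:, None] + K.mean()
    rng = np.random.default_rng(0)
    A = rng.normal(scale=1e-3, size=(n_components, n))
    for _ in range(epochs):
        for t in rng.permutation(n):
            y = A @ K[:, t]              # projections of phi(x_t)
            e_t = np.zeros(n)
            e_t[t] = 1.0
            # GHA step: Hebbian term minus lower-triangular deflation
            A += lr * (np.outer(y, e_t) - np.tril(np.outer(y, y)) @ A)
    return A, K
```

The lower-triangular term is what orders the components and keeps them orthogonal, exactly as in the linear GHA.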
where (Ω, μ) is a probability space. The kernel K is used to generate a Hilbert space, known as a reproducing kernel Hilbert space, whose unit ball is the class of functions we investigate. Recall that if K : Ω × Ω → R is a positive definite function, then by Mercer's Theorem there is an orthonormal basis (φ_i)_{i=1}^∞ of L2(μ) such that, μ × μ almost surely, K(x, y) = Σ_{i=1}^∞ λ_i φ_i(x) φ_i(y), where (λ_i)_{i=1}^∞ is...
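The Mercer expansion quoted above can be checked empirically: on a sample x_1, ..., x_n drawn from μ, the eigenvectors of the Gram matrix approximate scaled eigenfunctions, and a short truncation of the spectral sum already reproduces K. A small numerical illustration (the Gaussian kernel and uniform measure are assumptions for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))          # sample from mu
K = np.exp(-((X - X.T) ** 2))                  # positive definite kernel
evals, evecs = np.linalg.eigh(K)
evals, evecs = evals[::-1], evecs[:, ::-1]     # sort descending
m = 20                                         # truncate the expansion
K_m = (evecs[:, :m] * evals[:m]) @ evecs[:, :m].T
err = np.abs(K - K_m).max()                    # tiny: eigenvalues decay fast
```

The rapid decay of (λ_i) for smooth kernels is what makes the unit ball of the RKHS a small, learnable function class.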
Recent advances of kernel methods have yielded a framework for representing probabilities using a reproducing kernel Hilbert space, called kernel embedding of distributions. In this paper, we propose a Monte Carlo filtering algorithm based on kernel embeddings. The proposed method is applied to state-space models where sampling from the transition model is possible, while the observation model ...
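Kernel embedding of distributions, as used in the abstract above, represents a distribution P by its mean element μ_P = E[k(X, ·)] in the RKHS. A minimal sketch of the empirical version (not the paper's filtering algorithm; kernel and bandwidth are assumptions): the squared RKHS distance between two empirical embeddings is the biased MMD estimate.

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    return np.exp(-gamma * (X[:, None] - Y[None, :]) ** 2)

def mmd2(X, Y, gamma=0.5):
    """Squared RKHS distance between the empirical kernel mean
    embeddings of samples X and Y (biased V-statistic estimate)."""
    return (rbf(X, X, gamma).mean()
            + rbf(Y, Y, gamma).mean()
            - 2 * rbf(X, Y, gamma).mean())
```

Because the embedding is injective for characteristic kernels (e.g. Gaussian), a small MMD indicates the two samples come from nearly the same distribution, which is what lets such filters compare predicted and observed state distributions.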
Performance of linear models, widely used within the framework of adaptive line enhancement (ALE), deteriorates dramatically in the presence of non-Gaussian noise. On the other hand, adaptive implementations of nonlinear models, e.g. Volterra filters, suffer from the severe problems of a large number of parameters and slow convergence. Nonetheless, kernel methods are emerging solutions t...
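The kernel methods alluded to above replace a fixed nonlinear parameterization (like Volterra filters) with online functional gradient descent in an RKHS. A minimal sketch of one such scheme, kernel LMS, which is an assumption here and not necessarily the abstract's method: the filter is f_t = Σ_{i<t} η e_i k(u_i, ·), so only past inputs and errors are stored.

```python
import numpy as np

def klms(u, d, eta=0.5, gamma=2.0):
    """Kernel LMS sketch: each step predicts d_t from the current
    expansion, then adds the input as a new centre weighted by the
    prediction error. Returns the a-priori errors."""
    centers, coeffs, errors = [], [], []
    for u_t, d_t in zip(u, d):
        if centers:
            C = np.array(centers)
            k_t = np.exp(-gamma * ((C - u_t) ** 2).sum(-1))
            y_t = np.dot(coeffs, k_t)
        else:
            y_t = 0.0
        e_t = d_t - y_t
        centers.append(u_t)
        coeffs.append(eta * e_t)
        errors.append(e_t)
    return np.array(errors)
```

Unlike Volterra filters, the number of free parameters per step is one (the new coefficient), which sidesteps the parameter-explosion problem at the cost of a growing dictionary.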
We develop an extension of the sliced inverse regression (SIR) framework for dimension reduction using kernel models and Tikhonov regularization. The result is a numerically stable nonlinear dimension reduction method. We prove consistency of the method under weak conditions even when the reproducing kernel Hilbert space induced by the kernel is infinite dimensional. We illustrate the utility o...
We consider the reproducing kernel Hilbert space Hμ induced by a kernel which is obtained using the Fourier-Stieltjes transform of a regular, positive, finite Borel measure μ on a locally compact abelian topological group Γ. Denote by G the dual of Γ. We determine Hμ as a certain subspace of the space C0(G) of all continuous functions on G vanishing at infinity. Our main application is calculati...
A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the kernel partial least squares (PLS) regression model. Similar to principal components regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the latent variables (components). However, in contrast to PCR, PLS creates the components by modeling th...
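A rough sketch of how kernel PLS extracts latent components, for a single response and without the full NIPALS inner iteration, so this is an illustrative simplification rather than the abstract's model: each score direction maximizes covariance with the response, then the Gram matrix and response are deflated.

```python
import numpy as np

def kernel_pls_components(K, y, n_components):
    """Extract kernel PLS score vectors (single-response sketch):
    t_j ∝ K y has maximal covariance with y among kernel-expansion
    directions; K and y are then deflated against t_j."""
    n = K.shape[0]
    T = np.zeros((n, n_components))
    Kc, yc = K.copy(), y.astype(float).copy()
    for j in range(n_components):
        t = Kc @ yc
        t /= np.linalg.norm(t)
        T[:, j] = t
        P = np.eye(n) - np.outer(t, t)   # projector away from t
        Kc = P @ Kc @ P                  # deflate Gram matrix
        yc = P @ yc                      # deflate response
    return T
```

This is the sense in which PLS differs from PCR: the scores are driven by covariance with the response, not by the input variance alone.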
We analyze the regularized least squares algorithm in learning theory with Reproducing Kernel Hilbert Spaces (RKHS). Explicit convergence rates for the regression and binary classification problems are obtained, in particular for the polynomial and Gaussian kernels on the n-dimensional sphere and the hypercube. There are two major ingredients in our approach: (i) a law of large numbers for Hilber...
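The regularized least squares algorithm analyzed above has a closed form via the representer theorem: the minimizer of (1/n) Σ (f(x_i) − y_i)² + λ‖f‖² over the RKHS is f(x) = Σ_i α_i k(x_i, x) with α = (K + nλI)⁻¹ y. A minimal sketch (kernel choice and parameters are assumptions, not the paper's setting):

```python
import numpy as np

def krr_fit_predict(X, y, X_test, lam=1e-4, gamma=1.0):
    """Regularized least squares in an RKHS (kernel ridge regression).
    By the representer theorem the solution is a kernel expansion over
    the training points with alpha = (K + n*lam*I)^{-1} y."""
    n = len(X)
    K = np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    K_test = np.exp(-gamma * (X_test[:, None] - X[None, :]) ** 2)
    return K_test @ alpha
```

The convergence rates in the abstract quantify how fast such estimates approach the regression function as n grows and λ is tuned, with the kernel's eigenvalue decay (polynomial vs. Gaussian) controlling the rate.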
In this paper, interpolating a curve or surface subject to linear inequality constraints is considered as a general convex optimization problem in a Reproducing Kernel Hilbert Space. We propose a new approximation method based on a discretized optimization problem in a finite-dimensional Hilbert space under the same set of constraints. We prove that the approximate solution converges uniformly to the o...