Search results for: reproducing kernel
Number of results: 59574
Conditionally positive definite kernels provide a powerful tool for scattered data approximation. Many nice properties of such methods follow from an underlying reproducing kernel structure. While the connection between positive definite kernels and reproducing kernel Hilbert spaces is well understood, the analog relation between conditionally positive definite kernels and reproducing kernel Po...
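As a minimal numerical illustration of the property discussed above (a sketch, not the paper's construction): a kernel F is conditionally positive definite of order 1 when the quadratic form Σᵢⱼ cᵢcⱼ F(xᵢ, xⱼ) is nonnegative for every coefficient vector with Σᵢ cᵢ = 0. The negative multiquadric used below is a standard example of such a kernel; the point set, seed, and tolerance are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8)          # scattered 1-D data sites (illustrative)

# Negative multiquadric: conditionally positive definite of order 1
K = -np.sqrt(1.0 + (x[:, None] - x[None, :]) ** 2)

# Check the quadratic form c^T K c on coefficients with sum(c) = 0
for _ in range(1000):
    c = rng.standard_normal(8)
    c -= c.mean()                   # enforce the moment condition sum(c) = 0
    assert c @ K @ c >= -1e-10      # nonnegative up to rounding error
```

For an unconditionally positive definite kernel the constraint Σ cᵢ = 0 could be dropped; here it is essential, since the diagonal of K is negative.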
The reproducing kernel function of a weighted Bergman space over domains in C is known explicitly in only a small number of instances. Here, we introduce a process of orthogonal norm expansion along a subvariety of codimension 1, which also leads to a series expansion of the reproducing kernel in terms of reproducing kernels defined on the subvariety. The problem of finding the reproducing kern...
In this paper, we consider sampling and reconstruction of signals in a reproducing kernel subspace of Lᵖ(ℝ), 1 ≤ p ≤ ∞, associated with an idempotent integral operator whose kernel has certain off-diagonal decay and regularity. The space of p-integrable non-uniform splines and the shift-invariant spaces generated by finitely many localized functions are our model examples of such reproducing ker...
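The abstract above treats general reproducing kernel subspaces of Lᵖ(ℝ); as a toy numerical sketch in the same spirit (not the paper's operator-theoretic method), one can reconstruct a signal in the shift-invariant space generated by the cubic B-spline from its integer samples, where sampling reduces to a tridiagonal linear system. Sizes and the seed are illustrative assumptions.

```python
import numpy as np

def b3(t):
    # Centered cubic B-spline: 2/3 - t^2 + |t|^3/2 on [0,1), (2-|t|)^3/6 on [1,2)
    t = np.abs(t)
    out = np.zeros_like(t)
    m = t < 1
    out[m] = 2 / 3 - t[m] ** 2 + t[m] ** 3 / 2
    m = (t >= 1) & (t < 2)
    out[m] = (2 - t[m]) ** 3 / 6
    return out

rng = np.random.default_rng(4)
n = 50
c_true = rng.standard_normal(n)     # spline coefficients of the signal

# Integer samples of f(x) = sum_k c_k B3(x - k): since B3(0) = 2/3 and
# B3(+-1) = 1/6, sampling is convolution with the mask [1/6, 2/3, 1/6]
A = (np.diag(np.full(n, 2 / 3)) +
     np.diag(np.full(n - 1, 1 / 6), 1) +
     np.diag(np.full(n - 1, 1 / 6), -1))
samples = A @ c_true

# Reconstruction: invert the (tridiagonal) sampling operator
c_rec = np.linalg.solve(A, samples)
assert np.allclose(c_rec, c_true)
```

The invertibility of this sampling operator is a finite-dimensional stand-in for the stable-sampling conditions studied in such spaces.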
Support vector regression (SVR) has been regarded as a state-of-the-art method for approximation and regression. The importance of the kernel function, the so-called admissible support vector kernel (SV kernel) in SVR, has motivated many studies on its composition. The Gaussian kernel (RBF) is regarded as the "best" choice of SV kernel by non-experts in SVR, whereas there is no evidence, exc...
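One way to see why the Gaussian kernel is admissible as an SV kernel is Mercer's condition: every kernel matrix it generates on a finite point set must be positive semidefinite. A minimal numerical check (the point set, the width parameter gamma, and the tolerance are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 3))    # 20 points in R^3 (illustrative)
gamma = 0.5                         # RBF width parameter (assumed)

# Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-gamma * sq)

# Mercer's condition: the kernel matrix is positive semidefinite,
# so all eigenvalues are nonnegative up to rounding error
eigvals = np.linalg.eigvalsh(K)
assert eigvals.min() >= -1e-10
```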
This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and the capacity of the reproducing kernel Hilbert...
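The regularized least-square algorithm in an RKHS has a closed form via the representer theorem: the minimizer of (1/n) Σᵢ (f(xᵢ) − yᵢ)² + λ‖f‖²_K is f = Σᵢ cᵢ k(·, xᵢ), with coefficients solving (K + nλI)c = y. A self-contained sketch of this standard algorithm (the Gaussian kernel, λ, and the synthetic data are illustrative assumptions, not the paper's setting):

```python
import numpy as np

def gauss_kernel(a, b, gamma=10.0):
    # Gaussian kernel matrix between 1-D point sets a and b
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 40))
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(40)

lam = 1e-3                          # regularization parameter (assumed)
K = gauss_kernel(x, x)
# Representer theorem: solve (K + n*lam*I) c = y for the coefficients
c = np.linalg.solve(K + len(x) * lam * np.eye(len(x)), y)

# Evaluate the estimator f(t) = sum_i c_i k(t, x_i) away from the boundary
x_test = np.linspace(0.1, 0.9, 9)
f_test = gauss_kernel(x_test, x) @ c
err = np.max(np.abs(f_test - np.sin(2 * np.pi * x_test)))
```

Increasing λ smooths the estimator at the cost of bias; the learning rates analyzed in the abstract quantify this trade-off as n grows.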
In this paper, we present a new method for solving Reproducing Kernel Space (RKS) theory, and iterative algorithm for solving Generalized Burgers Equation (GBE) is presented. The analytical solution is shown in a series in a RKS, and the approximate solution u(x,t) is constructed by truncating the series. The convergence of u(x,t) to the analytical solution is also proved.
P(α) = C(α, F(x,y)) = α²F(x,x) + 2αF(x,y)² + F(x,y)²F(y,y), which is ≥ 0. In the case F(x,x) = 0, the fact that P ≥ 0 implies that F(x,y) = 0. In the case F(x,y) ≠ 0, P(α) is a quadratic polynomial, and because P ≥ 0 it follows that the discriminant of P is ≤ 0: 4F(x,y)⁴ − 4 · F(x,x) · F(x,y)²F(y,y) ≤ 0. That is, F(x,y)⁴ ≤ F(x,y)²F(x,x)F(y,y), and this implies that F ...
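The Cauchy–Schwarz-type inequality F(x,y)² ≤ F(x,x)F(y,y) derived above can be checked numerically. A small sketch using the linear kernel F(x,y) = ⟨x,y⟩, a basic positive definite example (the dimension, seed, and tolerance are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

def F(u, v):
    # Linear kernel: a simple positive definite kernel on R^d
    return float(u @ v)

# For a positive definite kernel, F(x,y)^2 <= F(x,x) * F(y,y)
for _ in range(1000):
    x, y = rng.standard_normal((2, 5))
    assert F(x, y) ** 2 <= F(x, x) * F(y, y) + 1e-12
```

For the linear kernel this is exactly the classical Cauchy–Schwarz inequality for the Euclidean inner product.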
Let κ be a U-invariant reproducing kernel and let H(κ) denote the reproducing kernel Hilbert C[z1, . . . , zd]-module associated with the kernel κ. Let Mz denote the d-tuple of multiplication operators Mz1, . . . , Mzd on H(κ). For a positive integer ν and a d-tuple T = (T1, . . . , Td), consider the defect operator