Search results for: logarithmic kernel
Number of results: 69050
In this paper we present a method of studying a convolution operator under the Sonin conditions imposed on its kernel. A particular case of the kernel is that of the Riemann–Liouville fractional integral operator; other kernel types include Bessel-type functions, functions with power-logarithmic singularities at the origin, etc. We pay special attention to the study of kernels close to power-type functions. Our main aim is the Sonin–Abel eq...
Novelty is an important psychological construct that affects both perceptual and behavioral processes. Here, we propose a lexical novelty score (LNS) for a song’s lyric, based on the statistical properties of a corpus of 275,905 lyrics (available at www.smcnus.org/lyrics/). A lyric-level LNS was derived as a function of the inverse document frequencies of its unique words. An artist-level LNS w...
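The snippet does not give the authors' exact aggregation, so as one plausible reading, here is a minimal sketch assuming the lyric-level LNS is the mean inverse document frequency of a lyric's unique words; the function names and the toy corpus are hypothetical:

```python
import math
from collections import Counter

def idf_table(corpus):
    """Inverse document frequency of each word over a corpus of documents."""
    n_docs = len(corpus)
    df = Counter()                      # document frequency per word
    for doc in corpus:
        df.update(set(doc.lower().split()))
    return {w: math.log(n_docs / c) for w, c in df.items()}

def lyric_novelty(lyric, idf):
    """Mean IDF over the lyric's unique words (hypothetical aggregation)."""
    words = set(lyric.lower().split())
    known = [idf[w] for w in words if w in idf]
    return sum(known) / len(known) if known else 0.0
```

Under this reading, a lyric built from corpus-rare words scores higher than one built from ubiquitous words, which matches the intuition of novelty the abstract describes.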
This paper presents a novel learning scenario which combines dimensionality reduction, supervised learning, and kernel selection. We carefully define the hypothesis class that addresses this setting and provide an analysis of its Rademacher complexity and thereby provide generalization guarantees. The proposed algorithm uses KPCA to reduce the dimensionality of the feature space, i.e. by ...
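The KPCA stage mentioned above can be sketched from a Gram matrix alone. A minimal numpy version, assuming the standard double-centering construction; the helper names are hypothetical and the paper's kernel-selection step is not shown in the snippet:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix of the RBF kernel exp(-gamma * ||x - x'||^2)."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq)

def kpca_features(K, k):
    """Top-k kernel-PCA coordinates from a Gram matrix (minimal sketch)."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc = H @ K @ H                        # double-centered kernel
    w, V = np.linalg.eigh(Kc)             # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]         # pick the top-k components
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

The returned coordinates can then be fed to any supervised learner; taking all n components reproduces the centered Gram matrix exactly, so k controls the approximation.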
Heat-kernel expansion and zeta function regularisation are discussed for Laplace-type operators with discrete spectrum on non-compact domains. Since a general theory is lacking, the heat-kernel expansion is investigated by means of several examples. Generically, it is pointed out that for a class of exponential (analytic) interactions, the non-compactness of the domain gives rise to logarithmic...
Kernel ridge regression (KRR) is a standard method for performing non-parametric regression over reproducing kernel Hilbert spaces. Given n samples, the time and space complexity of computing the KRR estimate scale as O(n³) and O(n²) respectively, which is prohibitive in many cases. We propose approximations of KRR based on m-dimensional randomized sketches of the kernel matrix, and study how ...
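The O(n³)/O(n²) costs come from solving the n-by-n system (K + lam*I) alpha = y. A minimal sketch contrasting that exact solve with a sketched variant that restricts alpha = S.T @ beta for a random m-by-n matrix S; the Gaussian sketch here is a stand-in assumption, since the snippet does not specify the paper's sketch constructions:

```python
import numpy as np

def rbf_gram(X, gamma):
    """Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq)

def krr_exact(K, y, lam):
    """Exact KRR coefficients: solve (K + lam*I) alpha = y, an O(n^3) step."""
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def krr_sketched(K, y, lam, m, rng):
    """Sketched KRR: restrict alpha = S.T @ beta with a Gaussian sketch S (m x n),
    so only an m x m system is solved (hypothetical stand-in for the paper's sketches)."""
    n = len(y)
    S = rng.standard_normal((m, n)) / np.sqrt(m)
    KS = K @ S.T                                   # n x m
    A = S @ K @ KS + lam * (S @ KS)                # m x m normal-equations matrix
    beta = np.linalg.solve(A, S @ (K @ y))
    return S.T @ beta                              # back to n-dimensional alpha
```

With m = n the sketched coefficients coincide with the exact ones (for an invertible S); the savings appear when m is much smaller than n, trading accuracy for an O(n²m) cost.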
We propose two new generalization error bounds for multiple kernel learning (MKL). First, using the bound of Srebro and Ben-David (2006) as a starting point, we derive a new version which uses a simple counting argument for the choice of kernels in order to generate a tighter bound when 1-norm regularization (sparsity) is imposed in the kernel learning problem. The second bound is a Rademacher c...