Error analysis of regularized least-square regression with Fredholm kernel
Authors
Abstract
Learning with the Fredholm kernel has attracted increasing attention recently, since it can effectively exploit the data information to improve prediction performance. Despite rapid progress on theoretical and experimental evaluations, its generalization analysis has not been explored in the learning theory literature. In this paper, we establish the generalization bound of least square regularized ...
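For orientation, the scheme the abstract refers to can be written schematically as follows, assuming the standard Tikhonov form of least-square regularized regression and one commonly used sample-based discretization of the Fredholm kernel; the exact construction and normalization used in the paper may differ, and the truncated abstract does not specify them.

\[
f_{\mathbf z} \;=\; \operatorname*{arg\,min}_{f\in\mathcal H_{K_F}}\;\frac{1}{m}\sum_{i=1}^{m}\bigl(f(x_i)-y_i\bigr)^2+\lambda\,\|f\|_{K_F}^2,
\qquad
K_F(x,z)\;=\;\frac{1}{u^2}\sum_{s,t=1}^{u}k(x,v_s)\,K(v_s,v_t)\,k(v_t,z),
\]

where \(k\) is an "outer" kernel, \(K\) an "inner" kernel, and \(v_1,\dots,v_u\) are (possibly unlabeled) sample points. The data dependence of \(K_F\) is what allows the method to "exploit the data information" mentioned above.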
Similar resources

Reproducing Kernel Banach Spaces with the ℓ1 Norm II: Error Analysis for Regularized Least Square Regression
A typical approach in estimating the learning rate of a regularized learning scheme is to bound the approximation error by the sum of the sampling error, the hypothesis error and the regularization error. Using a reproducing kernel space that satisfies the linear representer theorem brings the advantage of discarding the hypothesis error from the sum automatically. Following this direction, we ...
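The decomposition described in this snippet can be stated schematically as follows (the notation is ours, not the paper's):

\[
\mathcal E(f_{\mathbf z,\lambda})-\mathcal E(f_\rho)
\;\le\;
\underbrace{\mathcal S(\mathbf z,\lambda)}_{\text{sampling error}}
+\underbrace{\mathcal P(\mathbf z,\lambda)}_{\text{hypothesis error}}
+\underbrace{\mathcal D(\lambda)}_{\text{regularization error}},
\qquad
\mathcal D(\lambda)=\inf_{f\in\mathcal B}\bigl\{\mathcal E(f)-\mathcal E(f_\rho)+\lambda\,\Omega(f)\bigr\}.
\]

When the hypothesis space admits a linear representer theorem, the minimizer of the regularized empirical risk already lies in the finite-dimensional span used for the comparison, so the hypothesis error \(\mathcal P(\mathbf z,\lambda)\) drops out and only the sampling and regularization terms remain to be bounded.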
Regularized Kernel Recursive Least Square Algorithm
In most adaptive signal processing applications, system linearity is assumed and adaptive linear filters are thus used. The traditional class of supervised adaptive filters relies on error-correction learning for its adaptive capability. The kernel method is a powerful nonparametric modeling tool for pattern analysis and statistical signal processing. Through a nonlinear mapping, kernel methods...
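As a concrete illustration of the technique this snippet describes, here is a minimal sketch of a regularized kernel recursive least squares filter. It implements the standard growing-dictionary recursion that maintains Q_t = (K_t + lambda*I)^{-1} through a block-matrix inverse update; the class name, the Gaussian kernel, and the parameter defaults are illustrative assumptions, not the cited paper's exact algorithm.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    """Gaussian (RBF) kernel between two input vectors."""
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma ** 2))

class RegularizedKRLS:
    """Regularized kernel recursive least squares (minimal sketch).

    Maintains Q_t = (K_t + lam * I)^{-1} and the coefficient vector
    alpha_t = Q_t * y_t, updating both with a block-inverse step as each
    new sample arrives (no sparsification or forgetting factor).
    """

    def __init__(self, kernel=gaussian_kernel, lam=0.1):
        self.kernel = kernel
        self.lam = lam
        self.X, self.Q, self.alpha = [], None, None

    def update(self, x, y):
        if self.Q is None:                            # first sample
            k_xx = self.kernel(x, x) + self.lam
            self.Q = np.array([[1.0 / k_xx]])
            self.alpha = np.array([y / k_xx])
            self.X.append(x)
            return
        k = np.array([self.kernel(xi, x) for xi in self.X])
        z = self.Q @ k                                # Q_t k
        r = self.kernel(x, x) + self.lam - k @ z      # Schur complement
        e = y - k @ self.alpha                        # a priori prediction error
        # Block-matrix inverse update of (K + lam*I)^{-1}
        self.Q = np.block([[self.Q + np.outer(z, z) / r, -z[:, None] / r],
                           [-z[None, :] / r,             np.array([[1.0 / r]])]])
        self.alpha = np.append(self.alpha - z * e / r, e / r)
        self.X.append(x)

    def predict(self, x):
        """Evaluate f(x) = sum_i alpha_i * k(x_i, x)."""
        return sum(a * self.kernel(xi, x) for a, xi in zip(self.alpha, self.X))
```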
Learning Rates of Least-Square Regularized Regression
This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and the capacity of the reproducing kernel Hilbert...
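For reference, a minimal sketch of the regularized learning algorithm this abstract analyzes: the least-square loss plus a squared RKHS-norm penalty, solved in closed form via the representer theorem. The Gaussian kernel, the m*lam scaling of the ridge term, and the function names are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def kernel_ridge_fit(X, y, lam, sigma=1.0):
    """Least-square regularized regression in the RKHS of a Gaussian kernel.

    Minimizes (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.  By the
    representer theorem the minimizer is f = sum_i alpha_i K(x_i, .), with
    alpha solving (K + m * lam * I) alpha = y.
    """
    m = X.shape[0]
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))
    return np.linalg.solve(K + m * lam * np.eye(m), y)

def kernel_ridge_predict(X_train, alpha, X_new, sigma=1.0):
    """Evaluate the fitted regression function at new input points."""
    sq_dists = np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2)) @ alpha
```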
Image classification using kernel collaborative representation with regularized least square
Sparse representation based classification (SRC) has received much attention in computer vision and pattern recognition. SRC codes a testing sample by a sparse linear combination of all the training samples and classifies the testing sample into the class with the minimum representation error. Recently, Zhang analyzed the working mechanism of SRC and pointed out that it is the collaborative repres...
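Below is a minimal sketch in the spirit of the kernel collaborative-representation classifier with a regularized least-square code that this snippet describes: the test sample is coded over all training samples in feature space with a ridge penalty, then assigned to the class whose partial reconstruction leaves the smallest residual. The Gaussian kernel, the normalized-residual decision rule, and all names are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian kernel matrix between the rows of A and the rows of B."""
    sq = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def kernel_crc_rls_classify(X, labels, y, lam=1e-3, sigma=1.0):
    """Kernel collaborative representation with a regularized least-square code.

    X      : (n, d) training samples (rows); labels : (n,) class labels
    y      : (d,) test sample
    Codes phi(y) over all training samples in feature space,
        alpha = (K + lam * I)^{-1} kappa,   kappa_i = k(x_i, y),
    then picks the class whose feature-space reconstruction residual,
    normalized by that class's code energy, is smallest.
    """
    labels = np.asarray(labels)
    K = rbf_kernel(X, X, sigma)                          # (n, n) Gram matrix
    kappa = rbf_kernel(X, y[None, :], sigma).ravel()     # (n,) kernel with test point
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), kappa)
    k_yy = 1.0                                           # k(y, y) = 1 for the Gaussian kernel
    best_score, best_class = np.inf, None
    for c in np.unique(labels):
        idx = labels == c
        # ||phi(y) - Phi_c alpha_c||^2 expanded with kernel evaluations
        residual = (k_yy - 2.0 * kappa[idx] @ alpha[idx]
                    + alpha[idx] @ K[np.ix_(idx, idx)] @ alpha[idx])
        score = residual / (np.linalg.norm(alpha[idx]) + 1e-12)
        if score < best_score:
            best_score, best_class = score, c
    return best_class
```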
Journal
Journal title: Neurocomputing
Year: 2017
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2017.03.076