Comparison between multi-task and single-task oracle risks in kernel ridge regression

Author

  • Matthieu Solnon
Abstract

In this paper we study multi-task kernel ridge regression and try to understand when the multi-task procedure performs better than the single-task one, in terms of averaged quadratic risk. In order to do so, we compare the risks of the estimators under perfect calibration, that is, the oracle risks. We are able to give explicit settings, favorable to the multi-task procedure, where the multi-task oracle performs better than the single-task one. In situations where the multi-task procedure is conjectured to perform badly, we also show that the oracle does so. We then complete our study with simulated examples, where we can compare both oracle risks in more natural situations. A consequence of our result is that, in favorable situations, the multi-task ridge estimator has a lower risk than any single-task estimator. MSC 2010 subject classifications: Primary 62H05; secondary 62C25, 62G08, 62J07, 68Q32.
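
As a rough, self-contained illustration of the quantities being compared (not an implementation of the paper's construction), the following NumPy sketch simulates a small fixed-design problem with p related tasks, computes the single-task oracle risk by a grid search over each task's ridge parameter, and computes a multi-task oracle risk for a stacked kernel ridge estimator built from a separable kernel kron(B, G). The Gaussian kernel, the task-similarity matrix B, and all constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed-design setup: p related tasks observed at the same n design points.
n, p, sigma = 50, 4, 0.5
x = np.sort(rng.uniform(0.0, 1.0, n))

common = np.sin(2 * np.pi * x)                      # signal shared by all tasks
F_true = np.column_stack(
    [common + 0.1 * rng.standard_normal() * np.cos(2 * np.pi * x) for _ in range(p)]
)                                                   # small task-specific deviations
Y = F_true + sigma * rng.standard_normal((n, p))    # noisy observations

# Gaussian kernel Gram matrix on the design points.
G = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * 0.1 ** 2))

def ridge_fit(K, y, lam):
    """Kernel ridge predictions at the design points: K (K + m*lam*I)^{-1} y."""
    m = K.shape[0]
    return K @ np.linalg.solve(K + m * lam * np.eye(m), y)

lambdas = np.logspace(-6, 1, 30)

# Single-task oracle risk: best lambda per task, chosen with the (known) truth.
single_oracle = np.mean([
    min(np.mean((ridge_fit(G, Y[:, j], lam) - F_true[:, j]) ** 2) for lam in lambdas)
    for j in range(p)
])

# Multi-task oracle risk: ridge on the stacked data with a separable kernel
# kron(B, G), where B interpolates between independent tasks (alpha = 0)
# and one pooled task (alpha = 1); the oracle picks the best (alpha, lambda).
y_stack, f_stack = Y.T.reshape(-1), F_true.T.reshape(-1)
multi_oracle = np.inf
for alpha in np.linspace(0.0, 1.0, 11):
    B = alpha * np.ones((p, p)) / p + (1 - alpha) * np.eye(p)
    K_big = np.kron(B, G)                           # (n*p) x (n*p) multi-task Gram matrix
    for lam in lambdas:
        risk = np.mean((ridge_fit(K_big, y_stack, lam) - f_stack) ** 2)
        multi_oracle = min(multi_oracle, risk)

print(f"single-task oracle risk: {single_oracle:.4f}")
print(f"multi-task  oracle risk: {multi_oracle:.4f}")
```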


Similar references

Analysis of the oracle risk in multi-task ridge regression

In this talk we will present new results obtained in the multiple kernel ridge regression framework, which we refer to as multi-task regression. Multi-task techniques come into play when experimental limitations make it impossible to increase the sample size n, which is the classical way to improve the performance of the estimators. However, it is often possible to have access to other closel...


Multi-task Regression using Minimal Penalties

In this paper we study the kernel multiple ridge regression framework, which we refer to as multi-task regression, using penalization techniques. The theoretical analysis of this problem shows that the key element for an optimal calibration is the covariance matrix of the noise between the different tasks. We present a new algorithm to estimate this covariance matrix, based on the con...
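
The key object above is the covariance matrix of the noise between tasks. Purely as a point of reference, a naive residual-based plug-in estimate of that matrix (not the minimal-penalty algorithm the paper introduces) might look like the sketch below, where Y is the n x p matrix of task responses and G the kernel Gram matrix; the default regularization lam is an arbitrary assumption.

```python
import numpy as np

def naive_noise_covariance(Y, G, lam=1e-2):
    """Rough p x p estimate of the between-task noise covariance.

    Fits an independent kernel ridge smoother per task (Gram matrix G,
    regularization lam) and takes the empirical covariance of the residuals,
    rescaled by the residual degrees of freedom.  This is a naive plug-in,
    not the minimal-penalty calibration developed in the paper.
    """
    n = Y.shape[0]
    S = G @ np.linalg.inv(G + n * lam * np.eye(n))   # ridge smoother matrix
    residuals = Y - S @ Y                            # n x p matrix of residuals
    dof = n - np.trace(S)                            # residual degrees of freedom
    return residuals.T @ residuals / dof
```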


Stability of Multi-Task Kernel Regression Algorithms

We study the stability properties of nonlinear multi-task regression in reproducing kernel Hilbert spaces with operator-valued kernels. Such kernels, a.k.a. multi-task kernels, are appropriate for learning problems with nonscalar outputs like multi-task learning and structured output prediction. We show that multi-task kernel regression algorithms are uniformly stable in the general case of infinite-d...



Kernel PCA for Feature Extraction and De-Noising in Nonlinear Regression

In this paper, we propose the application of the Kernel Principal Component Analysis (PCA) technique for feature selection in a high-dimensional feature space, where input variables are mapped by a Gaussian kernel. The extracted features are employed in the regression problems of chaotic Mackey–Glass time-series prediction in a noisy environment and estimating huma...
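
A minimal sketch of that pipeline (Gaussian-kernel PCA features fed into a regression) on a synthetic problem could look as follows; the dataset, the number of retained components, the kernel width gamma, and the ridge penalty are illustrative assumptions rather than the paper's experimental setup.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic noisy nonlinear regression problem (stand-in for the paper's data).
X = rng.uniform(-3, 3, size=(400, 5))
y = np.sin(X[:, 0]) + 0.5 * np.cos(X[:, 1] * X[:, 2]) + 0.1 * rng.standard_normal(400)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Extract features with Gaussian (RBF) kernel PCA, then regress on them.
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=0.2)
Z_train = kpca.fit_transform(X_train)
Z_test = kpca.transform(X_test)

model = Ridge(alpha=1.0).fit(Z_train, y_train)
print("test R^2 on kernel-PCA features:", model.score(Z_test, y_test))
```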



Publication date: 2013