Correction to “Stability and Hypothesis Transfer Learning”

Authors

  • Ilja Kuzborskij
  • Francesco Orabona
Abstract

There is an error in "Stability and Hypothesis Transfer Learning" (Kuzborskij & Orabona, 2013), which appeared in the proceedings of ICML 2013. The Leave-One-Out generalization bound for the Hypothesis Transfer Learning algorithm based on Regularized Least Squares with biased regularization does not have the right convergence rate with respect to the regularization parameter $\lambda$ and the source risk on the target domain, $R_\mu(f')$. This erratum describes the error and proves the correct generalization guarantees. The correct rate is in $O\!\left(\frac{1}{m\lambda^{1.5}}\right)$, instead of the incorrectly claimed $O\!\left(\frac{1}{m\lambda}\right)$. However, the correct rate is still better than the usual one for Regularized Least Squares obtained via algorithmic stability analysis. Finally, the corrected analysis still preserves the main contribution, namely that the relatedness of the source and target domains accelerates the convergence of the Leave-One-Out error to the generalization error.

1. Description of Error

The error was committed in the proof of Theorem 3 (Kuzborskij & Orabona, 2013), where Lemma 2 was applied incorrectly. This rendered Theorem 2, which proves generalization guarantees for the Hypothesis Transfer Learning (HTL) algorithm analyzed, invalid. Theorem 3 proves an upper bound on hypothesis stability (Bousquet & Elisseeff, 2002) with respect to the square loss,
$$\forall i \in \{1, \ldots, m\}, \quad \mathbb{E}_{S,(\mathbf{x},y)}\left[\left|(f_S(\mathbf{x}) - y)^2 - (f_{S \setminus i}(\mathbf{x}) - y)^2\right|\right] \leq \gamma .$$
Here $(\mathbf{x}, y)$ is assumed to be any example drawn i.i.d. from the p.d.f. $\mu$. However, Lemma 2 proves an upper bound on the quantity $(f_{S \setminus i}(\mathbf{x}_i) - y_i)$, where $(\mathbf{x}_i, y_i) \in S$, in other words, belongs to the training set. That is, in Theorem 3, $(f_{S \setminus i}(\mathbf{x}_i) - y_i)$ appears instead of $(f_{S \setminus i}(\mathbf{x}) - y)$. To use the correct quantity $(f_{S \setminus i}(\mathbf{x}) - y)$, we first obtain a closed-form expression for $f_{S \setminus i}(\mathbf{x})$, considering linear hypotheses $f(\mathbf{x}) := \mathbf{x}^\top \mathbf{w}$ and $f_{S \setminus i}(\mathbf{x}) := \mathbf{x}^\top \mathbf{w}_{S \setminus i}$. The derivation is given in Lemma 1. A very similar error also appears in the first result of the proof of Lemma 4 (Kuzborskij & Orabona, 2013). The nature of the error is the same, and we fix it in the proof of Theorem 1.

Lemma 1. Let $\mathbf{w}_S$ be the hypothesis produced by the Regularized Least Squares (RLS) algorithm given the training set $S$. For any sample $(\mathbf{x}, y)$ drawn i.i.d. from $\mu$ and any $(\mathbf{x}_i, y_i) \in S$ such that $\|\mathbf{x}\|, \|\mathbf{x}_i\| \leq 1$, the hypothesis $\mathbf{w}_{S \setminus i}$ produced by the same RLS algorithm on the training set $S \setminus i$ satisfies, for all $i \in \{1, \ldots, m\}$,
$$\left|\mathbf{x}^\top \mathbf{w}_S - \mathbf{x}^\top \mathbf{w}_{S \setminus i}\right| \leq \frac{1}{m\lambda}\left|\mathbf{x}_i^\top \mathbf{w}_{S \setminus i} - y_i\right| .$$

Proof. Define $X = [\mathbf{x}_1, \ldots, \mathbf{x}_{i-1}, \mathbf{x}_{i+1}, \ldots, \mathbf{x}_m]$, $M = X^\top X + m\lambda I$, and let $\mathbf{y} := [y_1, \ldots, y_{i-1}, y_{i+1}, \ldots, y_m]^\top$. It is straightforward to see that $\mathbf{x}^\top \mathbf{w}_S$ is equal to
$$\begin{bmatrix} \mathbf{x}^\top X & \mathbf{x}^\top \mathbf{x}_i \end{bmatrix} \begin{bmatrix} M & X^\top \mathbf{x}_i \\ \mathbf{x}_i^\top X & \|\mathbf{x}_i\|^2 + m\lambda \end{bmatrix}^{-1} \begin{bmatrix} \mathbf{y} \\ y_i \end{bmatrix} . \qquad (1)$$
Expanding the middle term and using the block-wise matrix inversion property (Petersen & Pedersen, 2008), we get
$$\begin{bmatrix} M & X^\top \mathbf{x}_i \\ \mathbf{x}_i^\top X & \|\mathbf{x}_i\|^2 + m\lambda \end{bmatrix}^{-1} = \begin{bmatrix} M^{-1} & \mathbf{0} \\ \mathbf{0}^\top & 0 \end{bmatrix} + \frac{1}{a}\begin{bmatrix} M^{-1} X^\top \mathbf{x}_i \\ -1 \end{bmatrix}\begin{bmatrix} \mathbf{x}_i^\top X M^{-1} & -1 \end{bmatrix},$$
where $a := \|\mathbf{x}_i\|^2 + m\lambda - \mathbf{x}_i^\top X M^{-1} X^\top \mathbf{x}_i$. Plugging this result into (1) yields
$$\mathbf{x}^\top \mathbf{w}_S = \mathbf{x}^\top \mathbf{w}_{S \setminus i} + \frac{\mathbf{x}^\top \left(I - X M^{-1} X^\top\right)\mathbf{x}_i}{a}\left(y_i - \mathbf{x}_i^\top \mathbf{w}_{S \setminus i}\right).$$
Using the result of Lemma 2, we have $m\lambda \leq a$, and in addition, by the Cauchy-Schwarz inequality, $\mathbf{x}^\top\left(I - X M^{-1} X^\top\right)\mathbf{x}_i \leq 1$, since $\|\mathbf{x}\|, \|\mathbf{x}_i\| \leq 1$.

Lemma 2. For all $X \in \mathbb{R}^{m \times d}$ and $m, \lambda \geq 0$, the matrix $I - X\left(X^\top X + m\lambda I\right)^{-1} X^\top$ is positive semi-definite and its maximum eigenvalue is at most 1.
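The algebraic identities used in the proof of Lemma 1 are easy to check numerically. The sketch below is not part of the erratum; it is a minimal NumPy verification in which the dimensions m and d, the value of λ, the held-out index i, and the random data are illustrative assumptions. It checks the block-inversion expansion of $\mathbf{x}^\top \mathbf{w}_S$, the resulting $\frac{1}{m\lambda}$ bound of Lemma 1, and the eigenvalue claim of Lemma 2.

```python
# Minimal numerical sanity check (illustrative, not from the paper).
import numpy as np

rng = np.random.default_rng(0)
m, d, lam, i = 30, 5, 0.1, 7          # illustrative sizes, lambda, held-out index

# Training set with ||x_j|| <= 1 and a fresh test point with ||x|| <= 1.
X_full = rng.standard_normal((m, d))
X_full /= np.maximum(1.0, np.linalg.norm(X_full, axis=1, keepdims=True))
y_full = rng.standard_normal(m)
x = rng.standard_normal(d)
x /= max(1.0, np.linalg.norm(x))

def rls(X, y):
    # RLS with the unnormalized regularizer m*lam*||w||^2, as in the proof of Lemma 1.
    return np.linalg.solve(X.T @ X + m * lam * np.eye(d), X.T @ y)

w_S = rls(X_full, y_full)              # trained on all m points
X_loo = np.delete(X_full, i, axis=0)   # training set S \ i (rows are samples)
y_loo = np.delete(y_full, i)
w_loo = rls(X_loo, y_loo)
x_i, y_i = X_full[i], y_full[i]

# Closed form from the proof of Lemma 1 (rows-as-samples convention):
#   x' w_S = x' w_loo + x' (I - X M^{-1} X') x_i (y_i - x_i' w_loo) / a
K = X_loo @ X_loo.T + m * lam * np.eye(m - 1)   # Gram form of M
P = X_loo.T @ np.linalg.solve(K, X_loo)         # d x d matrix X M^{-1} X'
a = x_i @ x_i + m * lam - x_i @ P @ x_i
rhs = x @ w_loo + (x @ (np.eye(d) - P) @ x_i) * (y_i - x_i @ w_loo) / a
assert np.isclose(x @ w_S, rhs)        # the block-inversion expansion holds
assert a >= m * lam                    # consequence of Lemma 2

# Lemma 1's bound: |x' w_S - x' w_loo| <= |x_i' w_loo - y_i| / (m * lam)
assert abs(x @ w_S - x @ w_loo) <= abs(x_i @ w_loo - y_i) / (m * lam)

# Lemma 2: I - X (X'X + m*lam*I)^{-1} X' is PSD with maximum eigenvalue <= 1.
B = np.eye(m) - X_full @ np.linalg.solve(
    X_full.T @ X_full + m * lam * np.eye(d), X_full.T)
eigs = np.linalg.eigvalsh(B)
assert eigs.min() >= -1e-10 and eigs.max() <= 1 + 1e-10
print("all checks passed")
```

Since the expansion is an exact algebraic identity, the first check holds (up to floating-point error) for any data; the two bound checks additionally use only the assumptions ‖x‖, ‖x_i‖ ≤ 1 and λ > 0.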
