Regression with quadratic loss

Author

  • Maxim Raginsky
Abstract

Regression with quadratic loss is another basic problem studied in statistical learning theory. We have a random couple Z = (X, Y), where, as before, X is an R^d-valued feature vector (or input vector) and Y is the real-valued response (or output). We assume that the unknown joint distribution P = P_Z = P_XY of (X, Y) belongs to some class P of probability distributions over R^d × R. The learning problem, then, is to produce a predictor of Y given X on the basis of an i.i.d. training sample Z^n = (Z_1, . . . , Z_n) = ((X_1, Y_1), . . . , (X_n, Y_n)) from P. A predictor is just a (measurable) function f : R^d → R, and we evaluate its performance by the expected quadratic loss L(f) := E[(Y − f(X))^2]. As we have seen before, the smallest expected loss is achieved by the regression function f*(x) := E[Y | X = x], i.e., L* := inf_f L(f) = L(f*) = E[(Y − E[Y | X])^2].
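As a concrete illustration of these definitions (not part of the original abstract), the following Python sketch estimates the expected quadratic loss by Monte Carlo on a synthetic model in which the regression function is known in closed form; the data-generating model and all constants are assumptions chosen for the example.

```python
# Illustrative sketch: estimate L(f) = E[(Y - f(X))^2] by Monte Carlo on a
# synthetic problem where the regression function f*(x) = E[Y | X = x] is known.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Assumed synthetic model: Y = sin(X) + noise, so f*(x) = sin(x) and L* = Var(noise).
X = rng.uniform(-3.0, 3.0, size=n)
Y = np.sin(X) + rng.normal(scale=0.3, size=n)

def quadratic_risk(f, X, Y):
    """Empirical estimate of the expected quadratic loss L(f)."""
    return np.mean((Y - f(X)) ** 2)

f_star = np.sin                       # the regression function for this model
f_naive = lambda x: np.zeros_like(x)  # a crude competitor predictor

print("L(f*) ~", quadratic_risk(f_star, X, Y))   # close to 0.3^2 = 0.09
print("L(0)  ~", quadratic_risk(f_naive, X, Y))  # strictly larger
```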


Similar resources

Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil

In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model, using 106 years of real oil price data (1913 to 2018), with a focus on the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling model under the quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
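For readers unfamiliar with the losses named above, here is a minimal Python sketch comparing the quadratic loss with the standard LINEX parameterization b(e^(a·d) − a·d − 1); the paper's exact constants, priors, and DLM setup are not reproduced here, and the parameter values are assumptions.

```python
# Sketch of two of the loss functions named above, under the standard LINEX form.
import numpy as np

def quadratic_loss(d):
    return d ** 2

def linex_loss(d, a=1.0, b=1.0):
    # Asymmetric: over- and under-prediction are penalized differently.
    return b * (np.exp(a * d) - a * d - 1.0)

errors = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])  # hypothetical forecast errors
print("quadratic:", quadratic_loss(errors))
print("LINEX    :", linex_loss(errors))
```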


Sex difference of biological variation of BMI and waist circumference with age in Iranian adults: A predictive regression model

Background and Objectives: The biological variation of body mass index (BMI) and waist circumference (WC) with age may vary by gender. The objective of this study was to investigate the functional relationship of anthropometric measures with age and sex. Methods: The data were collected from a population-based cross-sectional study of 1800 men and 1800 women aged 20-70 years in...


Learning Rates for Classification with Gaussian Kernels

This letter aims at a refined error analysis for binary classification using a support vector machine (SVM) with a Gaussian kernel and a convex loss. Our first result shows that for some loss functions, such as the truncated quadratic loss and the quadratic loss, the SVM with Gaussian kernel can reach the almost optimal learning rate provided the regression function is smooth. Our second result shows that for ...
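The quadratic-loss kernel method discussed in this abstract can be approximated in practice by regularized least squares with a Gaussian (RBF) kernel. The sketch below uses scikit-learn's KernelRidge on a toy classification problem purely as an illustration, not as the letter's exact estimator or its learning-rate analysis; the data, kernel width, and regularization strength are assumptions.

```python
# Hedged illustration: quadratic-loss fit with a Gaussian (RBF) kernel, then
# classify by the sign of the fitted regressor.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(500, 2))
y = np.where(X[:, 0] ** 2 + X[:, 1] ** 2 < 4.0, 1.0, -1.0)  # circular boundary

model = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-2).fit(X, y)
y_pred = np.sign(model.predict(X))
print("training accuracy:", np.mean(y_pred == y))
```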


A Unified Loss Function in Bayesian Framework for Support Vector Regression

In this paper, we propose a unified non-quadratic loss function for regression known as the soft insensitive loss function (SILF). SILF is a flexible model and possesses most of the desirable characteristics of popular non-quadratic loss functions, such as the Laplacian, Huber's, and Vapnik's ε-insensitive loss functions. We describe the properties of SILF and illustrate our assumption on the underlying ...
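SILF itself is not reproduced here; the sketch below only implements the standard textbook definitions of the losses it is said to generalize (Laplacian, Huber's, and Vapnik's ε-insensitive), with hypothetical parameter values.

```python
# Standard reference losses mentioned above (not the SILF definition itself).
import numpy as np

def laplacian_loss(d):
    return np.abs(d)

def huber_loss(d, delta=1.0):
    quad = 0.5 * d ** 2
    lin = delta * (np.abs(d) - 0.5 * delta)
    return np.where(np.abs(d) <= delta, quad, lin)

def eps_insensitive_loss(d, eps=0.1):
    return np.maximum(0.0, np.abs(d) - eps)

residuals = np.linspace(-2, 2, 9)
print(laplacian_loss(residuals))
print(huber_loss(residuals))
print(eps_insensitive_loss(residuals))
```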


Risk Bounds and Learning Algorithms for the Regression Approach to Structured Output Prediction

We provide rigorous guarantees for the regression approach to structured output prediction. We show that the quadratic regression loss is a convex surrogate of the prediction loss when the output kernel satisfies some condition with respect to the prediction loss. We provide two upper bounds of the prediction risk that depend on the empirical quadratic risk of the predictor. The minimizer of th...
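As a rough, hypothetical illustration of a regression approach to structured prediction: embed each structured output, fit a least-squares map from inputs to embeddings (minimizing the empirical quadratic risk), and decode by nearest embedding. This is a generic sketch under assumed toy data and embeddings, not the paper's algorithm, its output-kernel condition, or its risk bounds.

```python
# Toy regression-and-decode sketch for structured prediction.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: 3 candidate structured outputs, each with a 4-dim embedding phi(y).
phi = rng.normal(size=(3, 4))

# Training data: the true output index is the argmax of the first three input coordinates.
X = rng.normal(size=(200, 5))
true_idx = np.argmax(X[:, :3], axis=1)
targets = phi[true_idx]                      # regression targets phi(y_i)

# Least-squares regressor W minimizing the empirical quadratic risk ||X W - phi(y)||^2.
W, *_ = np.linalg.lstsq(X, targets, rcond=None)

# Decoding: pick the candidate whose embedding is nearest to the predicted vector.
pred_emb = X @ W
pred_idx = np.argmin(((pred_emb[:, None, :] - phi[None, :, :]) ** 2).sum(-1), axis=1)
print("training accuracy:", np.mean(pred_idx == true_idx))
```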




Publication year: 2011