Learning with side information: PAC learning bounds

Authors
Abstract

Similar resources

PAC-Bayesian learning bounds

with respect to θ ∈ Θ. As we assume that N is potentially very large, we will draw at random an independent, identically distributed sample (W1, ..., Wn) according to the uniform distribution on {w1, ..., wN}, where the size n of the statistical sample corresponds to the amount of computation we are ready to make. This sample will be used to choose the parameters. Although in this sce...
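
The subsampling step described in this abstract can be illustrated concretely. Below is a minimal sketch, assuming a generic loss function and a finite list of candidate parameters; the names (pool, candidate_thetas, loss) are hypothetical and not taken from the paper.

import random

def choose_theta(pool, candidate_thetas, loss, n, seed=0):
    # Draw an i.i.d. sample (W1, ..., Wn) from the uniform distribution on
    # the pool {w1, ..., wN}; sampling with replacement keeps the draws
    # independent even when n is close to N.
    rng = random.Random(seed)
    subsample = [rng.choice(pool) for _ in range(n)]

    # Pick the parameter theta with the smallest empirical risk on the
    # subsample; n controls how much computation we are willing to spend.
    def empirical_risk(theta):
        return sum(loss(theta, w) for w in subsample) / n

    return min(candidate_thetas, key=empirical_risk)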


Analysis of Complexity Bounds for PAC-Learning with Random Sets

Learnability in Valiant’s PAC-learning formalism is reformulated in terms of expected (average) error instead of confidence and error parameters. A finite-domain, random set formalism is introduced to develop algorithm-dependent, distribution-specific analytic error estimates. Two random set theorems for finite concept spaces are presented to facilitate these developments. Analyses are carried o...
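
The expected-error viewpoint can be made concrete with a small simulation. The sketch below, assuming a toy finite domain, a toy concept space, and a consistent-hypothesis learner (none of which come from the paper), estimates the learner's average generalization error over random training samples instead of reporting confidence and error parameters.

import itertools
import random

DOMAIN = list(range(8))                              # toy finite domain
CONCEPTS = [set(s) for r in range(4)                 # toy finite concept space
            for s in itertools.combinations(DOMAIN, r)]
TARGET = {1, 3}                                      # toy target concept

def learn(sample):
    # Return any concept consistent with the labelled sample.
    for c in CONCEPTS:
        if all((x in c) == y for x, y in sample):
            return c
    return set()

def expected_error(m, trials=2000, seed=0):
    # Average the generalization error of the learned concept over many
    # random training samples of size m drawn uniformly from the domain.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sample = [(x, x in TARGET) for x in (rng.choice(DOMAIN) for _ in range(m))]
        h = learn(sample)
        total += sum((x in h) != (x in TARGET) for x in DOMAIN) / len(DOMAIN)
    return total / trials

print(expected_error(m=5))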


Learning Preferences with Side Information

Product and content personalization is now ubiquitous in e-commerce. There is typically too little transactional data available for this task. As such, companies today seek to use a variety of information on the interactions between a product and a customer to drive personalization decisions. We formalize this problem as one of recovering a large-scale matrix, with side information in the form ...
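
One generic way to use side information for large-scale matrix recovery is inductive matrix completion, where each observed entry M[i, j] is modelled as x_i^T W y_j, with known customer features x_i and product features y_j and a small core matrix W fit to the observed entries. The sketch below only illustrates that idea under a ridge-regularised least-squares formulation; it is not necessarily the estimator studied in the paper, and all names are hypothetical.

import numpy as np

def fit_core(X, Y, observed, lam=1e-2):
    # X: (num_customers, p) side-information features for customers
    # Y: (num_products, q) side-information features for products
    # observed: list of (i, j, value) transactional observations
    p, q = X.shape[1], Y.shape[1]
    A = np.zeros((len(observed), p * q))
    b = np.zeros(len(observed))
    for k, (i, j, v) in enumerate(observed):
        A[k] = np.outer(X[i], Y[j]).ravel()   # vec(x_i y_j^T)
        b[k] = v
    # Ridge-regularised least squares for vec(W).
    w = np.linalg.solve(A.T @ A + lam * np.eye(p * q), A.T @ b)
    return w.reshape(p, q)

def predict(X, Y, W, i, j):
    # Predicted affinity of customer i for product j.
    return X[i] @ W @ Y[j]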


General Bounds on Statistical Query Learning and PAC Learning with Noise via Hypothesis Bounding

We derive general bounds on the complexity of learning in the Statistical Query model and in the PAC model with classification noise. We do so by considering the problem of boosting the accuracy of weak learning algorithms which fall within the Statistical Query model. This new model was introduced by Kearns [12] to provide a general framework for efficient PAC learning in the presence of class...
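
A statistical query asks for an estimate of P_χ = E[χ(x, c(x))] up to some tolerance. With classification noise at a known rate η, the noisy-example average satisfies E[χ(x, y)] = (1 - 2η) P_χ + η E[χ(x, 0) + χ(x, 1)], so P_χ can be recovered by the correction below. This is a minimal sketch of the standard noise-corrected simulation of a statistical query, with illustrative names; the paper's specific bounds are not reproduced here.

def answer_sq(chi, noisy_examples, eta):
    # noisy_examples: iterable of (x, y) pairs where y equals c(x) flipped
    # independently with probability eta (0 <= eta < 1/2).
    n = 0
    noisy_mean = 0.0
    s_mean = 0.0
    for x, y in noisy_examples:
        n += 1
        noisy_mean += chi(x, y)
        s_mean += chi(x, 0) + chi(x, 1)   # depends on x only, noise-free
    noisy_mean /= n
    s_mean /= n
    # Invert E[chi(x, y)] = (1 - 2*eta) * P_chi + eta * s_mean.
    return (noisy_mean - eta * s_mean) / (1.0 - 2.0 * eta)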


Learning with Side Information: Part I

We explore a new learning setting, in which each randomly generated sample gives rise to an additional deterministic sample, called side information, that is also classified by the oracle. Hence a learning algorithm utilizing side information chooses from a smaller and more accurate set of concepts, and is expected to operate more efficiently. In general, side information learning utilizes depe...
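
A minimal way to picture this setting: each time the learner draws a random example x, it also receives the oracle's label on a deterministic companion point g(x), and keeps only the concepts consistent with both labels, so the surviving set of concepts shrinks faster. The sketch below is a toy illustration under assumed threshold concepts and an assumed map g; none of these choices come from the paper.

import random

def learn_with_side_info(concepts, target, g, domain, m, seed=0):
    rng = random.Random(seed)
    version_space = list(concepts)
    for _ in range(m):
        x = rng.choice(domain)          # randomly generated sample
        x_side = g(x)                   # deterministic side-information sample
        # Keep only concepts consistent with the oracle on both points.
        version_space = [c for c in version_space
                         if c(x) == target(x) and c(x_side) == target(x_side)]
    return version_space                # smaller, more accurate concept set

# Toy usage: threshold concepts on {0, ..., 99} with side information g(x) = 99 - x.
domain = list(range(100))
concepts = [lambda x, t=t: x >= t for t in range(101)]
target = lambda x: x >= 37
print(len(learn_with_side_info(concepts, target, lambda x: 99 - x, domain, m=10)))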



Journal

Journal title: Journal of Computer and System Sciences

Year: 2004

ISSN: 0022-0000

DOI: 10.1016/j.jcss.2003.07.005