Learning faster than promised by the Vapnik-Chervonenkis dimension

Authors

  • Anselm Blumer
  • Nick Littlestone
Abstract

We investigate the sample size needed to infer a separating line between two convex planar regions using Valiant's model of the complexity of learning from random examples [4]. A theorem proved in [1] using the Vapnik-Chervonenkis dimension gives an O((1/ε)ln(1/ε)) upper bound on the sample size sufficient to infer a separating line with error less than ε between two convex planar regions. This theorem requires that with high probability any separating line consistent with such a sample have small error. The present paper gives a lower bound showing that under this requirement the sample size cannot be improved. It is further shown that if this requirement is weakened to require only that a particular line which is tangent to the convex hulls of the sample points in the two regions have small error, then the ln(1/ε) term can be eliminated from the upper bound.
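To make the setting concrete, here is a minimal sketch (our own illustration, not the paper's algorithm or its analysis): it draws labeled samples from two separated convex regions, builds the convex hull of each labeled sample, and outputs a separating line. For simplicity the line is taken as the perpendicular bisector of the closest pair of hull vertices rather than the particular tangent line the abstract refers to; all function names and the toy data are hypothetical.

```python
# Sketch of the two-convex-region separating-line setting (illustrative only).
import random

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def separating_line(pos, neg):
    """Perpendicular bisector of the closest pair of hull vertices,
    returned as (a, b, c) with the line a*x + b*y = c."""
    hp, hn = convex_hull(pos), convex_hull(neg)
    p, q = min(((p, q) for p in hp for q in hn),
               key=lambda pq: (pq[0][0]-pq[1][0])**2 + (pq[0][1]-pq[1][1])**2)
    a, b = p[0] - q[0], p[1] - q[1]        # normal points from q toward p
    mx, my = (p[0]+q[0])/2, (p[1]+q[1])/2  # midpoint lies on the line
    return a, b, a*mx + b*my

# Toy data: positive samples near (2.5, 0.5), negative samples near (-2.5, 0.5).
random.seed(0)
pos = [(2 + random.random(), random.random()) for _ in range(30)]
neg = [(-2 - random.random(), random.random()) for _ in range(30)]
a, b, c = separating_line(pos, neg)
assert all(a*x + b*y > c for (x, y) in pos)
assert all(a*x + b*y < c for (x, y) in neg)
```

The distinction the abstract draws is between requiring that every line consistent with the sample have error below ε, which forces the Ω((1/ε)ln(1/ε)) sample size of the lower bound, and requiring it only of one specific tangent line, for which O(1/ε) samples suffice.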


Similar Articles

Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions

The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension, obtained by discretizing the range of a real function class. We then point out that Sauer's Lemma is valid for the discretized VC dimension, and group the real function classes having infinite VC dimension into four categories by using the dis...


Quantifying Generalization in Linearly Weighted Neural Networks

The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The "probably approximately correct" learning framework is described and the importance of the Vapnik-Chervonenkis dimension is illustrated. We then investigate the Vapnik-Chervonenkis dimension of certain types of linearly weighted neural ne...


Notes on Classes with Vapnik-Chervonenkis Dimension 1

The Vapnik-Chervonenkis dimension is a combinatorial parameter that reflects the "complexity" of a set of sets (a.k.a. a concept class). It was introduced by Vapnik and Chervonenkis in their seminal paper [1] and has since found many applications, most notably in machine learning theory and in computational geometry. Arguably the most influential consequence of the VC analysis is the funda...
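As a concrete illustration of the definition (our own sketch, with hypothetical names, not taken from the cited note): the snippet below brute-forces shattering and the VC dimension of a finite concept class, using initial segments on a line, the classic example of a class with VC dimension 1.

```python
# Brute-force shattering check and VC dimension for a finite concept class.
from itertools import combinations

def shatters(concepts, subset):
    """True iff all 2^|subset| labelings of subset are realized by concepts."""
    patterns = {frozenset(c & set(subset)) for c in concepts}
    return len(patterns) == 2 ** len(subset)

def vc_dimension(concepts, domain):
    """Largest size of a subset of domain shattered by concepts."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(concepts, s) for s in combinations(domain, k)):
            d = k
        else:
            break  # if no k-set is shattered, no larger set can be either
    return d

# Initial segments {x : x < t} over {0,...,9}: VC dimension 1, since any
# single point can be labeled both ways, but for a pair x < y no segment
# contains y without x.
domain = list(range(10))
initial_segments = [set(range(t)) for t in range(11)]
print(vc_dimension(initial_segments, domain))  # prints 1
```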


VC Dimension of Neural Networks

This paper presents a brief introduction to Vapnik-Chervonenkis (VC) dimension, a quantity which characterizes the difficulty of distribution-independent learning. The paper establishes various elementary results, and discusses how to estimate the VC dimension in several examples of interest in neural network theory.


The Vapnik-Chervonenkis Dimension: Information versus Complexity in Learning

These questions contrast the roles of information and complexity in learning. While the two roles share some ground, they are conceptually and technically different. In the common language of learning, the information question is that of generalization and the complexity question is that of scaling. The work of Vapnik and Chervonenkis (1971) provides the key tools for dealing with the informati...



Journal:
  • Discrete Applied Mathematics

Volume 24, Issue -

Pages -

Published 1989