On the VC-dimension and Boolean functions with long runs

Author

  • Joel Ratsaby
Abstract

The Vapnik-Chervonenkis (VC) dimension and the Sauer-Shelah lemma have found applications in numerous areas including set theory, combinatorial geometry, graph theory and statistical learning theory. Estimation of the complexity of discrete structures associated with the search space of algorithms often amounts to estimating the cardinality of a simpler class which is effectively induced by some restrictive property of the search. In this paper we study the complexity of Boolean-function classes of finite VC-dimension which satisfy a local ‘smoothness’ property expressed as having long runs of repeated values. As in Sauer’s lemma, a bound is obtained on the cardinality of such classes.
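The Sauer-Shelah lemma referenced above bounds the cardinality of any class of VC-dimension d restricted to n points by the polynomial Φ_d(n) = Σ_{i=0}^{d} C(n, i). A minimal sketch of that classical bound (illustrative only; the paper's refined bound for classes with long runs of repeated values is not reproduced here):

```python
from math import comb

def sauer_shelah_bound(n: int, d: int) -> int:
    """Upper bound on |F| for a class F of VC-dimension d restricted
    to n points: Phi_d(n) = sum_{i=0}^{d} C(n, i) (Sauer-Shelah lemma)."""
    return sum(comb(n, i) for i in range(d + 1))

# For d = 2 on n = 10 points, any class of VC-dimension 2 induces
# at most C(10,0) + C(10,1) + C(10,2) = 1 + 10 + 45 = 56 labelings.
print(sauer_shelah_bound(10, 2))  # 56
```

Note that for d ≥ n the bound collapses to 2^n, i.e. the class may realize every labeling of the n points.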


Similar articles

Boolean Functions: Cryptography and Applications

Abstract. The Vapnik-Chervonenkis (VC) dimension and the Sauer-Shelah lemma have found applications in numerous areas including set theory, combinatorial geometry, graph theory and statistical learning theory. Estimation of the complexity of discrete structures associated with the search space of algorithms often amounts to estimating the cardinality of a simpler class which is effectively indu...


Complexity of VC-classes of sequences with long repetitive runs

The Vapnik-Chervonenkis (VC) dimension (also known as the trace number) and the Sauer-Shelah lemma have found applications in numerous areas including set theory, combinatorial geometry, graph theory and statistical learning theory. Estimation of the complexity of discrete structures associated with the search space of algorithms often amounts to estimating the cardinality of a simpler class wh...


Getting More Randomness

Theorem 1.1 (Number of Samples for a Class with Bounded VC-Dimension). Suppose (X, C) has VC-dimension at most d, and suppose S is a subset obtained by sampling from X independently m times. If m ≥ max{ (4/ε) log(2/δ), (8d/ε) log(8d/ε) }, then with probability at least 1 − δ, S is an ε-net. Using VC-dimension, we can bound the number of effective Boolean functions on the random subset S. However, after conditio...
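Reading the statement as the standard ε-net sample-complexity bound (the ε symbols appear to have been lost in extraction), the required sample size m can be computed directly. A sketch under that reconstruction:

```python
from math import log, ceil

def epsilon_net_samples(d: int, eps: float, delta: float) -> int:
    """Sample size m sufficient for S to be an eps-net with probability
    >= 1 - delta, for a range space of VC-dimension at most d:
        m >= max((4/eps) * log(2/delta), (8d/eps) * log(8d/eps))."""
    m1 = (4 / eps) * log(2 / delta)
    m2 = (8 * d / eps) * log(8 * d / eps)
    return ceil(max(m1, m2))

# e.g. VC-dimension 3, eps = 0.1, delta = 0.05
print(epsilon_net_samples(3, 0.1, 0.05))
```

As the formula suggests, the bound grows roughly like (d/ε) log(d/ε) and only logarithmically in 1/δ.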


Bounds for the Computational

It is shown that feedforward neural nets of constant depth with piecewise polynomial activation functions and arbitrary real weights can be simulated, for Boolean inputs and outputs, by neural nets of somewhat larger size and depth with Heaviside gates and weights from {0, 1}. This provides the first known upper bound for the computational power and VC-dimension of such neural nets. It is also sh...


Classification by Polynomial Surfaces

Linear threshold functions (for real and Boolean inputs) have received much attention, for they are the component parts of many artificial neural networks. Linear threshold functions are exactly those functions for which the positive and negative examples are separated by a hyperplane. One extension of this notion is to allow separators to be surfaces whose equations are polynomials of at most a...
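The extension described above, separating examples with a polynomial surface rather than a hyperplane, can be viewed as an ordinary linear threshold function applied after a polynomial feature map. A minimal sketch (the degree-2 map and the circle separator are illustrative, not from the cited paper):

```python
def poly_features(x1: float, x2: float) -> list:
    """Degree-2 monomial map: a polynomial surface in (x1, x2)
    becomes a hyperplane in this 6-dimensional feature space."""
    return [1.0, x1, x2, x1 * x1, x1 * x2, x2 * x2]

def threshold_classify(weights: list, x1: float, x2: float) -> int:
    """Linear threshold function over the polynomial features:
    output 1 iff the weighted sum is nonnegative."""
    s = sum(w * f for w, f in zip(weights, poly_features(x1, x2)))
    return 1 if s >= 0 else 0

# The unit circle x1^2 + x2^2 = 1 as a separator: the weights
# encode the polynomial 1 - x1^2 - x2^2 >= 0 (inside/on -> 1).
w = [1.0, 0.0, 0.0, -1.0, 0.0, -1.0]
print(threshold_classify(w, 0.5, 0.5))  # inside the circle -> 1
print(threshold_classify(w, 2.0, 0.0))  # outside the circle -> 0
```

The same device lifts any degree-k polynomial separator to a hyperplane in the space of monomials of degree at most k.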



Journal:

Volume   Issue 

Pages  -

Publication date: 2006