A PAC-Bayesian Approach to Spectrally-Normalized Margin Bounds for Neural Networks

Authors

  • Behnam Neyshabur
  • Srinadh Bhojanapalli
  • David McAllester
  • Nathan Srebro
Abstract

We present a generalization bound for feedforward neural networks in terms of the product of the spectral norms of the layers and the Frobenius norm of the weights. The generalization bound is derived using a PAC-Bayes analysis.
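The capacity term that appears in such spectrally-normalized bounds can be illustrated with a short sketch. The helper below is hypothetical (not the authors' code): it computes the product of squared spectral norms times the sum of squared Frobenius-to-spectral norm ratios over the layers, ignoring the margin, depth, and sample-size factors that complete the actual bound.

```python
import numpy as np

def spectral_capacity(weights):
    """Capacity-style term from spectrally-normalized margin bounds:
    prod_i ||W_i||_2^2  *  sum_i (||W_i||_F / ||W_i||_2)^2,
    up to constants and margin/sample-size factors (illustrative only)."""
    spec = [np.linalg.norm(W, ord=2) for W in weights]      # largest singular value
    frob = [np.linalg.norm(W, ord='fro') for W in weights]  # Frobenius norm
    prod_spec_sq = np.prod([s ** 2 for s in spec])
    ratio_sum = sum((f / s) ** 2 for f, s in zip(frob, spec))
    return prod_spec_sq * ratio_sum

# Toy example: three random 20x20 layers, scaled to keep spectral norms moderate.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((20, 20)) / np.sqrt(20) for _ in range(3)]
print(spectral_capacity(weights))
```

Note that for identity layers the spectral norm is 1 and the squared Frobenius norm equals the width, so the term reduces to depth times width; this makes the measure easy to sanity-check.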


Similar resources

PAC-Bayesian Margin Bounds for Convolutional Neural Networks - Technical Report

Recently the generalisation error of deep neural networks has been analysed through the PAC-Bayesian framework, for the case of fully connected layers. We adapt this approach to the convolutional setting.


Spectrally-normalized Margin Bounds for Neural Networks

We present a generalization bound for feedforward neural networks with ReLU activations in terms of the product of the spectral norm of the layers and the Frobenius norm of the weights. The key ingredient is a bound on the changes in the output of a network with respect to perturbation of its weights, thereby bounding the sharpness of the network. We combine this perturbation bound with the PAC...


On PAC-Bayesian Margin Bounds

Over the past few years, progress has been made in obtaining dimension independent margin bounds. In this note, we revisit the PAC-Bayesian margin bounds proposed by Langford and Shawe-Taylor [4] and later refined by McAllester [6]. In addition to simplifying some of the existing arguments, we use a tighter tail bound on the normal distribution to give an explicit margin bound that is a mild va...


Simplified PAC-Bayesian Margin Bounds

The theoretical understanding of support vector machines is largely based on margin bounds for linear classifiers with unit-norm weight vectors and unit-norm feature vectors. Unit-norm margin bounds have been proved previously using fat-shattering arguments and Rademacher complexity. Recently Langford and Shawe-Taylor proved a dimension-independent unit-norm margin bound using a relatively simpl...


Generalisation Error Bounds for Sparse Linear Classifiers

We provide small sample size bounds on the generalisation error of linear classifiers that are sparse in their dual representation given by the expansion coefficients of the weight vector in terms of the training data. These results theoretically justify algorithms like the Support Vector Machine, the Relevance Vector Machine and K-nearest-neighbour. The bounds are a-posteriori bounds to be evalua...




Journal:
  • CoRR

Volume abs/1707.09564  Issue 

Pages  -

Publication date 2017