Spectrally-normalized Margin Bounds for Neural Networks

Authors

  • Behnam Neyshabur
  • Srinadh Bhojanapalli
  • Nathan Srebro
Abstract

We present a generalization bound for feedforward neural networks with ReLU activations in terms of the product of the spectral norm of the layers and the Frobenius norm of the weights. The key ingredient is a bound on the changes in the output of a network with respect to perturbation of its weights, thereby bounding the sharpness of the network. We combine this perturbation bound with the PAC-Bayes analysis to derive the generalization bound.
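As a rough illustration of the quantity this bound depends on, the sketch below computes, for a list of layer weight matrices, the product of squared spectral norms together with the sum of squared Frobenius-to-spectral ratios that combine to form the capacity term in the published version of this analysis. The function name and packaging are illustrative assumptions, not the paper's own code.

```python
import numpy as np

def spectral_complexity(weights):
    """Capacity term sketched from the abstract above: the squared product
    of layer spectral norms times the sum of squared Frobenius-to-spectral
    ratios.  The name and packaging are illustrative, not the paper's code."""
    spec = [np.linalg.norm(W, ord=2) for W in weights]      # largest singular values
    frob = [np.linalg.norm(W, ord='fro') for W in weights]  # Frobenius norms
    prod_spec_sq = np.prod([s ** 2 for s in spec])
    ratio_sum = sum((f / s) ** 2 for f, s in zip(frob, spec))
    return prod_spec_sq * ratio_sum

# Toy usage on the weight matrices of a hypothetical 3-layer ReLU network.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((64, 32)),
          rng.standard_normal((32, 64)),
          rng.standard_normal((10, 32))]
print(spectral_complexity(layers))
```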

Similar articles

A PAC-Bayesian Approach to Spectrally-Normalized Margin Bounds for Neural Networks

We present a generalization bound for feedforward neural networks in terms of the product of the spectral norms of the layers and the Frobenius norm of the weights. The generalization bound is derived using a PAC-Bayes analysis.

Spectrally-normalized margin bounds for neural networks

This paper presents a margin-based multiclass generalization bound for neural networks that scales with their margin-normalized spectral complexity: their Lipschitz constant, meaning the product of the spectral norms of the weight matrices, times a certain correction factor. This bound is empirically investigated for a standard AlexNet network trained with SGD on the MNIST and CIFAR-10 datasets...
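To make "Lipschitz constant times a certain correction factor" concrete, here is a minimal sketch under two stated assumptions: the reference matrices in the correction factor are taken to be zero, and the (2,1) group norm is applied to the transposed weight matrices (row-wise Euclidean norms). The function name is hypothetical; this is an illustrative reading of the complexity measure, not the authors' code.

```python
import numpy as np

def margin_normalized_spectral_complexity(weights):
    """Sketch of 'Lipschitz constant times correction factor': the product
    of spectral norms times a (2,1)-group-norm correction, with reference
    matrices taken as zero for simplicity.  Illustrative only."""
    spec = [np.linalg.norm(W, ord=2) for W in weights]
    # (2,1) norm of W^T: sum of Euclidean norms of the rows of W.
    two_one = [np.linalg.norm(W, axis=1).sum() for W in weights]
    lipschitz = np.prod(spec)  # product of layer spectral norms
    correction = sum((t / s) ** (2 / 3) for t, s in zip(two_one, spec)) ** 1.5
    return lipschitz * correction
```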

Large Margin Deep Neural Networks: Theory and Algorithms

Deep neural networks (DNNs) have achieved huge practical success in recent years. However, their theoretical properties (in particular, generalization ability) are not yet well understood, since existing error bounds for neural networks cannot be directly used to explain the statistical behavior of practically adopted DNN models (which are multi-class in nature and may contain convolutional layers...

PAC-Bayesian Margin Bounds for Convolutional Neural Networks - Technical Report

Recently the generalisation error of deep neural networks has been analysed through the PAC-Bayesian framework, for the case of fully connected layers. We adapt this approach to the convolutional setting.

Empirical Margin Distributions and Bounding the Generalization Error of Combined Classifiers

We prove new probabilistic upper bounds on the generalization error of complex classifiers that are combinations of simple classifiers. Such combinations could be implemented by neural networks or by voting methods for combining classifiers, such as boosting and bagging. The bounds are in terms of the empirical distribution of the margin of the combined classifier. They are based on the methods ...

Publication date: 2018