Statistical guarantees for regularized neural networks
Abstract
Neural networks have become standard tools in the analysis of data, but they lack comprehensive mathematical theories. For example, there are very few statistical guarantees for learning neural networks, especially for classes of estimators that are used in practice or at least similar to such. In this paper, we develop a general statistical guarantee for estimators that consist of a least-squares term and a regularizer. We then exemplify this guarantee with ℓ1-regularization, showing that the corresponding prediction error increases at most logarithmically in the total number of parameters and can even decrease in the number of layers. Our results establish a basis for regularized estimation of neural networks, and they deepen our understanding of deep learning more generally.
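The abstract describes estimators built from a least-squares term plus a regularizer, exemplified with ℓ1-regularization. As a minimal illustrative sketch (not the paper's actual estimator or analysis), the following numpy code fits a one-hidden-layer ReLU network by proximal gradient descent: a gradient step on the least-squares term followed by soft-thresholding for the ℓ1 penalty. All function names, hyperparameters, and the synthetic data are assumptions made for the example.

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of the l1 norm: shrink each entry toward zero by t."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def fit_l1_network(X, y, width=8, lam=0.02, lr=0.05, steps=2000, seed=0):
    """Minimize (1/n)||y - f(X)||^2 + lam*(||W1||_1 + ||w2||_1) for a
    one-hidden-layer ReLU network f via proximal gradient descent."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, width))
    w2 = rng.normal(scale=0.5, size=width)
    for _ in range(steps):
        H = np.maximum(X @ W1, 0.0)        # hidden ReLU activations
        r = H @ w2 - y                     # residuals of the least-squares term
        g2 = (2.0 / n) * (H.T @ r)         # gradient w.r.t. output weights
        gH = np.outer(r, w2) * (H > 0)     # backprop through the ReLU
        g1 = (2.0 / n) * (X.T @ gH)        # gradient w.r.t. hidden weights
        # gradient step, then proximal (soft-thresholding) step for the l1 term
        W1 = soft_threshold(W1 - lr * g1, lr * lam)
        w2 = soft_threshold(w2 - lr * g2, lr * lam)
    return W1, w2

# tiny synthetic regression problem that the network class can represent
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = np.maximum(X[:, 0], 0.0) - 0.5 * np.maximum(X[:, 1], 0.0)
W1, w2 = fit_l1_network(X, y)
mse = np.mean((np.maximum(X @ W1, 0.0) @ w2 - y) ** 2)
```

The soft-thresholding step is what sets many entries of the weight matrices exactly to zero, which is the mechanism by which the ℓ1 penalty keeps the effective number of parameters small.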
Related Articles
Regularized EM Algorithms: A Unified Framework and Statistical Guarantees
Latent models are a fundamental modeling tool in machine learning applications, but they present significant computational and analytical challenges. The popular EM algorithm and its variants are much-used algorithmic tools, yet our rigorous understanding of their performance is highly incomplete. Recently, work in [1] has demonstrated that for an important class of problems, EM exhibits linear ...
Manifold Regularized Discriminative Neural Networks
Unregularized deep neural networks (DNNs) can easily overfit with a limited sample size. We argue that this is mostly due to the discriminative nature of DNNs, which directly model the conditional probability (or score) of labels given the input. Ignoring the input distribution makes it difficult for DNNs to generalize to unseen data. Recent advances in regularization techniques, such as pretrain...
Manifold regularized deep neural networks
Deep neural networks (DNNs) have been successfully applied to a variety of automatic speech recognition (ASR) tasks, both in discriminative feature extraction and hybrid acoustic modeling scenarios. The development of improved loss functions and regularization approaches has resulted in consistent reductions in ASR word error rates (WERs). This paper presents a manifold learning based regulari...
Resource allocation for statistical QoS guarantees in MIMO cellular networks
This work considers the performance of the downlink channel of MIMO cellular networks serving multiple users with different statistical QoS requirements. The paper proposes resource allocation algorithms that aim to optimize the system performance over the sum of the optimal user utility functions by employing the effective capacity theory. Proportionally fair resource allocation among the user...
BitNet: Bit-Regularized Deep Neural Networks
We present a novel regularization scheme for training deep neural networks. The parameters of neural networks are usually unconstrained and have a dynamic range dispersed over the real line. Our key idea is to control the expressive power of the network by dynamically quantizing the range and set of values that the parameters can take. We formulate this idea using a novel end-to-end approach th...
Journal
Journal title: Neural Networks
Year: 2021
ISSN: 1879-2782, 0893-6080
DOI: https://doi.org/10.1016/j.neunet.2021.04.034