Regularizing Neural Networks via Retaining Confident Connections
Authors
Abstract
Regularization of neural networks can alleviate overfitting in the training phase. Current regularization methods, such as Dropout and DropConnect, randomly drop neural nodes or connections based on a uniform prior. Such a data-independent strategy does not take into consideration the quality of individual units or connections. In this paper, we aim to develop a data-dependent approach to regu...
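The contrast between uniform, data-independent dropping and a data-dependent alternative can be made concrete with a small sketch. The NumPy snippet below is illustrative only and is not the paper's actual method: the per-connection `confidence` array is a hypothetical stand-in for whatever quality measure a data-dependent scheme would compute.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropconnect_uniform(W, p_drop=0.5):
    """Data-independent DropConnect: every weight is kept with the
    same probability, regardless of how useful it is."""
    mask = rng.random(W.shape) >= p_drop
    return W * mask

def dropconnect_confident(W, confidence, p_drop=0.5):
    """Data-dependent variant (illustrative only): connections with
    higher confidence scores are more likely to be retained.
    `confidence` is a hypothetical per-connection score in [0, 1]."""
    keep_prob = (1.0 - p_drop) * confidence / confidence.mean()
    keep_prob = np.clip(keep_prob, 0.0, 1.0)
    mask = rng.random(W.shape) < keep_prob
    return W * mask

W = rng.normal(size=(4, 3))
conf = rng.random(W.shape)   # stand-in for a learned confidence measure
print(dropconnect_uniform(W))
print(dropconnect_confident(W, conf))
```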
Similar resources

Regularizing Neural Networks by Penalizing Confident Output Distributions
We systematically explore regularizing neural networks by penalizing low entropy output distributions. We show that penalizing low entropy output distributions, which has been shown to improve exploration in reinforcement learning, acts as a strong regularizer in supervised learning. Furthermore, we connect a maximum entropy based confidence penalty to label smoothing through the direction of t...
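The penalty described in this abstract amounts to subtracting a scaled entropy term from the cross-entropy loss, so that over-confident (low-entropy) output distributions raise the loss. A minimal NumPy sketch, with a hypothetical penalty weight `beta`:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def confidence_penalty_loss(logits, labels, beta=0.1):
    """Cross-entropy minus beta times the entropy of the predicted
    distribution: low-entropy (over-confident) predictions are
    penalized, acting as a regularizer."""
    p = softmax(logits)
    n = len(labels)
    ce = -np.log(p[np.arange(n), labels] + 1e-12).mean()
    entropy = -(p * np.log(p + 1e-12)).sum(axis=-1).mean()
    return ce - beta * entropy

logits = np.array([[4.0, 0.5, 0.1], [0.2, 2.0, 0.3]])
labels = np.array([0, 1])
print(confidence_penalty_loss(logits, labels))
```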
Regularizing Deep Neural Networks by Noise: Its Interpretation and Optimization
Overfitting is one of the most critical challenges in deep neural networks, and there are various types of regularization methods to improve generalization performance. Injecting noise into hidden units during training, e.g., dropout, is known to be a successful regularizer, but it is still not entirely clear why such training techniques work well in practice and how we can maximize their benefit in ...
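Dropout is the canonical example of the noise injection this abstract discusses: hidden units are multiplied by Bernoulli noise during training. A standard inverted-dropout sketch in NumPy (illustrative, not the paper's optimization procedure):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_noise(h, p_drop=0.5, training=True):
    """Dropout viewed as multiplicative Bernoulli noise on hidden
    units; the 1/(1 - p_drop) scaling keeps the expected activation
    unchanged, so no rescaling is needed at test time."""
    if not training:
        return h
    mask = (rng.random(h.shape) >= p_drop) / (1.0 - p_drop)
    return h * mask

h = rng.normal(size=(2, 5))
print(dropout_noise(h))
```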
MentorNet: Regularizing Very Deep Neural Networks on Corrupted Labels
Recent studies have discovered that deep networks are capable of memorizing the entire training data even when the labels are completely random. Since deep models are trained on big data where labels are often noisy, the ability to overfit noise can lead to poor performance. To overcome overfitting on corrupted training data, we propose a novel technique to regularize deep networks in the data dimen...
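MentorNet learns a data-driven curriculum with a second network; as a much cruder illustration of the same idea of down-weighting likely-corrupted examples, the sketch below zeroes out the weight of examples whose loss exceeds a fixed threshold (a self-paced-learning-style stand-in, with a hypothetical `threshold`, not the MentorNet algorithm itself):

```python
import numpy as np

def selfpaced_weights(per_example_loss, threshold=1.0):
    """Down-weight examples whose loss exceeds a threshold, a crude
    stand-in for a learned curriculum: examples likely to carry
    corrupted labels (very high loss) get weight 0 and do not
    dominate training."""
    return (per_example_loss <= threshold).astype(float)

losses = np.array([0.2, 0.7, 3.5, 0.4])   # hypothetical per-example losses
w = selfpaced_weights(losses)
weighted_loss = (w * losses).sum() / max(w.sum(), 1.0)
print(w, weighted_loss)
```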
Connections between Neural Networks and Boolean Functions
This report surveys some connections between Boolean functions and artificial neural networks. The focus is on cases in which the individual neurons are linear threshold neurons, sigmoid neurons, polynomial threshold neurons, or spiking neurons. We explore the relationships between types of artificial neural network and classes of Boolean function. In particular, we investigate the type of Bool...
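A minimal example of the kind of correspondence this survey studies: a single linear threshold neuron with weights (1, 1) and threshold 2 realizes the Boolean AND function.

```python
import numpy as np

def threshold_neuron(x, w, theta):
    """Linear threshold neuron: outputs 1 iff w . x >= theta."""
    return int(np.dot(w, x) >= theta)

# AND of two Boolean inputs: both weights 1, threshold 2.
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, threshold_neuron(np.array(x), np.array([1.0, 1.0]), 2.0))
```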
Journal
Journal title: Entropy
Year: 2017
ISSN: 1099-4300
DOI: 10.3390/e19070313