Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks
Authors
Abstract
Large multilayer neural networks trained with backpropagation have recently achieved state-of-the-art results in a wide range of problems. However, using backprop for neural net learning still has some disadvantages, e.g., having to tune a large number of hyperparameters to the data, lack of calibrated probabilistic predictions, and a tendency to overfit the training data. In principle, the Bayesian approach to learning neural networks does not have these problems. However, existing Bayesian techniques lack scalability to large datasets and network sizes. In this work we present a novel scalable method for learning Bayesian neural networks, called probabilistic backpropagation (PBP). Similar to classical backpropagation, PBP works by computing a forward propagation of probabilities through the network and then doing a backward computation of gradients. A series of experiments on ten real-world datasets shows that PBP is significantly faster than other techniques, while offering competitive predictive abilities. Our experiments also show that PBP provides accurate estimates of the posterior variance on the network weights.
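To make the forward pass concrete, the following is a minimal sketch of the kind of moment propagation the abstract describes, assuming fully factorized Gaussian weights, independent Gaussian inputs, and a ReLU activation; the function names are illustrative, and details such as PBP's scaling of pre-activations by the fan-in and its handling of biases are omitted.

```python
import numpy as np
from scipy.stats import norm

def forward_layer(m_x, v_x, M_w, V_w):
    """Propagate an input with mean m_x and variance v_x through a fully
    connected layer whose weights have means M_w and variances V_w
    (independent Gaussians), returning pre-activation moments."""
    m_z = M_w @ m_x
    v_z = V_w @ (m_x**2 + v_x) + (M_w**2) @ v_x
    return m_z, v_z

def relu_moments(m, v):
    """Moment-match a = max(0, z), z ~ N(m, v), with a Gaussian using the
    rectified-Gaussian mean and variance."""
    s = np.sqrt(v)
    alpha = m / s
    m_a = m * norm.cdf(alpha) + s * norm.pdf(alpha)
    v_a = (m**2 + v) * norm.cdf(alpha) + m * s * norm.pdf(alpha) - m_a**2
    return m_a, np.maximum(v_a, 1e-12)  # guard against numerical negatives
```

Chaining these two functions layer by layer yields a Gaussian approximation to the network output; PBP then differentiates the log of the resulting marginal likelihood to update the weight means and variances.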
Similar papers
Assumed Density Filtering Methods for Learning Bayesian Neural Networks
Buoyed by the success of deep multilayer neural networks, there is renewed interest in scalable learning of Bayesian neural networks. Here, we study algorithms that utilize recent advances in Bayesian inference to efficiently learn distributions over network weights. In particular, we focus on recently proposed assumed density filtering based methods for learning Bayesian neural networks – Expe...
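The abstract above is truncated, but the assumed density filtering (ADF) idea it refers to is standard: process one likelihood factor at a time and project the resulting posterior back onto a Gaussian by moment matching. As an illustration only, here is the classic one-dimensional update for a probit likelihood (see, e.g., Rasmussen and Williams, eq. 3.58); the function name is hypothetical.

```python
import numpy as np
from scipy.stats import norm

def adf_probit_update(m, v, y):
    """One ADF step: absorb a probit likelihood Phi(y * theta), y in {-1, +1},
    into the Gaussian posterior N(theta; m, v) by moment matching."""
    z = y * m / np.sqrt(1.0 + v)
    ratio = norm.pdf(z) / norm.cdf(z)  # assumes Phi(z) is not vanishingly small
    m_new = m + y * v * ratio / np.sqrt(1.0 + v)
    v_new = v - v**2 * ratio * (z + ratio) / (1.0 + v)
    return m_new, v_new
```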
Learning Structured Weight Uncertainty in Bayesian Neural Networks
Deep neural networks (DNNs) are increasingly popular in modern machine learning. Bayesian learning affords the opportunity to quantify posterior uncertainty on DNN model parameters. Most existing work adopts independent Gaussian priors on the model weights, ignoring possible structural information. In this paper, we consider the matrix variate Gaussian (MVG) distribution to model structured cor...
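For concreteness, the matrix variate Gaussian mentioned above can be sampled with the textbook construction W = M + A Z Bᵀ, where U = A Aᵀ captures row correlations and V = B Bᵀ captures column correlations. A minimal sketch, with an illustrative function name:

```python
import numpy as np

def sample_mvg(M, U, V, rng=None):
    """Draw W ~ MN(M, U, V), i.e. vec(W) ~ N(vec(M), V kron U),
    via W = M + A Z B^T with U = A A^T, V = B B^T, Z i.i.d. N(0, 1)."""
    rng = np.random.default_rng() if rng is None else rng
    A = np.linalg.cholesky(U)
    B = np.linalg.cholesky(V)
    Z = rng.standard_normal(M.shape)
    return M + A @ Z @ B.T
```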
Probabilistic Contaminant Source Identification in Water Distribution Infrastructure Systems
Large water distribution systems can be highly vulnerable to the intrusion of contaminants by various means, including deliberate contamination injections. As contaminants quickly spread through a water distribution network, rapid characterization of the pollution source is critically important for early warning and disaster management. In this paper, a methodology...
A Position Paper on Statistical Inference Techniques Which Integrate Neural Network and Bayesian Network Models
Some statistical methods which have been shown to have direct neural network analogs are surveyed here; we discuss sampling, optimization, and representation methods which make them feasible when applied in conjunction with, or in place of, neural networks. We present the foremost of these, the Gibbs sampler, both in its successful role as a convergence heuristic derived from statistical physic...
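As a small illustration of the Gibbs sampler highlighted above, here is a toy sampler for a zero-mean, unit-variance bivariate Gaussian with correlation rho, where both conditional distributions are available in closed form; the function name is illustrative.

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_samples, rng=None):
    """Gibbs sampling for a zero-mean, unit-variance bivariate Gaussian with
    correlation rho: alternately draw each coordinate from its exact
    conditional, x1 | x2 ~ N(rho * x2, 1 - rho**2), and symmetrically."""
    rng = np.random.default_rng() if rng is None else rng
    x1 = x2 = 0.0
    sd = np.sqrt(1.0 - rho**2)
    samples = np.empty((n_samples, 2))
    for t in range(n_samples):
        x1 = rng.normal(rho * x2, sd)
        x2 = rng.normal(rho * x1, sd)
        samples[t] = (x1, x2)
    return samples
```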
Deep Gaussian Processes for Regression using Approximate Expectation Propagation
Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs) and are formally equivalent to neural networks with multiple, infinitely wide hidden layers. DGPs are nonparametric probabilistic models and as such are arguably more flexible, have a greater capacity to generalise, and provide better calibrated uncertainty estimates than alternative deep mod...
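To illustrate the compositional structure the abstract describes, the sketch below draws one function from a toy two-layer DGP prior by sampling a GP at the inputs and then sampling a second GP at the first layer's outputs (1-D, squared-exponential kernel); all names are illustrative and no approximate inference is performed.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    """Squared-exponential kernel matrix for 1-D inputs."""
    d2 = (a[:, None] - b[None, :])**2
    return np.exp(-0.5 * d2 / lengthscale**2)

def sample_dgp_prior(x, n_layers=2, rng=None, jitter=1e-6):
    """Draw one function from a toy DGP prior: each layer is a GP sample
    evaluated at the previous layer's outputs."""
    rng = np.random.default_rng() if rng is None else rng
    h = x
    for _ in range(n_layers):
        K = rbf_kernel(h, h) + jitter * np.eye(len(h))
        h = np.linalg.cholesky(K) @ rng.standard_normal(len(h))
    return h
```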