On MCMC Sampling in Bayesian MLP Neural Networks
Authors
Abstract
Bayesian MLP neural networks are a flexible tool for complex nonlinear problems. The approach is complicated by the need to evaluate integrals over high-dimensional probability distributions, which are generally approximated with Markov Chain Monte Carlo (MCMC) methods. Several practical issues arise when implementing MCMC. This article discusses the choice of starting values and the number of chains in Bayesian MLP models. We propose a new method for choosing the starting values based on early stopping, and we demonstrate the benefits of using several independent chains.
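The two ideas in the abstract, early-stopping-based starting values and several independent chains, can be sketched as follows. This is a hedged toy illustration, not the paper's implementation: the one-input MLP, the Gaussian likelihood and prior, the random-walk Metropolis sampler, and the short gradient run standing in for proper early stopping are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (invented for the example).
X = np.linspace(-1, 1, 40)[:, None]
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(40)

H = 5            # hidden units
DIM = 3 * H + 1  # w1 (1->H), b1 (H), w2 (H->1), b2 (1)

def predict(theta, X):
    w1, b1, w2, b2 = theta[:H], theta[H:2*H], theta[2*H:3*H], theta[3*H]
    return np.tanh(X * w1 + b1) @ w2 + b2

def log_post(theta, sigma=0.3, tau=1.0):
    # Gaussian likelihood plus a Gaussian prior over all weights.
    resid = y - predict(theta, X)
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * np.sum(theta**2) / tau**2

def early_stopped_start(n_steps=100, lr=1e-3):
    # Crude stand-in for early stopping: a short numerical-gradient ascent
    # on the log posterior, halted well before convergence.
    theta = 0.1 * rng.standard_normal(DIM)
    eye = np.eye(DIM)
    for _ in range(n_steps):
        g = np.array([(log_post(theta + 1e-5 * e) - log_post(theta - 1e-5 * e)) / 2e-5
                      for e in eye])
        theta = theta + lr * g
    return theta

def run_chain(theta0, n_iter=2000, step=0.05):
    # Random-walk Metropolis over the MLP weights.
    theta, lp = theta0.copy(), log_post(theta0)
    samples = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal(DIM)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples)

# Several independent chains, each started from its own early-stopped fit;
# the predictive mean averages over late samples from all chains.
chains = [run_chain(early_stopped_start()) for _ in range(3)]
pred_mean = np.mean([np.mean([predict(s, X) for s in c[-200:]], axis=0)
                     for c in chains], axis=0)
```

Running several chains from distinct early-stopped starts makes it easier to detect when a single chain is stuck in one mode of the weight posterior.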
Similar resources
Bayesian neural networks with correlating residuals
In multivariate regression problems it is usually assumed that the residuals of the outputs are independent of each other. In many applications a more realistic model would allow dependencies between the outputs. In this paper we show how a Bayesian treatment using the Markov Chain Monte Carlo (MCMC) method can allow for a full covariance matrix with Multi Layer Perceptron (MLP) neural networks.
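A hedged sketch of the likelihood ingredient this abstract relies on: a multivariate Gaussian over the vector of output residuals with a full covariance matrix, contrasted with the usual independence assumption. The simulated residuals and covariance values are invented for illustration; the paper's MLP and MCMC machinery are omitted.

```python
import numpy as np

rng = np.random.default_rng(3)
N, K = 300, 2                                # N cases, K outputs
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])   # correlated residuals
L = np.linalg.cholesky(Sigma)
resid = rng.standard_normal((N, K)) @ L.T    # simulated network residuals

def mvn_loglik(R, S):
    # Sum of log N(r | 0, S) over the rows of R.
    K = S.shape[0]
    Sinv = np.linalg.inv(S)
    _, logdet = np.linalg.slogdet(S)
    quad = np.einsum('ni,ij,nj->n', R, Sinv, R)
    return np.sum(-0.5 * (quad + logdet + K * np.log(2 * np.pi)))

full = mvn_loglik(resid, Sigma)
indep = mvn_loglik(resid, np.diag(np.diag(Sigma)))  # independence assumption
# With correlated residuals, the full-covariance likelihood is higher.
```

In the Bayesian treatment, the covariance matrix entering this likelihood would itself be sampled alongside the network weights rather than fixed as here.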
Learning Deep Generative Models with Doubly Stochastic MCMC
We present doubly stochastic gradient MCMC, a simple and generic method for (approximate) Bayesian inference in deep generative models in the collapsed continuous parameter space. At each MCMC sampling step, the algorithm randomly draws a minibatch of data samples to estimate the gradient of the log posterior and further estimates the intractable expectation over latent variables via a Gibbs sample...
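Only the minibatch half of the doubly stochastic scheme is sketched below, using stochastic gradient Langevin dynamics (SGLD) on a Bayesian logistic regression. The second source of stochasticity in the paper, Gibbs sampling of latent variables, is omitted, and the model, data, and step size are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
N, D = 500, 3
X = rng.standard_normal((N, D))
w_true = np.array([1.5, -2.0, 0.5])
y = (rng.random(N) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

def grad_log_post(w, idx):
    # Minibatch estimate: scale the likelihood gradient by N / |batch|,
    # then add the gradient of a unit Gaussian prior.
    Xb, yb = X[idx], y[idx]
    p = 1 / (1 + np.exp(-Xb @ w))
    return Xb.T @ (yb - p) * (N / len(idx)) - w

w = np.zeros(D)
eps, batch = 1e-3, 32
samples = []
for t in range(3000):
    idx = rng.choice(N, batch, replace=False)
    noise = np.sqrt(eps) * rng.standard_normal(D)
    w = w + 0.5 * eps * grad_log_post(w, idx) + noise  # SGLD update
    samples.append(w.copy())
post_mean = np.mean(samples[1000:], axis=0)
```

The injected Gaussian noise is what turns a stochastic gradient ascent into an (approximate) posterior sampler; without it the iterates would simply converge to a mode.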
Novel Application of Multi-Layer Perceptrons (MLP) neural networks to model HIV in South Africa using seroprevalence data from
This paper presents an application of Multi-layer Perceptron (MLP) neural networks to model the demographic characteristics of antenatal clinic attendees in South Africa. The method of cross-validation is used to examine the between-sample variation of neural networks for HIV prediction. MLP neural networks for classifying both HIV-negative and HIV-positive clinic attendees are developed and ev...
Bayesian Input Variable Selection Using Posterior Probabilities and Expected Utilities
We consider input variable selection in complex Bayesian hierarchical models. Our goal is to find a model with the smallest number of input variables that has, statistically or practically, at least the same expected utility as the full model with all the available inputs. A good estimate of the expected utility can be computed using cross-validation predictive densities. In the case of input ...
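A hedged illustration of comparing input subsets by cross-validated log predictive density. Plain ridge regression with a fixed-noise Gaussian predictive stands in for the paper's full Bayesian hierarchical model; the data, fold count, and hyperparameters are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200
X = rng.standard_normal((N, 3))  # the third input carries no signal
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.3 * rng.standard_normal(N)

def cv_log_pred_density(cols, k=5, lam=1.0, sigma=0.3):
    # Mean held-out log predictive density over k folds, fitting a
    # ridge regression on the remaining folds each time.
    Xs = X[:, cols]
    folds = np.array_split(rng.permutation(N), k)
    total = 0.0
    for f in folds:
        tr = np.setdiff1d(np.arange(N), f)
        A = Xs[tr].T @ Xs[tr] + lam * np.eye(len(cols))
        w = np.linalg.solve(A, Xs[tr].T @ y[tr])
        resid = y[f] - Xs[f] @ w
        total += np.sum(-0.5 * (resid / sigma) ** 2
                        - np.log(sigma * np.sqrt(2 * np.pi)))
    return total / N

full = cv_log_pred_density([0, 1, 2])
reduced = cv_log_pred_density([0, 1])
# The reduced model should score close to the full model, since the
# dropped input is irrelevant -- the selection criterion in a nutshell.
```

The selection rule sketched here keeps removing inputs as long as the cross-validated predictive score stays within noise of the full model's score.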
Training the MLP neural network for image compression using the GSA method
Image compression is one of the important research fields in image processing. Up to now, different methods have been presented for image compression. The neural network is one of these methods and has shown good performance in many applications. The usual method for training neural networks is error back-propagation, whose drawbacks are slow convergence and stopping at points of lo...