Introduction to Markov Chain Monte Carlo
Abstract
Despite a few notable uses of simulation of random processes in the pre-computer era (Hammersley and Handscomb, 1964, Section 1.2; Stigler, 2002, Chapter 7), practical widespread use of simulation had to await the invention of computers. Almost as soon as computers were invented, they were used for simulation (Hammersley and Handscomb, 1964, Section 1.2). The name “Monte Carlo” started as cuteness—gambling was then (around 1950) illegal in most places, and the casino at Monte Carlo was the most famous in the world—but it soon became a colorless technical term for simulation of random processes.

Markov chain Monte Carlo (MCMC) was invented soon after ordinary Monte Carlo at Los Alamos, one of the few places where computers were available at the time. Metropolis et al. (1953) simulated a liquid in equilibrium with its gas phase. The obvious way to find out about the thermodynamic equilibrium is to simulate the dynamics of the system, and let it run until it reaches equilibrium. The tour de force was their realization that they did not need to simulate the exact dynamics; they only needed to simulate some Markov chain having the same equilibrium distribution. Simulations following the scheme of Metropolis et al. (1953) are said to use the Metropolis algorithm.

As computers became more widely available, the Metropolis algorithm was widely used by chemists and physicists, but it did not become widely known among statisticians until after 1990. Hastings (1970) generalized the Metropolis algorithm, and simulations following his scheme are said to use the Metropolis–Hastings algorithm.

A special case of the Metropolis–Hastings algorithm was introduced by Geman and Geman (1984), apparently without knowledge of earlier work. Simulations following their scheme are said to use the Gibbs sampler. Much of Geman and Geman (1984) discusses optimization to find the posterior mode rather than simulation, and it took some time for it to be understood in the spatial statistics community that the Gibbs sampler simulated the posterior distribution, thus enabling full Bayesian inference of all kinds. A methodology that was later seen to be very similar to the Gibbs sampler was introduced by Tanner and Wong (1987), again apparently without knowledge of earlier work. To this day, some refer to the Gibbs sampler as “data augmentation” following these authors.

Gelfand and Smith (1990) made the wider Bayesian community aware of the Gibbs sampler, which up to that time had been known only in the spatial statistics community. Then it took off; as of this writing, a search for Gelfand and Smith (1990) on Google Scholar yields 4003 links to other works. It was rapidly realized that most Bayesian inference could
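To make the key idea concrete—simulating some Markov chain that has the desired equilibrium distribution—here is a minimal sketch of a random-walk Metropolis update in Python. The function names, the Gaussian proposal, and the standard-normal example target are illustrative assumptions, not details taken from Metropolis et al. (1953): a proposal is accepted with probability min(1, π(y)/π(x)), so the chain's equilibrium distribution is the target π, which only needs to be known up to a normalizing constant.

```python
import math
import random

def metropolis(log_target, x0, n_steps, step=1.0):
    """Random-walk Metropolis: simulate a Markov chain whose equilibrium
    distribution is proportional to exp(log_target(x))."""
    x = x0
    chain = []
    for _ in range(n_steps):
        proposal = x + random.gauss(0.0, step)            # symmetric proposal
        log_ratio = log_target(proposal) - log_target(x)
        accept_prob = math.exp(min(0.0, log_ratio))       # min(1, target(y)/target(x))
        if random.random() < accept_prob:
            x = proposal                                  # otherwise keep the current state
        chain.append(x)
    return chain

# Illustrative target: a standard normal, known only up to its normalizing constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=10_000)
print(sum(draws) / len(draws))  # chain average; should be near the target mean, 0
```

Because only the ratio of target values enters the acceptance probability, any unknown normalizing constant (for example, the marginal likelihood in a Bayesian posterior) cancels, which is much of what makes the method useful.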
Similar resources
Markov Chain Monte Carlo
This paper gives a brief introduction to Markov chain Monte Carlo methods, which offer a general framework for calculating difficult integrals. We start with the basic theory of Markov chains and build up to a theorem that characterizes convergent chains. We then discuss the Metropolis–Hastings algorithm (a sketch of such an integral calculation appears after this list).
A Stochastic algorithm to solve multiple dimensional Fredholm integral equations of the second kind
In the present work, a new stochastic algorithm is proposed to solve multiple dimensional Fredholm integral equations of the second kind. The solution of the integral equation is described by the Neumann series expansion. Each term of this expansion can be considered as an expectation which is approximated by a continuous Markov chain Monte Carlo method. An algorithm is proposed to sim...
An Introduction to Bayesian Techniques for Sensor Networks
The purpose of this paper is threefold. First, it briefly introduces basic Bayesian techniques with emphasis on present applications in sensor networks. Second, it reviews modern Bayesian simulation methods, thereby providing an introduction to the main building blocks of the advanced Markov chain Monte Carlo and Sequential Monte Carlo methods. Lastly, it discusses new interesting research hori...
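As a companion to the first item above, the following hedged sketch illustrates how a “difficult integral” can be approximated with the Metropolis–Hastings algorithm: a non-symmetric proposal density q(y | x) is corrected by the Hastings ratio q(x | y)/q(y | x), and the integral (here an expectation under the target) is estimated by averaging over the chain. The exponential target, the lognormal random-walk proposal, and all names are illustrative assumptions, not taken from any of the papers listed above.

```python
import math
import random

def metropolis_hastings(log_target, propose, log_q, x0, n_steps):
    """Metropolis-Hastings: proposals from an asymmetric density q(y | x)
    are corrected by the Hastings ratio q(x | y) / q(y | x)."""
    x = x0
    chain = []
    for _ in range(n_steps):
        y = propose(x)
        log_alpha = (log_target(y) - log_target(x)
                     + log_q(x, y) - log_q(y, x))        # Hastings correction
        if random.random() < math.exp(min(0.0, log_alpha)):
            x = y
        chain.append(x)
    return chain

# Illustrative target: the Exp(1) density on (0, infinity), up to a constant.
def log_target(x):
    return -x if x > 0 else float("-inf")

# Multiplicative (lognormal) random walk: y = x * exp(N(0, sigma^2)).
SIGMA = 0.5

def propose(x):
    return x * math.exp(random.gauss(0.0, SIGMA))

def log_q(y, x):
    # log of the lognormal proposal density q(y | x), up to an additive constant
    return -math.log(y) - (math.log(y) - math.log(x)) ** 2 / (2 * SIGMA ** 2)

draws = metropolis_hastings(log_target, propose, log_q, x0=1.0, n_steps=20_000)
# The "difficult integral" here is E[X^2] under Exp(1), whose exact value is 2.
print(sum(x * x for x in draws) / len(draws))
```

When the proposal is symmetric, the correction term cancels and the update reduces to the Metropolis sketch given earlier.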