Stan: A Probabilistic Programming Language
Abstract
Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.2.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectation propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can be called from the command line, through R using the RStan package, or through Python using the PyStan package. All three interfaces support sampling or optimization-based inference and analysis, and RStan and PyStan also provide access to log probabilities, gradients, Hessians, and data I/O.
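The abstract's central claim, that a Stan program imperatively defines a log probability function over parameters conditioned on data, can be made concrete with a small example. The following Bernoulli model is an illustrative sketch, not taken from the paper; it uses the block syntax of the Stan 2.x era, and the names `N`, `y`, and `theta` are assumptions for the example.

```stan
// Illustrative sketch: binary outcomes with an unknown success probability.
data {
  int<lower=0> N;                // number of trials (observed)
  int<lower=0, upper=1> y[N];    // binary outcomes (observed)
}
parameters {
  real<lower=0, upper=1> theta;  // success probability, constrained to [0, 1]
}
model {
  theta ~ beta(1, 1);            // uniform prior on theta
  y ~ bernoulli(theta);          // likelihood; each statement increments the log density
}
```

Each sampling statement adds a term to the accumulated log density, so the program as a whole defines log p(theta | y) up to an additive constant; the interfaces can then draw samples of `theta` with the No-U-Turn sampler or compute a penalized maximum likelihood estimate with BFGS-style optimization.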
Similar Articles
Stan: A probabilistic programming language for Bayesian inference and optimization
Abstract Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models. It can be called from the command line, R, Python, Matlab, or Julia, and holds great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from the user's and developer's perspectives and illustrate with a simple but ...
A tutorial on fitting Bayesian linear mixed models using Stan
With the arrival of the R packages nlme and lme4, linear mixed models (LMMs) have come to be widely used in psychology, cognitive science, and related areas. In this tutorial, we provide a practical introduction to fitting LMMs in a Bayesian framework using the probabilistic programming language Stan. Although the Bayesian framework has several important advantages, specifying a Bayesian model ...
Deep Probabilistic Programming
We propose Edward, a Turing-complete probabilistic programming language. Edward builds on two compositional representations—random variables and inference. By treating inference as a first class citizen, on a par with modeling, we show that probabilistic programming can be as flexible and computationally efficient as traditional deep learning. For flexibility, Edward makes it easy to fit the sa...
SWIFT: Compiled Inference for Probabilistic Programs
One long-term goal for research on probabilistic programming languages (PPLs) is efficient inference using a single, generic inference engine. Many current inference engines are, however, interpreters for the given PP, leading to substantial overhead and poor performance. This paper describes a PPL compiler, Swift, that generates model-specific and inference-algorithm-specific target code from ...
Probabilistic Programming in Julia: New Inference Algorithms
In this thesis we look at the design and development of a Probabilistic Programming Language (PPL) in Julia named Turing and the challenges of implementing the Hamiltonian Monte Carlo (HMC) sampler inside the Turing framework. This dissertation starts with a review of three important fields behind the project, which are Bayesian inference, general inference algorithms and probabilistic programm...