Search results for: monte carlo integration

Number of results: 292262

Journal: Math. Comput. 2002
Ian H. Sloan, Frances Y. Kuo, Stephen Joe

We develop and justify an algorithm for the construction of quasi–Monte Carlo (QMC) rules for integration in weighted Sobolev spaces; the rules so constructed are shifted rank-1 lattice rules. The parameters characterising the shifted lattice rule are found “component-by-component”: the (d + 1)-th component of the generator vector and the shift are obtained by successive 1-dimensional searches...
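To illustrate the kind of rule being constructed, the sketch below evaluates a shifted rank-1 lattice rule with points x_i = frac(i·z/n + Δ). The generator vector z and shift Δ here are arbitrary assumptions for demonstration; the paper's contribution is the component-by-component search that finds good values, which is not reproduced.

```python
import numpy as np

def shifted_lattice_rule(f, n, z, shift):
    """Approximate the integral of f over [0,1]^d with an n-point
    shifted rank-1 lattice rule: x_i = frac(i * z / n + shift)."""
    i = np.arange(n)[:, None]                      # point indices 0..n-1
    points = (i * z[None, :] / n + shift) % 1.0    # shifted lattice points in [0,1)^d
    return np.mean(f(points))

# Example: f(x) = (1 + (x1 - 1/2)) * (1 + (x2 - 1/2)); exact integral is 1.
f = lambda x: np.prod(1.0 + (x - 0.5), axis=1)
z = np.array([1, 403])          # hypothetical generator vector
shift = np.array([0.3, 0.7])    # hypothetical shift
approx = shifted_lattice_rule(f, 1024, z, shift)
```

For smooth integrands such as this one, the lattice estimate is far more accurate than plain Monte Carlo with the same number of points.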

2006
Zhiqiang Tan

This article considers Monte Carlo integration under rejection sampling or Metropolis-Hastings sampling. Each algorithm involves accepting or rejecting observations from proposal distributions other than a target distribution. While taking a likelihood approach, we basically treat the sampling scheme as a random design, and define a stratified estimator of the baseline measure. We establish tha...
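For orientation, here is a minimal rejection-sampling Monte Carlo estimate (plain averaging over accepted draws; the stratified likelihood-based estimator the article develops is not shown). The target density, proposal, and envelope constant are assumptions chosen for simplicity: target p(x) = 2x on [0,1], proposal Uniform(0,1), envelope M = 2, so a draw x is accepted with probability p(x)/(M·q(x)) = x.

```python
import random

random.seed(0)

def rejection_sample(n):
    """Draw n samples from p(x) = 2x on [0,1] by rejection from Uniform(0,1)."""
    samples = []
    while len(samples) < n:
        x = random.random()
        if random.random() < x:        # accept with probability p(x) / (M * q(x))
            samples.append(x)
    return samples

xs = rejection_sample(20000)
# Monte Carlo estimate of E[X^2] under p; exact value is ∫ x^2 * 2x dx = 1/2.
estimate = sum(x * x for x in xs) / len(xs)
```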

2008
Zhiqiang Tan

There are two conceptually distinct tasks in Markov chain Monte Carlo (MCMC): a sampler is designed for simulating a Markov chain and then an estimator is constructed on the Markov chain for computing integrals and expectations. In this article, we aim to address the second task by extending the likelihood approach of Kong et al. for Monte Carlo integration. We consider a general Markov chain s...
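The two tasks the abstract distinguishes can be seen in a toy example: first simulate a chain with random-walk Metropolis targeting N(0,1), then average a function over the chain to estimate an expectation. The target, proposal width, and chain length are assumptions for illustration; Tan's likelihood-based estimator itself is not reproduced.

```python
import math
import random

random.seed(2)

def metropolis_chain(n, step=1.0):
    """Task 1: simulate a random-walk Metropolis chain targeting N(0,1)."""
    x, chain = 0.0, []
    for _ in range(n):
        prop = x + random.uniform(-step, step)
        # Accept with probability min(1, pi(prop)/pi(x)) for pi(x) ∝ exp(-x^2/2).
        if math.log(random.random()) < 0.5 * (x * x - prop * prop):
            x = prop
        chain.append(x)
    return chain

# Task 2: construct an estimator on the chain; E[X^2] = 1 under N(0,1).
chain = metropolis_chain(50000)
estimate = sum(x * x for x in chain) / len(chain)
```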

2015
Robert N. Gantner, Christoph Schwab

The efficient construction of higher-order interlaced polynomial lattice rules introduced recently in [6] is considered and the computational performance of these higher-order QMC rules is investigated on a suite of parametric, high-dimensional test integrand functions. After reviewing the principles of their construction by the “fast component-by-component” (CBC) algorithm due to Nuyens and Coo...

Journal: J. Complexity 1999
Stefan Heinrich Eugène Sindambiwe

The Monte Carlo complexity of computing integrals depending on a parameter is analyzed for smooth integrands. An optimal algorithm is developed on the basis of a multigrid variance reduction technique. The complexity analysis implies that our algorithm attains a higher convergence rate than any deterministic algorithm. Moreover, because of savings due to computation on multiple grids, this rate...

2015
Wei-Lun Chao Justin Solomon Dominik Ludewig Michels Fei Sha

We investigate numerical integration of ordinary differential equations (ODEs) for Hamiltonian Monte Carlo (HMC). High-quality integration is crucial for designing efficient and effective proposals for HMC. While the standard method is leapfrog (Störmer-Verlet) integration, we propose the use of an exponential integrator, which is robust to stiff ODEs with highly-oscillatory components. This os...
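The standard leapfrog (Störmer–Verlet) scheme the abstract mentions as the baseline is easy to sketch on a toy Hamiltonian H(q, p) = p²/2 + q²/2 (unit mass, quadratic potential); the exponential integrator the paper proposes is not shown. Leapfrog's appeal for HMC is that it is symplectic and time-reversible, so the energy error stays bounded over long trajectories:

```python
def leapfrog(q, p, grad_U, eps, steps):
    """Integrate Hamiltonian dynamics with step size eps for `steps` steps."""
    p -= 0.5 * eps * grad_U(q)        # initial half step in momentum
    for _ in range(steps - 1):
        q += eps * p                  # full step in position
        p -= eps * grad_U(q)          # full step in momentum
    q += eps * p
    p -= 0.5 * eps * grad_U(q)        # final half step in momentum
    return q, p

grad_U = lambda q: q                  # potential U(q) = q^2 / 2
q, p = leapfrog(1.0, 0.0, grad_U, eps=0.1, steps=100)
H = 0.5 * p * p + 0.5 * q * q         # energy; starts at 0.5 and stays close
```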

Journal: Journal of Computational Chemistry 2006
Genyuan Li Herschel Rabitz

The High-Dimensional Model Representation (HDMR) technique is a family of approaches to efficiently interpolate high-dimensional functions. RS(Random Sampling)-HDMR is a practical form of HDMR based on randomly sampling the overall function, and utilizing orthonormal polynomial expansions to approximate the RS-HDMR component functions. The determination of the expansion coefficients for the com...

2004
Ch. Schlier

Several test functions, whose variation could be calculated, were integrated with up to 10 trials using different low-discrepancy sequences in dimensions 3, 6, 12, and 24. The integration errors divided by the variation of the functions were compared with exact and asymptotic discrepancies. These errors follow an approximate power law, whose constant is essentially given by the variance of the ...
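A minimal example of quasi-Monte Carlo integration with a low-discrepancy sequence: the 2-D Halton sequence built from van der Corput sequences in bases 2 and 3. The test function f(x, y) = x·y is an assumption for illustration, not one of the paper's test functions.

```python
def van_der_corput(i, base):
    """i-th element of the van der Corput sequence in the given base:
    the base-`base` digits of i reflected about the radix point."""
    x, denom = 0.0, 1.0
    while i > 0:
        denom *= base
        i, rem = divmod(i, base)
        x += rem / denom
    return x

# QMC estimate of the integral of f(x, y) = x * y over [0,1]^2 (exact: 1/4)
# using the Halton sequence (bases 2 and 3).
n = 4096
est = sum(van_der_corput(i, 2) * van_der_corput(i, 3) for i in range(1, n + 1)) / n
```

The error of such an estimate decays roughly like (log n)^s / n for an s-dimensional Halton sequence, compared with n^(-1/2) for plain Monte Carlo.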

2004
PETER MATHÉ GANG WEI

In this paper we show that a wide class of integrals over Rd with a probability weight function can be evaluated using a quasi–Monte Carlo algorithm based on a proper decomposition of the domain Rd and arranging low discrepancy points over a series of hierarchical hypercubes. For certain classes of power/exponential decaying weights the algorithm is of optimal order.

2009
Avinash Kak

Prologue: The goal of this tutorial presentation is to focus on the pervasiveness of Monte-Carlo integration and importance sampling in Bayesian estimation, in general, and in particle filtering, in particular. This tutorial is a continuation of my tutorial “ML, MAP, and Bayesian — The Holy Trinity of Parameter Estimation and Data Prediction” that can be downloaded from:
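The importance-sampling identity at the heart of such tutorials is E_p[h(X)] = E_q[h(X)·p(X)/q(X)]: sample from a convenient proposal q and reweight. The target, proposal, and test function below are assumptions chosen so the exact answer is known, not an example from the tutorial itself.

```python
import math
import random

random.seed(1)

def p(x):
    """Target density: standard normal N(0, 1)."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

# Proposal q = Uniform(-5, 5), density 1/10 (the tails of p beyond ±5 are
# negligible). Estimate E_p[X^2], whose exact value is 1.
n = 100000
total = 0.0
for _ in range(n):
    x = random.uniform(-5.0, 5.0)
    w = p(x) / 0.1                # importance weight p(x) / q(x)
    total += (x * x) * w          # h(x) = x^2, reweighted
estimate = total / n
```

In particle filtering the same reweighting is applied sequentially, with the weights periodically resampled to avoid degeneracy.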
