Big Chemical Encyclopedia

Metropolis-Hastings algorithm

The Metropolis-Hastings algorithm is the most general form of the MCMC methods. It is also the easiest to conceptualize and code. An example of pseudocode is given in the five-step process below. The Markov chain process is clearly shown in the code, where samples generated from the prior distribution are accepted as arising from the posterior distribution at the ratio of the probability of the joint... [Pg.141]
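The five-step process referred to above can be sketched in Python. This is an illustrative implementation, not the book's own pseudocode; the function names, the random-walk proposal, and the standard-normal target in the usage example are assumptions.

```python
import numpy as np

def metropolis_hastings(log_target, propose, log_q, x0, n_steps, seed=0):
    """Generic Metropolis-Hastings sampler (illustrative sketch).

    log_target(x)  : log of the (unnormalized) target density, e.g. a posterior
    propose(x, rng): draws a candidate x_star given the current state x
    log_q(a, b)    : log q(a | b), the proposal density of a given b
    """
    rng = np.random.default_rng(seed)
    x = x0
    chain = [x]
    for _ in range(n_steps):
        x_star = propose(x, rng)                        # 1. generate a candidate
        log_r = (log_target(x_star) - log_target(x)     # 2. target ratio, corrected
                 + log_q(x, x_star) - log_q(x_star, x)) #    for proposal asymmetry
        if np.log(rng.uniform()) < min(0.0, log_r):     # 3. accept with prob min(r, 1)
            x = x_star                                  # 4. move; otherwise stay put
        chain.append(x)                                 # 5. record the state and repeat
    return np.array(chain)

# Usage: sample a standard normal with a symmetric random-walk proposal,
# for which the log_q terms cancel in log_r.
chain = metropolis_hastings(
    log_target=lambda x: -0.5 * x**2,
    propose=lambda x, rng: x + rng.normal(0.0, 1.0),
    log_q=lambda a, b: -0.5 * (a - b)**2,
    x0=0.0, n_steps=5000)
```

Rejected candidates repeat the current value in the chain, which is what makes the long-run distribution of the chain match the target.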

Thus, we take advantage of the accuracy, robustness and efficiency of the direct problem solution to tackle the associated inverse heat transfer problem [26, 27], towards the simultaneous estimation of momentum and thermal accommodation coefficients in micro-channel flows with velocity slip and temperature jump. A Bayesian inference approach is adopted in the solution of the identification problem, based on the Markov chain Monte Carlo (MCMC) method and the Metropolis-Hastings algorithm [28-30]. Only simulated temperature measurements at the external faces of the channel walls, obtained for instance via infrared thermography [30], are used in the inverse analysis in order to demonstrate the capabilities of the proposed approach. A sensitivity analysis allows for the inspection of the identification problem behavior when the external wall Biot number is also included among the parameters to be estimated. [Pg.40]

The Metropolis-Hastings algorithm uses an auxiliary probability density function, q(P*|P), from which it is easy to obtain sample values. Assuming that the chain is in a state P, a new candidate value, P*, is generated from the auxiliary distribution q(P*|P), given the current state of the chain P. [Pg.47]

In practical terms, this means that the simulation of a sample of f(P|Y) using the Metropolis-Hastings algorithm can be outlined as follows [29] ... [Pg.47]

In the third step, the candidate value P* is accepted as the new state of the chain with probability equal to the minimum of r and 1; otherwise, the chain remains at its current value. Thus, the algorithm always accepts steps that increase the density and occasionally accepts those that decrease it. The advantage of the asymmetric jumping rules in the Metropolis-Hastings algorithm is an increase in speed. [Pg.240]
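This accept/reject decision can be written as a single function. A minimal sketch, with illustrative names; g denotes the unnormalized target density and q the (possibly asymmetric) candidate density:

```python
import numpy as np

def mh_accept(current, candidate, log_target, log_q, rng):
    """One Metropolis-Hastings accept/reject decision.

    r = [g(candidate) q(current|candidate)] / [g(current) q(candidate|current)],
    and the candidate is kept with probability min(r, 1).
    log_q(a, b) is the log candidate density of a given b.
    """
    log_r = (log_target(candidate) - log_target(current)
             + log_q(current, candidate) - log_q(candidate, current))
    if np.log(rng.uniform()) < min(0.0, log_r):
        return candidate   # step accepted
    return current         # step rejected: the chain repeats the current value
```

Steps that increase the density have r >= 1, so min(r, 1) = 1 and they are always accepted, matching the text.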

Adaptive Markov Chain Monte Carlo Simulation: 2.5.3.1 Metropolis-Hastings Algorithm... [Pg.50]

The variational quantum Monte Carlo method (VMC) is both simpler and more efficient than the DMC method, but also usually less accurate. In this method the Rayleigh-Ritz quotient for a trial function is evaluated with Monte Carlo integration. The Metropolis-Hastings algorithm is used to sample the distribution... [Pg.242]
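As an illustration of the VMC idea (not taken from the cited work): for the 1D harmonic oscillator in units hbar = m = omega = 1, the Gaussian trial function psi(x) = exp(-a x^2) has local energy E_L(x) = a + x^2 (1/2 - 2 a^2). Metropolis sampling of psi^2 then estimates the Rayleigh-Ritz quotient as the mean local energy:

```python
import numpy as np

def vmc_energy(a, n_steps=20000, step=1.0, seed=0):
    """Variational MC energy for the 1D harmonic oscillator with trial
    function psi(x) = exp(-a x^2).  The Metropolis ratio for sampling
    |psi|^2 is exp(-2 a (x_new^2 - x^2))."""
    rng = np.random.default_rng(seed)
    x, energies = 0.0, []
    for _ in range(n_steps):
        x_new = x + step * rng.uniform(-1, 1)
        if rng.uniform() < np.exp(-2.0 * a * (x_new**2 - x**2)):
            x = x_new
        # local energy E_L = (H psi)/psi evaluated at the current sample
        energies.append(a + x**2 * (0.5 - 2.0 * a**2))
    return np.mean(energies)
```

At a = 0.5 the trial function is the exact ground state, E_L is constant, and the estimate equals 1/2 with zero variance; any other a gives a higher energy, as the variational principle requires.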

Another simulation approach often used that does not offer a simple deterministic time evolution of the system is the Metropolis Monte Carlo (MMC) method. Based on the Metropolis-Hastings algorithm [3, 4], MMC methods are weighted sampling techniques in which particles are randomly moved about to obtain a statistical ensemble of atoms with a particular probability distribution for some quantity. This is usually the energy but can also be other quantities such as experimental inputs that can be quickly calculated from an atomic... [Pg.145]
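A single MMC trial move under the usual Boltzmann acceptance criterion can be sketched as follows; the energy function and move parameters are placeholders, not taken from the cited work:

```python
import numpy as np

def mmc_step(positions, energy_fn, beta, max_disp, rng):
    """One Metropolis Monte Carlo trial: displace one randomly chosen
    particle and accept with probability min(1, exp(-beta * dE))."""
    i = rng.integers(len(positions))
    trial = positions.copy()
    trial[i] += rng.uniform(-max_disp, max_disp, size=positions.shape[1])
    dE = energy_fn(trial) - energy_fn(positions)
    if dE <= 0.0 or rng.uniform() < np.exp(-beta * dE):
        return trial, True     # move accepted
    return positions, False    # rejected: old configuration is counted again
```

Energy-lowering moves are always accepted; energy-raising moves are accepted with the Boltzmann weight, which drives the ensemble toward the canonical distribution at temperature 1/beta.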

Chib, S. & Greenberg, E. 1995. Understanding the Metropolis-Hastings Algorithm. The American Statistician, 49(4): 327-335. [Pg.1309]

In Chapter 7 we develop a method for finding a Markov chain that has good mixing properties. We will use the Metropolis-Hastings algorithm with a heavy-tailed independent candidate density. We then discuss the problem of statistical inference on the sample from the Markov chain. We want to base our inferences on an approximately random sample from the posterior. This requires that we determine... [Pg.21]

In Section 6.1 we show how the Metropolis-Hastings algorithm can be used to find a Markov chain that has the posterior as its long-run distribution in the case of a single parameter. There are two kinds of candidate densities we can use: random-walk candidate densities or independent candidate densities. We see how the chain... [Pg.128]
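The two kinds of candidate density can be contrasted in a short sketch; the Student-t independent candidate is an illustrative heavy-tailed choice (echoing Chapter 7), not the book's specific example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random-walk candidate: centered at the current value, so
# q(x_star | x) = q(x | x_star) and the candidate density cancels
# out of the acceptance ratio r = g(x_star) / g(x).
def random_walk_candidate(x, scale=1.0):
    return x + scale * rng.normal()

# Independent candidate: drawn without reference to the current value;
# then r = [g(x_star) / q(x_star)] / [g(x) / q(x)].  A heavy-tailed
# Student-t helps the candidate density dominate the posterior's tails.
def independent_candidate(df=4, loc=0.0, scale=2.0):
    return loc + scale * rng.standard_t(df)
```

The trade-off: a random-walk candidate explores locally and needs its scale tuned, while a well-chosen independent candidate can produce nearly independent draws but must cover the whole posterior.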

In the blockwise Metropolis-Hastings algorithm the candidate density for the block of parameters θj, given all the other parameters θ-j and the data y, must dominate the true conditional density in the tails. That is... [Pg.148]
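A blockwise update (Metropolis-within-Gibbs) can be sketched as follows, with each block here a single scalar parameter for simplicity; the function names and the random-walk candidates are illustrative assumptions:

```python
import numpy as np

def blockwise_mh(log_cond, x0, n_steps, scales, rng=None):
    """Blockwise Metropolis-Hastings sketch: update each parameter block
    in turn, conditioning on the current values of all the other blocks.

    log_cond(j, x) returns the log full conditional of block j given the
    rest of x, up to an additive constant.  Passing the log joint density
    also works, since terms not involving block j cancel in the ratio."""
    rng = rng or np.random.default_rng()
    x = np.array(x0, dtype=float)
    chain = [x.copy()]
    for _ in range(n_steps):
        for j in range(len(x)):
            cand = x.copy()
            cand[j] += scales[j] * rng.normal()       # random-walk candidate for block j
            log_r = log_cond(j, cand) - log_cond(j, x)
            if np.log(rng.uniform()) < min(0.0, log_r):
                x = cand                              # accept the block update
        chain.append(x.copy())
    return np.array(chain)
```

Cycling through the blocks this way leaves the joint posterior invariant, because each block update is itself a valid Metropolis-Hastings step on its full conditional.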

