Big Chemical Encyclopedia


Markov chain Monte Carlo sampling

Brown, S., Head-Gordon, T. Cool walking: a new Markov chain Monte Carlo sampling method. J. Comput. Chem. 2003, 24, 68-76. [Pg.75]

Suppose that configurations visited by a thermal trajectory, for example from Markov chain Monte Carlo, sample points uniformly in the allowed area. Let M be the total number of points involved and m(A) be the number... [Pg.103]

This is an example of a more general technique called Markov chain Monte Carlo sampling, where, instead of exhaustively searching a state space, one starts from a random state and moves through the space stochastically such that, in the limit of long time, each state is visited in proportion to its posterior probability. [Pg.385]
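As a concrete illustration, the stochastic walk described above can be sketched with the random-walk Metropolis algorithm, one standard way to construct such a Markov chain. The target density, proposal scale, and step counts below are illustrative assumptions, not taken from the source:

```python
import math
import random

def metropolis(log_target, x0, steps, scale=1.0, seed=42):
    """Random-walk Metropolis: in the long run, states are visited
    in proportion to exp(log_target(x))."""
    rng = random.Random(seed)
    x = x0
    chain = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)  # symmetric proposal
        # Accept with probability min(1, target(prop) / target(x)).
        if math.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        chain.append(x)
    return chain

# Illustrative target: a standard normal, via its log density
# (up to an additive constant). Start far from the mode on purpose.
chain = metropolis(lambda x: -0.5 * x * x, x0=5.0, steps=20000)
kept = chain[2000:]  # discard early draws still influenced by x0
mean = sum(kept) / len(kept)
```

After discarding the early part of the chain, the sample mean is close to the target's mean of zero, even though the walk started at 5.0.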

In this chapter we introduce Markov chains. These are a special type of stochastic process, which are processes that move around a set of possible values where the future values cannot be predicted with certainty. There is some chance element in the evolution of the process through time. The set of possible values is called the state space of the process. Markov chains have the "memoryless" property that, given the past and present states, the future state only depends on the present state. This chapter will give us the necessary background knowledge about Markov chains that we will need to understand Markov chain Monte Carlo sampling. [Pg.101]

A draw from the Markov chain after it has been running a long time can be considered a random draw from the posterior. This method for drawing a sample from the posterior is known as Markov chain Monte Carlo sampling. [Pg.102]

Markov Chain Monte Carlo Sampling from Posterior... [Pg.127]

MARKOV CHAIN MONTE CARLO SAMPLING FROM POSTERIOR... [Pg.128]

STATISTICAL INFERENCE FROM A MARKOV CHAIN MONTE CARLO SAMPLE... [Pg.160]

After we have let the chain run a long time, the state the chain is in does not depend on the initial state of the chain. This length of time is called the burn-in period. A draw from the chain after the burn-in time is approximately a random draw from the posterior. However, the sequence of draws from the chain after that time is not a random sample from the posterior; rather, it is a dependent sample. In Chapter 3 we saw how to do inference on the parameters using a random sample from the posterior. In Section 7.3 we will continue with that approach, using the Markov chain Monte Carlo sample from the posterior. We will have to thin the sample so that we can consider it approximately a random sample. A chain with good mixing properties will require a shorter burn-in period and less thinning. [Pg.160]
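The burn-in and thinning steps described above amount to simple slicing of the chain's output. In this sketch the draw values, burn-in length, and thinning interval are all placeholders; in practice they would be chosen by inspecting trace plots and autocorrelations:

```python
# Placeholder chain of 100 "draws"; a real chain would come from an
# MCMC sampler such as Metropolis-Hastings.
chain = list(range(100))

burn_in = 20   # discard draws made before the chain forgot its start
thin = 5       # keep every 5th draw to reduce serial dependence

# The retained draws are treated as approximately a random sample.
sample = chain[burn_in::thin]
print(len(sample))  # 16 draws remain out of the original 100
```

Note that both steps discard computer-generated draws, not data: running the chain longer recovers as many retained draws as needed.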

Markov chain Monte Carlo samples are not independent random samples, unlike samples drawn directly from the posterior by acceptance-rejection sampling. This makes it more difficult to do inference from the Markov chain Monte Carlo sample. In this section we discuss the differing points of view on this problem. [Pg.168]

Advocates of the first point of view would consider this very inefficient: we discard a number of draws equal to the burn-in time for each value entered into the random sample. They would say this throws away information. However, it is not data that is being discarded, but computer-generated Markov chain Monte Carlo draws. If we want more, we can get them by running the Markov chain longer. [Pg.169]

Examining the sample autocorrelation function of the Markov chain Monte Carlo sample. In Section 7.1 we saw that the trace plots of the Markov chain output differ for the different types of candidate distributions. A random-walk candidate distribution leads to a high proportion of accepted candidates but relatively small moves at each step, which can lead to high autocorrelations at low lags. With an independent candidate distribution there is a much smaller proportion of accepted candidates but relatively large moves. Since many candidates are not... [Pg.169]
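The sample autocorrelation function mentioned here is straightforward to compute directly. In this sketch an AR(1) series stands in for MCMC output; its coefficient of 0.9 is an illustrative choice that produces the high low-lag autocorrelation typical of a random-walk candidate distribution:

```python
import random

def sample_acf(x, max_lag):
    """Sample autocorrelation of the sequence x at lags 0..max_lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    acf = []
    for k in range(max_lag + 1):
        # Autocovariance at lag k, normalized by the lag-0 value.
        cov = sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k))
        acf.append(cov / var)
    return acf

# Stand-in for highly autocorrelated MCMC output: an AR(1) series.
rng = random.Random(0)
x = [0.0]
for _ in range(5000):
    x.append(0.9 * x[-1] + rng.gauss(0.0, 1.0))

acf = sample_acf(x, 10)
# acf[0] is exactly 1; the autocorrelation decays as the lag grows.
```

The lag at which the autocorrelation becomes negligible is one common guide for choosing the thinning interval.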

Example 8 (continued) Suppose we take a Markov chain Monte Carlo sample from... [Pg.170]


See other pages where Markov chain Monte Carlo sampling is mentioned: [Pg.171]    [Pg.171]    [Pg.20]    [Pg.21]    [Pg.154]    [Pg.154]    [Pg.154]    [Pg.154]    [Pg.155]    [Pg.155]    [Pg.155]    [Pg.155]    [Pg.156]    [Pg.156]    [Pg.156]    [Pg.168]    [Pg.169]   
See also in source #XX -- [ Pg.171 ]








Markov

Markov chain

Markov chain Monte Carlo

Markovic

Monte Carlo sampling

Monte Markov chain

Monte-Carlo chains

© 2024 chempedia.info