Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Markov chain theory probability matrix

The basic elements of Markov-chain theory are the state space, the one-step transition probability matrix (also termed the policy-making matrix) and the initial state vector, also termed the initial probability function. In order to develop a portion of the theory of Markov chains in the following, some definitions are made and basic probability concepts are mentioned. [Pg.27]
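The three elements named above can be sketched numerically; the two-state chain and the values of P and p0 below are illustrative assumptions, not taken from the source.

```python
import numpy as np

# A hypothetical two-state chain: the state space is {0, 1}, P is the
# one-step transition probability matrix (each row sums to 1), and p0
# is the initial state vector (initial probability function).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
p0 = np.array([1.0, 0.0])   # the chain starts in state 0

# The state distribution after n steps is p0 @ P^n.
p1 = p0 @ P                               # distribution after one step
p3 = p0 @ np.linalg.matrix_power(P, 3)    # distribution after three steps

print(p1)  # [0.9 0.1]
print(p3)
```

The row-stochastic property of P (rows summing to one) is exactly what makes each row a valid one-step probability distribution.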

MARKOV CHAIN THEORY DEFINITION OF THE PROBABILITY MATRIX... [Pg.238]

The instantaneous composition of a copolymer X formed at a monomer mixture composition x coincides, provided the ideal model is applicable, with the stationary vector π of the matrix Q with the elements (8). The mathematical apparatus of the theory of Markov chains permits one to write out immediately the expression for the probability of any sequence P{Uk} in macromolecules formed at a given x. This provides an exhaustive solution to the problem of sequence distribution for copolymers synthesized at initial conversions p ≪ 1, when the monomer mixture composition x has had no time to deviate noticeably from its initial value x°. As for the high-conversion copolymerization products, they evidently represent a mixture of Markovian copolymers prepared at different times, i.e. under different concentrations of monomers in the reaction system. Consequently, in order to calculate the probability of a certain sequence Uk, it is necessary to average its instantaneous value P{Uk} over all conversions preceding the conversion p up to which the synthesis was conducted. [Pg.177]
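A minimal sketch of the two computations described above, namely the stationary vector π of Q and the probability of a unit sequence Uk for a first-order Markov chain; the 2×2 matrix Q and the sequence chosen are illustrative assumptions, not the elements (8) of the source.

```python
import numpy as np

# Hypothetical transition matrix Q for a binary copolymer
# (states 0 and 1 stand for the two monomer units).
Q = np.array([[0.7, 0.3],
              [0.5, 0.5]])

# The stationary vector pi solves pi = pi Q: take the left eigenvector
# of Q for eigenvalue 1 (i.e. the eigenvector of Q.T) and normalise it
# so its components sum to 1.
w, V = np.linalg.eig(Q.T)
pi = np.real(V[:, np.argmin(np.abs(w - 1.0))])
pi = pi / pi.sum()

def seq_prob(seq):
    """Probability P{Uk} of the unit sequence seq in a first-order
    Markov chain: pi[u0] * Q[u0, u1] * Q[u1, u2] * ..."""
    p = pi[seq[0]]
    for a, b in zip(seq, seq[1:]):
        p *= Q[a, b]
    return p

print(pi)                  # stationary composition
print(seq_prob((0, 1, 1))) # probability of the triad 0-1-1
```

For high-conversion products one would, as the text says, average seq_prob over the conversions at which the chains were formed, with Q evaluated at the monomer composition prevailing at each conversion.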

Recalling the well-known theorem of the standard theory of ergodic Markov chains, one can state the following (e.g. Feller [4]): in any finite, irreducible, aperiodic Markov chain with the transition matrix P, the limit of the matrix powers Pʳ exists as r tends to infinity. This limit matrix has identical rows; each row is the stationary probability vector of the Markov chain, v = [v1, v2, ..., vR], that is, v = vP; furthermore vi > 0 (i = 1, ..., R) and ... [Pg.663]
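The limit behaviour stated in the theorem can be checked numerically; the matrix below is an illustrative finite, irreducible, aperiodic example, and a moderately large power stands in for the limit r → ∞.

```python
import numpy as np

# An illustrative finite, irreducible, aperiodic transition matrix P.
P = np.array([[0.7, 0.3],
              [0.5, 0.5]])

# P^r for large r: all rows become (numerically) identical,
# and each row equals the stationary probability vector v with v = v P.
Plim = np.linalg.matrix_power(P, 50)
v = Plim[0]

print(Plim)
print(v @ P)   # equals v, confirming v = v P
```

Irreducibility and aperiodicity are what guarantee the limit exists and is independent of the starting state, which is why every row of the limit matrix is the same vector v.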


See other pages where Markov chain theory probability matrix is mentioned: [Pg.241], [Pg.164], [Pg.283], [Pg.284], [Pg.315], [Pg.316], [Pg.170], [Pg.161]
See also in source #XX -- [Pg.238, Pg.254]





© 2024 chempedia.info