Big Chemical Encyclopedia


Markov chain models

Dazhi S, Xuqian L (2010) Application of Markov chain model on environmental fate of phenanthrene in soil and groundwater. Procedia Environ Sci 2:814-823. [Pg.70]

J. H. B. Kemperman, The Passage Problem for a Stationary Markov Chain (University of Chicago Press, Chicago, 1961); J. Keilson, Markov Chain Models (Springer, New York, 1979). [Pg.292]

It should be noted that the matrix K does not involve the hold-ups at all. The same construction rule was suggested in [8,9] on the basis of a balance of particle-fraction flows alone, without reference to Markov chain models. Here it is obtained as a particular case of the developed model, which is represented by the matrix P and, in the general case, also describes the evolution of the hold-ups, provided the smaller sub-matrices of P are known. [Pg.270]
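The evolution of state occupancies under a one-step transition matrix P can be sketched as follows (a minimal illustration; the 3x3 matrix below is an assumed example, not the matrix of the cited model):

```python
import numpy as np

# Assumed 3-state one-step transition probability matrix (rows sum to 1);
# the entries are illustrative only.
P = np.array([
    [0.8, 0.2, 0.0],
    [0.1, 0.7, 0.2],
    [0.0, 0.1, 0.9],
])

# Initial state vector: all probability mass in state 1.
s = np.array([1.0, 0.0, 0.0])

# One-step evolution of the state occupation probabilities: s_{k+1} = s_k P.
for _ in range(200):
    s = s @ P

print(s)  # converges toward the stationary distribution of P
```

Repeated multiplication by P drives the state vector to the stationary distribution, which is why the one-step matrix alone characterizes the long-run behaviour of the process.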

There are many advantages of using the discrete Markov-chain model in Chemical Engineering, as follows. [Pg.8]

The simplest model [73, p. 180] of a two-impinging-stream reactor is shown schematically in Fig. 4.5-1. The LHS shows the actual configuration of the reactor and the RHS the Markov-chain model. The latter rests on the following considerations and assumptions... [Pg.464]

F. D. Anita and S. Lee, The Effect of Stoichiometry on Markov Chain Models for Chemical Reaction Kinetics, Chemical Engineering Science, 40, 1969-1971 (1985). [Pg.600]

There are many advantages, detailed in Chapter 1, of using the discrete Markov-chain model in Chemical Engineering. Probably the most important is that physical models can be presented in a unified description via a state vector and a one-step transition probability matrix. Consequently, a process is described solely by the probability of the system occupying, or not occupying, a state. William Shakespeare put it profoundly: "to be (in a state) or not to be (in a state), that is the question". [Pg.611]

If a subject did not take his/her medication given that it was not taken the time before, or did take the medication given that it was taken at the previous dosing time, a two-state Markov chain model can be fully defined by two conditional probabilities. Defining Y = (y_1, y_2, ..., y_n) as a random vector indicating whether a patient/subject has not taken his/her medication (y_i = NT) or has taken it (y_i = T) at the ith time, then... [Pg.168]
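A minimal simulation of such a two-state chain might look like this (the two conditional probabilities below are assumed values chosen for illustration, not estimates from adherence data):

```python
import random

# The two conditional probabilities that fully define the chain (assumed values):
p_nt_given_nt = 0.6  # P(not taken | not taken at the previous dosing time)
p_t_given_t = 0.9    # P(taken | taken at the previous dosing time)

def simulate(n, start="T", seed=0):
    """Generate a sequence y_1..y_n of 'T'/'NT' dosing outcomes."""
    rng = random.Random(seed)
    y = [start]
    for _ in range(n - 1):
        if y[-1] == "NT":
            y.append("NT" if rng.random() < p_nt_given_nt else "T")
        else:
            y.append("T" if rng.random() < p_t_given_t else "NT")
    return y

seq = simulate(20)
print(seq)
```

Because the next outcome depends only on the previous one, the two conditional probabilities above are all that is needed to generate (or fit) a full adherence trajectory.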

S. Jain, Markov chain model and its applications. Comput Biomed Res 19:374-378 (1986). [Pg.697]

B. Kemp and H. A. C. Kamphuisen, Simulation of human hypnograms using a Markov chain model. Sleep 9:405-414 (1986). [Pg.697]

There are some other weight functions used to search for functional signals; for example, weights can be obtained by optimization procedures such as perceptrons or neural networks [29, 30]. Different position-specific probability distributions p can also be considered. One typical generalization is to use position-specific probability distributions p_i of k-base oligonucleotides (instead of mononucleotides); another is to exploit Markov chain models, in which the probability of generating a particular nucleotide x_i of the signal sequence depends on the k - 1 previous bases (i.e.... [Pg.87]

Hulliger, J., "Polarity formation Markov chain model", in Encyclopedia of Supramolecular Chemistry, Atwood,... [Pg.403]

Then a discrete-time Markov chain model, with the system observed just prior to the beginning of transfer, can be used to show that the line efficiency is given by... [Pg.1647]

Inference of Markov Chain Models by Using k-Testable Language: Application on Aging People [Pg.89]

From a set of labelled (positive-only) time-stamped event sequences, the problem to solve is to find the automaton model that most likely produced the data. We do not want to learn (identify) a DRTA with time constraints as in [22], because we do not have exactly the same problem as a real-time system. We only have timed strings, from which we propose to automatically deduce a Markov chain model. [Pg.95]

Converting this probabilistic DFA into a Markov chain model. [Pg.96]

In order to obtain the Markov chain model, we have to compute the probabilities ... [Pg.102]
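One plausible way to obtain such transition probabilities is as relative frequencies of consecutive symbols in the observed event sequences (a sketch under that assumption; the sequences below are placeholders, not data from the cited study):

```python
from collections import Counter, defaultdict

# Placeholder event sequences standing in for the observed timed strings.
sequences = ["abab", "abba", "aab"]

# Count one-step transitions between consecutive symbols.
counts = defaultdict(Counter)
for s in sequences:
    for prev, cur in zip(s, s[1:]):
        counts[prev][cur] += 1

# Normalize each row of counts into transition probabilities.
probs = {
    state: {sym: c / sum(ctr.values()) for sym, c in ctr.items()}
    for state, ctr in counts.items()
}
print(probs)
```

Each state's outgoing probabilities sum to one by construction, which is exactly the property required of a Markov chain's transition matrix.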

Although at first sight they appear different, 1D Markov chain models are formally equivalent to Ising models. In the Ising formulation it is more convenient to change from the (0, 1) variables to σ_i ∈ {−1, +1} variables, where... [Pg.460]

This simple pair-interaction Ising model is equivalent to the Markov chain model of Eq. 9 for m_A = 0.5. The interaction parameter J is related to the correlation μ by the following... [Pg.460]
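A quick numerical check of this equivalence for the symmetric case m_A = 0.5: for a two-state chain in the σ_i = ±1 variables with persistence probability p (an assumed parameter here, not a value from the text), the nearest-neighbour correlation μ = ⟨σ_i σ_{i+1}⟩ equals p·(+1) + (1−p)·(−1) = 2p − 1:

```python
import random

p = 0.8       # assumed probability that sigma_{i+1} equals sigma_i
n = 200_000   # chain length

rng = random.Random(1)
sigma = [1]
for _ in range(n - 1):
    sigma.append(sigma[-1] if rng.random() < p else -sigma[-1])

# Empirical nearest-neighbour correlation <sigma_i sigma_{i+1}>.
corr = sum(a * b for a, b in zip(sigma, sigma[1:])) / (n - 1)
print(corr)  # should be close to mu = 2p - 1 = 0.6
```

The empirical correlation of the simulated chain matches the analytic μ, which is the quantity the Ising coupling J is expressed through in the relation referred to above.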

We consider for simplicity the displacement equivalent of the simple 1D Markov chain model given in Eq. 9. We suppose that x_i is now a zero-mean, normally distributed random variable that represents the longitudinal displacement of site i from its regular position on a 1D lattice... [Pg.460]








