Big Chemical Encyclopedia



Markovic

Ross Kindermann and J. Laurie Snell. Markov random fields and their applications. Contemporary Mathematics v. 1. American Mathematical Society, 1980. [Pg.120]

A. Mohammad-Djafari and M. Nikolova. Eddy current tomography using binary Markov model. Signal Processing, 49:119-132, 1996. [Pg.333]

While the Smoluchowski equation is necessary for a Markov process, in general it is not sufficient, but known counter-examples are always non-Gaussian as well. [Pg.694]

Using W2 = W1P2, (A3.2.8) and (A3.2.9) may be used to satisfy the Smoluchowski equation, (A3.2.2), another necessary property for a stationary process. Thus u(t) is an example of a stationary Gaussian-Markov... [Pg.695]
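
For orientation, here is a hedged sketch of the Smoluchowski (Chapman-Kolmogorov) condition invoked in the two excerpts above, written for the two-time conditional probability P2 of a stationary process; the symbols x_i, t_i are generic placeholders rather than the numbered equations of the cited source:

```latex
P_2(x_1, t_1 \mid x_3, t_3)
  = \int \mathrm{d}x_2 \, P_2(x_1, t_1 \mid x_2, t_2)\, P_2(x_2, t_2 \mid x_3, t_3),
  \qquad t_1 < t_2 < t_3 .
```

A Markov process must satisfy this relation, but, as the excerpt notes, satisfying it does not by itself guarantee the Markov property.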

Markov process [12]. Denoting the inverse of R_ij by ... and using the definition... [Pg.697]

A proposal based on Onsager's theory was made by Landau and Lifshitz [27] for the fluctuations that should be added to the Navier-Stokes hydrodynamic equations. Fluctuating stress tensor and heat flux terms were postulated in analogy with the Onsager theory. However, since this is a case where the variables are of mixed time reversal character, the derivation was not fully rigorous. This situation was remedied by the derivation by Fox and Uhlenbeck [13, 14, 18] based on general stationary Gaussian-Markov processes [12]. The precise form of the Landau proposal is confirmed by this approach [14]. [Pg.705]

Onsager's theory can also be used to determine the form of the fluctuations for the Boltzmann equation [15]. Since hydrodynamics can be derived from the Boltzmann equation as a contracted description, a contraction of the fluctuating Boltzmann equation determines fluctuations for hydrodynamics. In general, a contraction of the description creates a new description which is non-Markovian, i.e. has memory. The Markov... [Pg.707]

Green M S 1954 Markov random processes and the statistical mechanics of time-dependent phenomena. II. Irreversible processes in fluids J. Chem. Phys. 22 398... [Pg.715]

To conclude this section it should be pointed out again that the friction coefficient has been considered to be frequency independent as implied in assuming a Markov process, and that zero-frequency friction as represented by solvent viscosity is an adequate parameter to describe the effect of friction on observed reaction rates. [Pg.851]

The key quantity in barrier crossing processes in this respect is the barrier curvature ω_b, which sets the time window for possible influences of the dynamic solvent response. A sharp barrier entails short barrier passage times during which the memory of the solvent environment may be partially maintained. This non-Markov situation may be expressed by a generalized Langevin equation including a time-dependent friction kernel γ(t) [ ]... [Pg.852]
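
As a hedged reminder of the equation type being referred to, one common form of a generalized Langevin equation with a time-dependent friction kernel γ(t) is the following; the potential V and random force R(t) are generic symbols, not taken from the cited reference:

```latex
m\,\dot v(t) = -\frac{\partial V}{\partial x}
  - m \int_{0}^{t} \gamma(t-\tau)\, v(\tau)\, \mathrm{d}\tau + R(t),
  \qquad \langle R(0)\,R(t) \rangle = m\,k_{\mathrm B}T\,\gamma(t).
```

In the memoryless limit γ(t) → 2γ₀δ(t) this reduces to the ordinary Langevin equation with the frequency-independent friction coefficient assumed in the Markov case discussed above.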

The Boltzmann weight appears implicitly in the way the states are chosen. The form of the above equation is like a time average as calculated in MD. The MC method involves designing a stochastic algorithm for stepping from one state of the system to the next, generating a trajectory. This will take the form of a Markov chain, specified by transition probabilities which are independent of the prior history of the system. [Pg.2256]
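
A minimal sketch of such a stochastic stepping algorithm in the Metropolis style, assuming a toy one-dimensional energy function; the names energy, beta and step_size are illustrative choices, not taken from the source:

```python
import math
import random

def energy(x):
    """Toy potential; stands in for the system's energy function."""
    return x * x

def metropolis_step(x, beta=1.0, step_size=0.5):
    """One Markov-chain move: the transition probability depends only on
    the current state x, never on the prior history of the chain."""
    x_trial = x + random.uniform(-step_size, step_size)
    delta_e = energy(x_trial) - energy(x)
    # Accept with the Boltzmann weight, so states are visited with
    # probability proportional to exp(-beta * E).
    if delta_e <= 0 or random.random() < math.exp(-beta * delta_e):
        return x_trial
    return x

# Generate a short trajectory (Markov chain) and average an observable,
# analogous to the time average computed in MD.
x, samples = 0.0, []
for _ in range(10000):
    x = metropolis_step(x)
    samples.append(x * x)
print(sum(samples) / len(samples))  # estimate of <x^2> at beta = 1
```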

Markovic N and Billing G D 1997 Semi-classical treatment of chemical reactions extension to 3D wave packets Chem. Phys. 224 53... [Pg.2329]

We have tacitly assumed that the rate constants depend only on the last unit of the chain. In such a situation, the copolymerization is called a Markov copolymerization of first order. The special case (i), r1r2 = 1, is a Markov copolymerization of order zero. If reactivity also depends on the penultimate unit of the chain, the polymerization is a Markov copolymerization of second order. [Pg.2516]
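
A minimal sketch of first-order Markov (terminal-model) chain growth; the reactivity ratios, feed composition and helper names below are illustrative assumptions, not values from the source:

```python
import random

def transition_probs(r1, r2, f1):
    """First-order Markov (terminal model): the probability of adding the
    next monomer depends only on the last unit of the chain."""
    f2 = 1.0 - f1
    p11 = r1 * f1 / (r1 * f1 + f2)   # chain ending in 1 adds another 1
    p22 = r2 * f2 / (r2 * f2 + f1)   # chain ending in 2 adds another 2
    return p11, p22

def grow_chain(n, r1=2.0, r2=0.5, f1=0.5):
    """Grow an n-unit chain, assuming a constant monomer feed composition."""
    p11, p22 = transition_probs(r1, r2, f1)
    chain = [1 if random.random() < f1 else 2]   # assumed start: pick from feed
    for _ in range(n - 1):
        if chain[-1] == 1:
            chain.append(1 if random.random() < p11 else 2)
        else:
            chain.append(2 if random.random() < p22 else 1)
    return chain

print(grow_chain(20))
```

Setting r1r2 = 1 makes the conditional probability of adding monomer 1 the same for both chain ends, recovering the order-zero (Bernoullian) case mentioned above.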

Tidswell I M, Lucas C A, Markovic N M and Ross P N 1995 Surface structure determination using anomalous x-ray scattering: underpotential deposition of copper on Pt(111) Phys. Rev. B 51 10205-8 [Pg.2757]

Markovic N M, Gasteiger H A and Ross P N 1995 Copper electrodeposition on Pt(111) in the presence of chloride and (bi)sulphate Rotating ring-Pt(111) disk electrode studies Langmuir 11 4098-108... [Pg.2759]

P. Deuflhard, W. Huisinga, A. Fischer, Ch. Schütte. Identification of Almost Invariant Aggregates in Nearly Uncoupled Markov Chains. Preprint SC 98-03, Konrad-Zuse-Zentrum, Berlin (1998)... [Pg.115]

The equilibrium distribution of the system can be determined by considering the result of applying the transition matrix an infinite number of times. This limiting distribution of the Markov chain is given by ρ_limit = lim(N→∞) ρ(1) π^N... [Pg.431]

Closely related to the transition matrix is the stochastic matrix, whose elements are labelled α_mn. This matrix gives the probability of choosing the two states m and n between which the move is to be made. It is often known as the underlying matrix of the Markov chain. If the probability of accepting a trial move from m to n is ..., then the probability of making a transition from m to n (π_mn) is given by multiplying the probability of choosing states... [Pg.431]
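
A minimal sketch tying the two excerpts above together: build a Metropolis-style transition matrix π from a symmetric underlying matrix α and acceptance probabilities, then apply it repeatedly to an initial distribution to approximate the limiting (equilibrium) distribution. The three-state energies and the uniform α are made-up illustrative inputs, not from the source:

```python
import numpy as np

# Illustrative three-state system with assumed (dimensionless) energies.
energies = np.array([0.0, 1.0, 2.0])
boltzmann = np.exp(-energies)

n = len(energies)
alpha = np.full((n, n), 1.0 / n)   # underlying matrix: probability of choosing the pair (m, n)

# Metropolis acceptance: accept m -> n with min(1, rho_n / rho_m).
pi = np.zeros((n, n))
for m in range(n):
    for k in range(n):
        if k != m:
            pi[m, k] = alpha[m, k] * min(1.0, boltzmann[k] / boltzmann[m])
    pi[m, m] = 1.0 - pi[m].sum()   # rejected moves leave the system in state m

# Apply the transition matrix many times to an arbitrary starting distribution.
rho = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    rho = rho @ pi

print(rho)                          # approaches the limiting distribution
print(boltzmann / boltzmann.sum())  # target Boltzmann distribution
```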

Fig. 10.22 Hidden Markov model used for protein sequence analysis. ... are match states (corresponding in this...
Mian, K Sjölander and D Haussler 1994. Hidden Markov Models in Computational Biology: Applications to Protein Modelling. Journal of Molecular Biology 235:1501-1531). [Pg.553]

Having built a hidden Markov model for a particular family of proteins, it can then be used to search a database. A score is computed for each sequence in the database and those sequences that score significantly more than other sequences of a similar length are identified. This was demonstrated for two key families of proteins, globins and kinases, in the original paper [Krogh et al. 1994]. For the kinases, 296 sequences with a Z score above... [Pg.553]
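
A minimal sketch of scoring a sequence against a hidden Markov model with the forward algorithm in log space; the toy two-state model, alphabet and probabilities are invented for illustration and are far simpler than the profile HMMs of Krogh et al. [1994]:

```python
import math

# Toy two-state HMM over a two-letter alphabet (illustrative numbers only).
states = ("M", "I")                        # e.g. "match" and "insert"
start = {"M": 0.6, "I": 0.4}
trans = {"M": {"M": 0.7, "I": 0.3},
         "I": {"M": 0.4, "I": 0.6}}
emit = {"M": {"A": 0.9, "B": 0.1},
        "I": {"A": 0.2, "B": 0.8}}

def _logsumexp(xs):
    """Numerically stable log of a sum of exponentials."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def log_likelihood(seq):
    """Forward algorithm: log P(sequence | model), summed over all state paths."""
    f = {s: math.log(start[s] * emit[s][seq[0]]) for s in states}
    for sym in seq[1:]:
        f = {s: math.log(emit[s][sym]) +
                _logsumexp([f[p] + math.log(trans[p][s]) for p in states])
             for s in states}
    return _logsumexp(list(f.values()))

# In a database search, such scores would be compared across sequences of
# similar length, e.g. converted to Z scores.
print(log_likelihood("AABA"))
```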

Eddy S R 1996. Hidden Markov Models. Current Opinion in Structural Biology 6 361-365. [Pg.575]

Rabiner L R 1989. A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition. Proceedings of the IEEE 77 257-86. [Pg.577]


See other pages where Markovic is mentioned: [Pg.114]    [Pg.692]    [Pg.692]    [Pg.693]    [Pg.693]    [Pg.693]    [Pg.694]    [Pg.833]    [Pg.848]    [Pg.2257]    [Pg.2758]    [Pg.2759]    [Pg.2759]    [Pg.91]    [Pg.560]    [Pg.632]    [Pg.737]    [Pg.749]    [Pg.430]    [Pg.552]    [Pg.552]    [Pg.564]    [Pg.239]    [Pg.479]    [Pg.480]   
See also in source #XX -- [Pg.120]

See also in source #XX -- [Pg.23, Pg.533]

See also in source #XX -- [Pg.120]







Adaptive Markov Chain Monte Carlo Simulation

Aperiodic Markov chains

Applications of Markov Chains in Chemical Reactions

Applications of Markov Chains in Chemical Reactors

Chain copolymerization first-order Markov model

Compliance Markov

Composite Markov process

Continuous Markov process

Continuous Markov processes, probability

Continuous Markov processes, probability times

Continuous-lag Markov Chains

Conventional Markov-chain Monte Carlo sampling

Copolymers with first-order Markov sequence distributions

Esin and Markov coefficient

Esin-Markov coefficients

Esin-Markov effect

Finite Markov processes

First-order Markov

First-order Markov distributions

First-order Markov mechanism

First-order Markov model

First-order Markov model copolymers

First-order Markov model sequence distributions

First-order Markov process

First-order Markov statistics

Fractals, Markov

Frequency analysis Markov models

Fundamentals of Markov Chains

Gauss-Markov conditions

Gauss-Markov theorem

Gaussian Markov process

Generalized hidden Markov model

Gibbs sampler, Markov chain Monte Carlo

Gibbs sampler, Markov chain Monte Carlo methods

Going Forward with Markov Chain Monte Carlo

Hidden Markov methods

Hidden Markov model, domain

Hidden Markov model, domain alignments

Hidden Markov model states

Hidden Markov model training

Hidden Markov model-based method

Hidden Markov models, labelling

Hidden Markov models, synthesis from

Hidden Markov models technique

Hidden Markov tree

Hidden semi-Markov model

Hierarchical Markov models

Higher-order Markov chains

Homogeneous Markov process

Integration, method Markov chains

Irreducibility of Finite Markov Chains

Markov

Markov Analyses

Markov CTRW Models

Markov Chain Monte Carlo Sampling from Posterior

Markov Chain Theory Definition of the Probability Matrix

Markov Chain model

Markov Chains Discrete in Time and Space

Markov Chains with Continuous State Space

Markov Decision Process

Markov Localization

Markov Method

Markov Modeling

Markov approximation

Markov approximation analysis

Markov approximation continuous processes

Markov approximation evolution times

Markov approximation probability

Markov approximation systems

Markov approximation techniques

Markov assumption

Markov chain

Markov chain Monte Carlo

Markov chain Monte Carlo method

Markov chain Monte Carlo sampling

Markov chain Monte Carlo simulation

Markov chain analysis

Markov chain assumptions

Markov chain continuous state space

Markov chain continuous time

Markov chain discrete time

Markov chain mechanism, first order

Markov chain principles

Markov chain theory

Markov chain theory definition

Markov chain theory probabilities

Markov chain theory probability matrix

Markov chains Metropolis algorithm

Markov chains detailed balance

Markov chains ergodic

Markov chains first-order

Markov chains in Monte Carlo

Markov chains irreducible

Markov chains second-order

Markov chains steady state equation

Markov chains time invariant

Markov chains time-reversible

Markov chains, processes

Markov connection

Markov decision models

Markov diffusion

Markov dynamics

Markov equations

Markov evolution

Markov growth model

Markov jump process

Markov limit

Markov master equation

Markov matrix

Markov matrix algorithm

Markov matrix sampling

Markov mechanism

Markov model

Markov model copolymerization

Markov model solution techniques

Markov model stereochemistry

Markov modelling technique

Markov models, hidden

Markov process

Markov processes diffusion process

Markov processes discrete

Markov processes master equation

Markov property

Markov renewal

Markov second-order mechanism

Markov statistics configurational sequences

Markov stochastic processes

Markov theorem

Markov transition model

Markov walk

Markov, George

Markov, Georgi

Markov-type stochastic process

Markov’s method

Metropolis algorithm, Markov chain Monte

Metropolis-Hastings algorithms, Markov

Monte Carlo Markov process

Monte Markov chain

Non-Markov processes

Probability distribution continuous Markov processes

Propagation steps Markov model

Sampling from a Markov Chain

Second-order Markov

Second-order Markov model

Semi-Markov process

Stationary Markov process

Statistic Markov

Stochastic simulation Markov chain

Stochastic simulation Markov process

Stochastic stationary Markov process

The Markov chain theory for ternary systems

The Random-Walk Markov Matrix

Time homogeneous Markov process

Time-Invariant Markov Chains with Finite State Space

Time-Reversible Markov Chains and Detailed Balance

Transitional Markov chain Monte Carlo

Transitional Markov chain Monte Carlo simulation

Wavelet-Domain Hidden Markov Models

Zero-order Markov model
