Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Statistical methods Gaussian probability

In the previous section, the Nernst-Planck equation was developed from the macroscopic flux density and mass conservation equations [(71) and (73), respectively]. The same equation can also be derived by statistical methods, which describe the probability of finding a particle within a volume region at a time t, given an initial distribution and a set of jump probabilities. For the simplest case, in one dimension, with equal probabilities of the particle making a jump to the right or to the left, the time evolution of an initial delta function in concentration at x = 0 is Gaussian ... [Pg.44]
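This limiting behavior is easy to verify numerically. The sketch below is illustrative only (the particle count, step count, and unit step length are arbitrary choices, not values from the text): many independent one-dimensional walkers all start at x = 0 with equal left/right jump probabilities, and the resulting position distribution has a mean near zero and a variance growing linearly with the number of steps, the signature of Gaussian (diffusive) spreading.

```python
import random
import statistics

def random_walk_positions(n_particles, n_steps, step=1.0):
    """Positions after n_steps of an unbiased 1D random walk;
    every walker starts at x = 0 (a delta-function initial condition)."""
    positions = []
    for _ in range(n_particles):
        x = 0.0
        for _ in range(n_steps):
            x += step if random.random() < 0.5 else -step
        positions.append(x)
    return positions

random.seed(0)
pos = random_walk_positions(n_particles=20000, n_steps=100)
# Mean stays near 0; variance grows as n_steps * step^2 (here ~100).
print(statistics.mean(pos), statistics.pvariance(pos))
```

The same simulation with a biased jump probability would instead produce a drifting Gaussian, which is how the full Nernst-Planck flux (diffusion plus migration) arises in the statistical picture.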

Statistical methods will be used for the discussion of the chaotic behavior. The relative cumulative frequency, i.e. the probability, is shown in Figure 7 for four velocity classes, see Table 3. From these data the relative frequency, i.e. the probability density, is obtained, see Figure 8. It turns out that the frequency distribution is completely non-Gaussian; the range characterizing the statistical dispersion increases with the relative velocity of the impact, while the midrange point and the mean value coincide fairly well, see Table 4. [Pg.145]
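The step from relative cumulative frequency to probability density described above amounts to differencing the cumulative curve over each class interval. A minimal sketch, where the bin edges and cumulative values are made-up placeholders rather than the data of Table 3:

```python
# Hypothetical velocity-class edges and relative cumulative frequencies.
edges = [0.0, 1.0, 2.0, 3.0, 4.0]
cumulative = [0.0, 0.20, 0.55, 0.85, 1.00]

# Probability density per class = increment of the cumulative
# frequency divided by the class width.
density = [(cumulative[i + 1] - cumulative[i]) / (edges[i + 1] - edges[i])
           for i in range(len(edges) - 1)]
```

The densities integrate back to one over the full range, a quick consistency check for any tabulated distribution of this kind.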

If the probability distribution of the data is known, or assumed, to be Gaussian, several statistical measures are available for interpreting the data. These measures can be used to interpret the latent variables determined by a selected data analysis method. Those described here are a combination of statistical measures and graphical analysis. Taken together, they provide an assessment of the statistical significance of the analysis. [Pg.55]
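One of the simplest such measures under a Gaussian assumption is the standardized score. A small illustrative sketch (the helper name is ours; the 1.96 cutoff corresponds to the central 95% of a normal distribution):

```python
import statistics

def z_scores(values):
    """Standardize values to zero mean and unit standard deviation;
    under a Gaussian assumption, |z| > 1.96 flags a value outside
    the central 95% of the distribution."""
    mu = statistics.mean(values)
    s = statistics.stdev(values)
    return [(v - mu) / s for v in values]
```

Plotted against theoretical quantiles, such scores also give the graphical (normal probability plot) counterpart that the excerpt alludes to.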

These considerations raise a question: how can we determine the optimal value of n and the coefficients for i < n in (2.54) and (2.56)? Clearly, if the expansion is truncated too early, some terms that contribute importantly to P0(ΔU) will be lost. On the other hand, terms above some threshold carry no information and, instead, only add statistical noise to the probability distribution. One solution to this problem is to use physical intuition [40]. Perhaps a better approach is one based on the maximum likelihood (ML) method, in which we determine the maximum number of terms supported by the provided information. For the expansion in (2.54), calculating the number of Gaussian functions, their mean values and variances using ML is a standard problem solved in many textbooks on Bayesian inference [43]. For the expansion in (2.56), the ML solution for n and the coefficients also exists. Just as in the case of the multistate Gaussian model, this expansion appears to improve the free energy estimates considerably when P0(ΔU) is a broad function. [Pg.65]
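For a single Gaussian term, the ML estimates of the mean and variance have the familiar closed form, and the log-likelihood itself is what a model-selection criterion compares across different numbers of terms. A minimal sketch of that one-Gaussian special case (not the full expansion of (2.54)):

```python
import math

def gaussian_ml_fit(samples):
    """Maximum-likelihood estimates for a single Gaussian:
    the sample mean and the (biased, 1/n) sample variance."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return mu, var

def gaussian_log_likelihood(samples, mu, var):
    """Log-likelihood of the data under N(mu, var); comparing this
    quantity across models with different numbers of terms is the
    basis of ML-style model selection."""
    n = len(samples)
    return (-0.5 * n * math.log(2.0 * math.pi * var)
            - sum((x - mu) ** 2 for x in samples) / (2.0 * var))
```

Adding terms always raises the raw likelihood, which is why, as the excerpt notes, the number of terms must be capped at what the data actually support.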

The same result can be recovered in a more precise manner by using the trial probability method. In fact, let us assume that the statistical state of the chain can be represented as a superposition of chains stretched by random forces which, for reasons of simplicity, will be assumed to be Gaussian. [Pg.293]

The theory of cyclization in condensation polymerization was first investigated by Kuhn in the 1930s with the introduction of the concept of effective concentration (Ceff), which is the local concentration of the two chain ends of the same molecule for a Gaussian chain. This measure provides a relationship between the end-to-end length of a polymer chain and that same chain's propensity to cyclize. Therefore, Ceff provides a method of quantifying the propensity for intramolecular interactions and cyclization, and Kuhn predicted that the cyclization probability decreases as N^(-3/2), where N is the number of bonds in the chain. Several other treatments have addressed the calculation of Ceff as a function of chain length using either random-flight statistics or a particle-in-a-sphere approximation. ... [Pg.599]
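The Gaussian-chain form of Ceff follows from evaluating the end-to-end distribution at zero separation, W(0) = (3/2π⟨r²⟩)^(3/2) with ⟨r²⟩ = Nb². A sketch of that estimate, in which the units and the example bond length are our illustrative choices:

```python
import math

AVOGADRO = 6.022e23  # mol^-1

def c_eff_molar(n_bonds, bond_length_nm):
    """Effective concentration (mol/L) of one chain end about the other
    for a Gaussian chain: W(0) = (3 / (2*pi*<r^2>))^(3/2) chains/nm^3,
    with <r^2> = N * b^2, converted to molar units (1 L = 1e24 nm^3)."""
    mean_sq_end_to_end = n_bonds * bond_length_nm ** 2
    w0 = (3.0 / (2.0 * math.pi * mean_sq_end_to_end)) ** 1.5  # nm^-3
    return w0 * 1e24 / AVOGADRO

# Quadrupling the chain length lowers Ceff by a factor 4^(3/2) = 8,
# i.e. the N^(-3/2) scaling of the cyclization probability.
ratio = c_eff_molar(400, 0.15) / c_eff_molar(100, 0.15)
```

The ratio makes the scaling explicit without depending on the (assumed) bond length, which cancels.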

The probability density functions cannot be stored point by point because they depend on many (d) variables. Therefore, several parametric classification methods assume Gaussian distributions, and the estimated parameters of these distributions are used to specify a decision function. Another assumption of parametric classifiers is that the pattern features are statistically independent. [Pg.78]
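Taken together, the two assumptions (Gaussian class densities and statistically independent features) yield a decision function that stores only d means and d variances per class, never the density point by point. A self-contained sketch with invented example data and equal class priors assumed:

```python
import math

def fit_class(samples):
    """Per-feature Gaussian parameters (mean, variance) under the
    feature-independence assumption."""
    n, d = len(samples), len(samples[0])
    mu = [sum(s[j] for s in samples) / n for j in range(d)]
    var = [sum((s[j] - mu[j]) ** 2 for s in samples) / n for j in range(d)]
    return mu, var

def log_score(x, mu, var):
    """Log-density of x under independent per-feature Gaussians."""
    return sum(-0.5 * math.log(2.0 * math.pi * v) - (xi - m) ** 2 / (2.0 * v)
               for xi, m, v in zip(x, mu, var))

def classify(x, models):
    """Decision function: assign x to the class with highest log-density
    (equal prior probabilities assumed for simplicity)."""
    return max(models, key=lambda c: log_score(x, *models[c]))

models = {
    "A": fit_class([(0.0, 0.0), (1.0, 1.0), (0.0, 1.0), (1.0, 0.0)]),
    "B": fit_class([(5.0, 5.0), (6.0, 6.0), (5.0, 6.0), (6.0, 5.0)]),
}
```

Dropping the independence assumption would require a full d-by-d covariance matrix per class, which is the usual next refinement.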

The Andersen thermostat is very simple. After each time step δt, each monomer experiences a random collision with a fictitious heat-bath particle with a collision probability p_coll = νδt, where ν is the collision frequency. If the collisions are assumed to be uncorrelated events, the collision probability at any time t is Poissonian, p_coll(ν, t) = ν exp(−νt). In the event of a collision, each component of the velocity of the hit particle is changed according to the Maxwell-Boltzmann distribution p(v_i) = √(m/2πk_BT) exp(−mv_i²/2k_BT) (i = 1, 2, 3). The width of this Gaussian distribution is determined by the canonical temperature. Each monomer behaves like a Brownian particle under the influence of the forces exerted on it by other particles and external fields. In the limit t → ∞, the phase-space trajectory will have covered the complete accessible phase space, which is sampled in accordance with Boltzmann statistics. Andersen dynamics resembles the Markovian dynamics described in the context of Monte Carlo methods and, in fact, from a statistical mechanics point of view, it reminds us of the Metropolis Monte Carlo method. [Pg.135]
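The collision step described above fits in a few lines. The sketch below uses reduced units with illustrative parameter values and omits the deterministic force/integration part of the time step:

```python
import math
import random

def andersen_collision_step(velocities, masses, dt, nu, k_T):
    """Andersen-thermostat collision pass: with probability nu*dt each
    particle 'collides' with the heat bath, and every component of its
    velocity is redrawn from the Maxwell-Boltzmann distribution, i.e.
    a Gaussian of width sqrt(k_T / m)."""
    for i, m in enumerate(masses):
        if random.random() < nu * dt:
            sigma = math.sqrt(k_T / m)
            velocities[i] = [random.gauss(0.0, sigma) for _ in range(3)]
    return velocities
```

Repeated application drives the velocities toward the canonical distribution at temperature k_T, with a mean kinetic energy of (3/2)k_T per particle.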

Two classes of probabilistic approach are possible. One class is the direct probability density functional (pdf) approach, often generalising the multivariate Gaussian (normal) distribution. The other class consists of assumptions and rules that define a stochastic process. In this second class of method it is not usually possible to state an explicit pdf for the interpolants; the process must be studied via its sample realisations and their properties. The derivation of standard geostatistical results often appears to be rule based but, as shown in the next few pages, can be derived from an explicit pdf. More research using explicit pdfs could lead to new results and insights into the methods of spatial statistics. [Pg.145]

We now consider probability theory, and its applications in stochastic simulation. First, we define some basic probabilistic concepts, and demonstrate how they may be used to model physical phenomena. Next, we derive some important probability distributions, in particular, the Gaussian (normal) and Poisson distributions. Following this is a treatment of stochastic calculus, with a particular focus upon Brownian dynamics. Monte Carlo methods are then presented, with applications in statistical physics, integration, and global minimization (simulated annealing). Finally, genetic optimization is discussed. This chapter serves as a prelude to the discussion of statistics and parameter estimation, in which the Monte Carlo method will prove highly useful in Bayesian analysis. [Pg.317]
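As a foretaste of the Monte Carlo material mentioned above, the simplest such application, integration by random sampling, fits in a few lines (the integrand and sample count are arbitrary illustrative choices):

```python
import math
import random

def mc_integrate(f, a, b, n=100_000):
    """Plain Monte Carlo estimate of the integral of f over [a, b]:
    (b - a) times the average of f at uniformly sampled points."""
    total = sum(f(random.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

random.seed(1)
estimate = mc_integrate(math.sin, 0.0, math.pi)  # exact value is 2
```

The statistical error of such an estimate shrinks as 1/√n regardless of dimension, which is why the method reappears in the statistical-physics and Bayesian contexts the excerpt lists.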







© 2024 chempedia.info