Big Chemical Encyclopedia


Poisson distribution, computer simulation

The use of the Poisson distribution for this purpose predates the statistical overlap theory of Davis and Giddings (1983), which also utilized this approach, by nine years. Connors' work seems to be largely forgotten because it is based on 2D TLC, which doesn't have the resolving power (i.e., efficiency, or number of theoretical plates) needed for complex bioseparations. However, Martin et al. (1986) offered a more modern and rigorous theoretical approach to this problem, which was further clarified recently (Davis and Blumberg, 2005) with computer simulation techniques. Clearly, the concept and mathematical approach used by Connors were ahead of their time. [Pg.12]
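The statistical overlap theory referenced here predicts how many distinct peaks are actually observed when m components fall at Poisson-random positions in a separation space of peak capacity n_c. A minimal sketch under that assumption (the parameter values below are illustrative, not taken from the source):

```python
import math
import random

def expected_peaks(m, n_c):
    """Statistical-overlap estimate: m components at Poisson-random
    positions in a separation space with peak capacity n_c."""
    alpha = m / n_c                     # saturation factor
    p = m * math.exp(-alpha)            # expected number of observed peaks
    s = m * math.exp(-2 * alpha)        # expected number of singlet peaks
    return p, s

def simulate_peaks(m, n_c, trials=2000, seed=0):
    """Monte Carlo check: component positions uniform on [0, 1);
    neighbors closer than 1/n_c merge into a single observed peak."""
    rng = random.Random(seed)
    x0 = 1.0 / n_c
    total = 0
    for _ in range(trials):
        xs = sorted(rng.random() for _ in range(m))
        peaks = 1
        for left, right in zip(xs, xs[1:]):
            if right - left > x0:
                peaks += 1
        total += peaks
    return total / trials
```

For 50 components and a peak capacity of 200, the closed-form estimate and the simulation agree to within about one percent, which is the kind of cross-check the simulation studies cited above perform at much larger scale.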

Fig. 24.8. Computational simulation analysis of conformational dynamics in the T4 lysozyme enzymatic reaction. (a) Histograms of t_open calculated from a simulated single-molecule conformational change trajectory, assuming multiple consecutive Poisson rate processes representing multiple random walk steps. (b) Two-dimensional joint probability distributions δ(τ_i, τ_i+1) of adjacent pairs of t_open times. The distribution δ(τ_i, τ_i+1) clearly shows the characteristic diagonal feature of a memory effect in t_open, reflecting that a long t_open time tends to be followed by a long one and a short t_open time tends to be followed by a short one...
Burgess et al. used both computer simulations and an algebraic expression based on the Poisson distribution to analyze the number of beads required to have confidence that every intended compound is present in the library. For typical library sizes, if the number of beads is an order of magnitude greater than the total number of compounds in the library, every compound should be present on at least one bead. Zhao et al. used the Pearson statistic to determine the number of beads needed to be confident that either the smallest individual error or the overall relative error in concentration is less than a given threshold. [Pg.96]
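The bead-count argument can be sketched with a Poisson approximation: if B beads each carry one of N library compounds chosen uniformly at random, the expected number of missing compounds is about N·exp(−B/N), so the probability of a complete library is roughly exp(−N·e^(−B/N)). A hedged sketch (the 10× rule of thumb from the text corresponds to B = 10N; this is an illustration of the statistics, not the cited algebraic expression itself):

```python
import math
import random

def coverage_probability(n_compounds, n_beads):
    """Poisson approximation to P(every compound appears on >= 1 bead)
    when each bead receives a uniformly random compound."""
    expected_missing = n_compounds * math.exp(-n_beads / n_compounds)
    return math.exp(-expected_missing)

def simulate_coverage(n_compounds, n_beads, trials=200, seed=1):
    """Monte Carlo estimate of the same probability."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        seen = {rng.randrange(n_compounds) for _ in range(n_beads)}
        if len(seen) == n_compounds:
            hits += 1
    return hits / trials
```

For a library of 1000 compounds and 10,000 beads the approximation gives about 0.96, consistent with the order-of-magnitude rule quoted above.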

Figure 3.11 Termination rate coefficient versus chain length for the Smoluchowski model (a=0.5, b=0.6) with Poisson distribution and thermal background initiation (no transfer to monomer); variation of [I]0. Symbols are the results of the computer simulations, while the...
The first assumptions that should be mentioned are those previously used in the time-resolved method. It is again assumed that the radical monodispersity hypothesis holds and that the simple relation between chain length and time (equation 3.27) is valid. The Poisson distribution and chain transfer to monomer are thus ignored in this kinetic analysis. Again, computer simulations will be used to test the validity of these assumptions. [Pg.95]
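A rough illustration of why neglecting the Poisson distribution can be tolerable for long chains: under ideal growth the chain length at time t is Poisson-distributed with mean λ = k_p[M]t, and the relative width of a Poisson distribution is 1/√λ, which shrinks as chains grow. A sketch with illustrative (not source) parameter values:

```python
import math

def poisson_relative_width(kp, monomer_conc, t):
    """Mean chain length lam = kp*[M]*t under ideal growth, and the
    relative width 1/sqrt(lam) of the Poisson chain-length distribution.
    The monodispersity assumption i ~ lam is good when the width is small.
    Parameter values passed in are purely illustrative."""
    lam = kp * monomer_conc * t
    return lam, 1.0 / math.sqrt(lam)
```

For example, a mean chain length of 500 gives a relative width under 5 percent, so treating the radicals as monodisperse introduces only a small error at that stage of growth.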

A more detailed view of the dynamics of a chromatin chain was achieved in a recent Brownian dynamics simulation by Beard and Schlick [65]. As in previous work, the DNA is treated as a segmented elastic chain; however, the nucleosomes are modeled as flat cylinders with the DNA attached to the cylinder surface at the positions known from the crystallographic structure of the nucleosome. Moreover, the electrostatic interactions are treated in a very detailed manner: the charge distribution on the nucleosome core particle is obtained from a solution to the non-linear Poisson-Boltzmann equation in the surrounding solvent, and the total electrostatic energy is computed through the Debye-Hückel approximation over all charges on the nucleosome and the linker DNA. [Pg.414]
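The Debye-Hückel energy summation mentioned above amounts to a pairwise screened-Coulomb sum over all charges. A minimal sketch in reduced units; the Bjerrum-length parameterization and all numerical values here are illustrative assumptions, not taken from Beard and Schlick:

```python
import math

def debye_hueckel_energy(charges, positions, kappa, bjerrum=0.7):
    """Screened-Coulomb (Debye-Hueckel) energy in kT units over all
    charge pairs. bjerrum is the Bjerrum length in nm (~0.7 nm in water
    at room temperature), kappa the inverse Debye length in 1/nm,
    positions in nm. Purely a sketch of the pairwise sum."""
    u = 0.0
    n = len(charges)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(positions[i], positions[j])
            u += bjerrum * charges[i] * charges[j] * math.exp(-kappa * r) / r
    return u
```

With screening switched off (kappa = 0) this reduces to the bare Coulomb sum; increasing kappa attenuates each pair interaction by exp(−κr), which is exactly what makes the approximation cheap compared with re-solving the full Poisson-Boltzmann equation at every dynamics step.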

Although the derivation of Fichthorn and Weinberg only holds for Poisson processes, their method has also been used to simulate TPD spectra. [37] In that work it was assumed that, when Δt computed with equation (57) is small, the rate constants are well approximated over the interval Δt by their values at the start of that interval. This seems plausible, but, as the rate constants increase with time in TPD, equation (57) systematically overestimates Δt, and the peaks in the simulated spectra are shifted to higher temperatures. In general, if the rate constants are time dependent, it may not even be possible to define the expectation value. We have already mentioned the case of cyclic voltammetry, where there is a finite probability that a reaction will not occur at all; the expectation value is then certainly not defined. Even if a reaction will occur sooner or later, the distribution P_rx(t) has to go to zero faster than 1/t as t → ∞ for the expectation value to be defined. Solving equations (48), (52), or (55) does not lead to such problems. [Pg.759]
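The systematic overestimate of Δt described here can be avoided by drawing the waiting time of a non-homogeneous Poisson process exactly: solve ∫ k(s) ds = −ln(u) for Δt numerically instead of freezing k at its value at the start of the interval. A hedged sketch (the bisection-plus-trapezoid scheme below is an illustration, not the method of the cited work):

```python
import math

def draw_waiting_time(rate, t0, u, t_max=1e6, tol=1e-9):
    """Exact waiting-time draw for a non-homogeneous Poisson process:
    solve integral of rate(s) over [t0, t0 + dt] = -ln(u) by bisection,
    with a simple trapezoid quadrature. rate(t) may increase with time,
    as it does during a TPD temperature ramp."""
    target = -math.log(u)

    def integral(dt, steps=2000):
        h = dt / steps
        s = 0.5 * (rate(t0) + rate(t0 + dt))
        for i in range(1, steps):
            s += rate(t0 + i * h)
        return s * h

    lo, hi = 0.0, 1.0
    while integral(hi) < target:        # bracket the root
        hi *= 2.0
        if hi > t_max:
            return t_max                # event effectively never occurs
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        if integral(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For a rate that grows with time, this exact draw is always shorter than the frozen-rate estimate −ln(u)/k(t0), which is precisely the overestimate that shifts simulated TPD peaks to higher temperatures.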

The evolution of the numerical approaches used for solving the PNP equations has paralleled the evolution of computing hardware. The numerical solution of the PNP equations evolved over a couple of decades, beginning with the simulation of extremely simplified structures and progressing to fully three-dimensional models, with the implementation of sophisticated variants of the algorithmic schemes to increase robustness and performance. Even finite-element tetrahedral discretization schemes have been employed successfully to selectively increase the resolution in regions inside the channels. An important aspect of the numerical procedures described is the need for full self-consistency between the force field and the charge distribution in space. It is obtained by coupling a Poisson solver to the Nernst-Planck solver within the iteration scheme described. [Pg.280]
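The self-consistency loop described (a Poisson solver coupled to a charge update) can be illustrated in one dimension. The toy below replaces the Nernst-Planck step with its zero-flux Boltzmann limit and uses Newton linearization, so it is a sketch of the coupling idea rather than a PNP channel solver; units are Debye lengths and kT/e, and all parameter values are illustrative:

```python
import math

def solve_tridiagonal(sub, diag, sup, rhs):
    """Thomas algorithm for a tridiagonal linear system."""
    n = len(rhs)
    c = [0.0] * n
    d = [0.0] * n
    c[0] = sup[0] / diag[0]
    d[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * c[i - 1]
        c[i] = sup[i] / m if i < n - 1 else 0.0
        d[i] = (rhs[i] - sub[i] * d[i - 1]) / m
    x = [0.0] * n
    x[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = d[i] - c[i] * x[i + 1]
    return x

def poisson_boltzmann_1d(phi0=2.0, length=10.0, n=101, iters=30):
    """Newton iteration for the dimensionless 1D Poisson-Boltzmann
    equation phi'' = sinh(phi), phi(0) = phi0, phi(L) = 0: each sweep
    couples a (linearized) Poisson solve to the Boltzmann charge
    distribution, mimicking the self-consistency loop in the text."""
    h = length / (n - 1)
    phi = [phi0 * (1.0 - i / (n - 1)) for i in range(n)]   # initial guess
    for _ in range(iters):
        m = n - 2                                          # interior nodes
        sub = [1.0] * m
        sup = [1.0] * m
        diag = [-(2.0 + h * h * math.cosh(phi[i + 1])) for i in range(m)]
        rhs = [h * h * (math.sinh(phi[i + 1])
                        - math.cosh(phi[i + 1]) * phi[i + 1])
               for i in range(m)]
        rhs[0] -= phi0                                     # left boundary
        phi = [phi0] + solve_tridiagonal(sub, diag, sup, rhs) + [0.0]
    return phi
```

The converged profile decays from the boundary value toward zero over a few Debye lengths and matches the known Gouy-Chapman solution to discretization accuracy, which is the kind of self-consistent field the full three-dimensional solvers compute on channel geometries.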

Bascom and Jensen [67] used an approach similar to that of Drzal and coworkers. Wimolkiatisak et al. [70] found that the fragmentation length data fitted the Gaussian and Weibull distributions equally well. Fraser et al. [71] developed a computer model to simulate the stochastic fracture process and, together with the shear-lag analysis, described the shear transmission across the interface. Netravali et al. [72] used a Monte Carlo simulation of a Poisson-Weibull model for the fiber strength and flaw occurrence to calculate an effective interfacial shear strength τ using the relationship ... [Pg.624]
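A Poisson-Weibull treatment of this kind combines weakest-link (Weibull) strength scaling with a Kelly-Tyson-style estimate of the interfacial shear strength from the saturation fragment length. A hedged sketch; the parameter values and the l_c = 4/3 × (mean fragment length) convention are common assumptions in fragmentation analysis, not details taken from Netravali et al.:

```python
import math
import random

def weibull_strength(sigma0, m, length, l0=1.0, rng=random):
    """One weakest-link Weibull strength draw for a fiber segment:
    the scale shrinks as (l0/length)**(1/m) for longer segments."""
    return (sigma0 * (l0 / length) ** (1.0 / m)
            * (-math.log(rng.random())) ** (1.0 / m))

def kelly_tyson_tau(mean_fragment_len, diameter, sigma0, m, l0=1.0):
    """Kelly-Tyson estimate of the effective interfacial shear strength:
    critical length l_c = 4/3 * mean fragment length at saturation,
    sigma_f = mean Weibull strength at l_c, tau = sigma_f * d / (2 * l_c)."""
    lc = 4.0 / 3.0 * mean_fragment_len
    sigma_f = sigma0 * (l0 / lc) ** (1.0 / m) * math.gamma(1.0 + 1.0 / m)
    return sigma_f * diameter / (2.0 * lc)
```

A Monte Carlo loop over weibull_strength draws reproduces the mean strength sigma0·Γ(1 + 1/m) at the reference length, which is the consistency check such stochastic fracture simulations rest on.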

