
Computer simulation history

As we have repeatedly seen in this chapter, proponents of computer simulation in materials science had a good deal of scepticism to overcome, from physicists in particular, in the early days. A striking example of sustained scepticism overcome, at length, by a resolute champion is to be found in the history of CALPHAD, an acronym denoting CALculation of PHAse Diagrams. The decisive champion was an American metallurgist, Larry Kaufman. [Pg.482]

In the second section we present a brief overview of some currently used dynamic modeling methods before introducing cellular automata. After a brief history of this method we describe the ingredients that drive the dynamics exhibited by cellular automata. These include the platform on which cellular automata play out their modeling, the state variables that define the ingredients, and the rules of movement that develop the dynamics. Each step in this section is accompanied by computer simulation programs carried on the CD in the back of the book. [Pg.181]
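
To make those ingredients concrete, here is a minimal sketch, not taken from the book: a square-grid platform, an occupied/empty state variable per site, and a probabilistic movement rule. All constants and names are illustrative assumptions.

```python
import random

# Illustrative constants (assumptions, not values from the book).
SIZE = 20          # edge length of the square grid (the "platform")
N_CELLS = 60       # number of occupied cells (the state variables)
P_MOVE = 0.8       # probability that an occupied cell attempts a move

def step(occupied):
    """Apply the movement rule once: each cell may hop to a free neighbor."""
    for cell in random.sample(list(occupied), len(occupied)):
        if random.random() > P_MOVE:
            continue
        x, y = cell
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        target = ((x + dx) % SIZE, (y + dy) % SIZE)  # toroidal boundaries
        if target not in occupied:                   # exclusion: one cell per site
            occupied.remove(cell)
            occupied.add(target)

# Random initial configuration, then a short dynamical history.
cells = set()
while len(cells) < N_CELLS:
    cells.add((random.randrange(SIZE), random.randrange(SIZE)))
for _ in range(100):
    step(cells)
```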

This is, in essence, the modern synthesis of Darwin and Mendel achieved in the 1930s by Ronald Fisher and J. B. S. Haldane. Based on a series of relatively straightforward equations, it also took the study of evolution out of meticulously observed natural history and located it within a more abstract mathematised theory. Indeed, evolution itself came to be defined not in terms of organisms and populations, but as the rate of change of gene frequencies within any given population. One consequence has been a tendency for theoretical evolutionists to retreat further and further into abstract hypotheticals based on computer simulations, and to withdraw from that patient observation of the natural world which so characterised Darwin's own method. [Pg.283]

When structural and dynamical information about the solvent molecules themselves is not of primary interest, the solute-solvent system may be simplified by modeling the secondary subsystem as an infinite (usually isotropic) medium characterized by the same dielectric constant as the bulk solvent, that is, a dielectric continuum. Theoretical interpretation of chemical reaction rates already has a long history. Until recently, however, only the chemical reactions of systems containing a few atoms in the gas phase could be studied using molecular quantum mechanics, owing to the computational expense. Fortunately, very important advances have been made in the power of computer-simulation techniques for chemical reactions in the condensed phase, accompanied by impressive progress in computer speed (Gonzalez-Lafont et al., 1996). [Pg.286]
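
As a concrete illustration of the continuum idea, not drawn from this excerpt, the classic Born model estimates the electrostatic solvation free energy of an ion of charge q and radius a embedded in a continuum of relative permittivity ε_r:

\[
\Delta G_{\mathrm{solv}} = -\,\frac{q^{2}}{8\pi\varepsilon_{0}a}\left(1 - \frac{1}{\varepsilon_{r}}\right)
\]

The solvent enters only through ε_r, which is exactly the simplification the excerpt describes: all molecular detail of the secondary subsystem is replaced by a single bulk dielectric constant.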

One of the major difficulties in developing theories of the rheology of coagulated or flocculated dispersions is that the microstructures of the aggregates are nonequilibrium structures under shear. Understandably, the rheology of such dispersions is history dependent, as we have seen above, and requires computer simulations and nonequilibrium statistical mechanics for proper study. [Pg.181]

Computer simulations of bimolecular reactions for a system of immobile particles (incorporating their production) have a long history; see, e.g., [18-22]. Computer simulation was first used as a test of analytical methods in reaction kinetics by Zhdanov [23, 24] for d = 3. Although his simulations were performed only up to rather small reaction depths, Γ₀ < 1, it was established that, of all the empirical equations presented for the tunnelling recombination kinetics, those of the linear approximation, (4.1.42) or (4.1.43), turned out to be the most accurate (note that equations (5.1.14) to (5.1.16) of the complete superposition approximation were not considered). On the other hand, irrespective of the initial reactant densities and the space dimension d, for reaction depths Γ > Γ₀ his theoretical curves deviate from the computer-simulated ones by 10%. The accuracy of the superposition approximation in the d = 3 case was first questioned by Kuzovkov [25]; it was also... [Pg.256]

Wood, W. W., Early history of computer simulations in statistical mechanics. In International School of Physics Enrico Fermi, volume XCVII, pp. 3-14. Bologna: Soc. Italiana di Fisica (1986). [Pg.227]

Design of field projects using surfactants selected in Step 7 and a combination of laboratory floods at reservoir conditions, computer simulators, and reservoir history matching. [Pg.12]

As the external field strength decreases, the fall transients approach the equilibrium autocorrelation function. The de-excitation effect is therefore clearly a function of the external field first applied to the sample and then removed; that is, the fall-transient behavior from the computer simulation depends on the history of the field application. [Pg.221]

Because of the extreme dependence on initial conditions, our history analysis concentrates on an air mass with relatively well-defined concentrations at the beginning and the end of its travel. Given the initial values, we see the concentrations unfold as the air parcel moves through the computer simulation procedure. Because of the sensitivities discovered, the transition of the oxidant species O3 and NO2 proceeds better than one might expect. The previously adopted biases on the NO flux and the propylene oxidation rates were confirmed in this run, whose conditions differed from those at Huntington Park represented by the 1968 data. [Pg.163]

Computer simulations of many-body systems have nearly as long a history as modern computers [1]. Along with the rapid development of computer technology, molecular computer simulations, and particularly the classical Molecular Dynamics (MD) methods, which treat atoms and molecules as classical particles, have developed over the last three decades into an important discipline for obtaining information about thermodynamics, structure, and dynamical properties of condensed matter, from pure simple liquids to complex biomolecular systems in solution [2]... [Pg.97]
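
As a hedged illustration of the classical-particle picture behind MD, not code from the cited works, the sketch below integrates a handful of Lennard-Jones atoms with the velocity-Verlet scheme in reduced units; all parameters are invented for illustration.

```python
import numpy as np

# Illustrative setup: N atoms on a cubic lattice, reduced LJ units.
N, DT, STEPS = 16, 0.005, 200
side = int(np.ceil(N ** (1.0 / 3.0)))
lattice = [(i, j, k) for i in range(side) for j in range(side) for k in range(side)]
pos = 1.2 * np.array(lattice[:N], dtype=float)   # lattice spacing 1.2 sigma avoids overlaps
rng = np.random.default_rng(0)
vel = rng.normal(0.0, 0.5, size=(N, 3))
vel -= vel.mean(axis=0)                          # remove center-of-mass drift

def forces(pos):
    """Pairwise Lennard-Jones forces (epsilon = sigma = 1)."""
    f = np.zeros_like(pos)
    for i in range(N):
        for j in range(i + 1, N):
            rij = pos[i] - pos[j]
            r2 = rij @ rij
            inv6 = 1.0 / r2 ** 3
            # F_ij = 24 (2 r^-12 - r^-6) r_vec / r^2
            fij = 24.0 * (2.0 * inv6 * inv6 - inv6) / r2 * rij
            f[i] += fij
            f[j] -= fij
    return f

f = forces(pos)
for _ in range(STEPS):        # velocity-Verlet integration
    vel += 0.5 * DT * f       # half kick
    pos += DT * vel           # drift
    f = forces(pos)
    vel += 0.5 * DT * f       # half kick
```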

As was evidenced above, the attempts to derive the zone profiles by solution of the appropriate differential equations meet severe difficulties. In such situations, it seems reasonable to trace the influence of various variables, as well as of the experimental conditions, by computer simulation of the histories of single molecular entities migrating down the column. [Pg.100]
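
A minimal sketch of such a single-entity simulation, with hypothetical transfer probabilities rather than values from any real column: each molecule alternates stochastically between the mobile phase, in which it advances, and the stationary phase, in which it waits, and pooling many such migration histories approximates the zone profile.

```python
import random

# Hypothetical parameters for a single-molecule migration history.
COLUMN_LENGTH = 1000   # column length in arbitrary units
P_ADSORB = 0.3         # per-step probability of transfer to the stationary phase
P_DESORB = 0.1         # per-step probability of release to the mobile phase

def elution_time():
    """Number of time steps one molecule needs to traverse the column."""
    position, mobile, t = 0.0, True, 0
    while position < COLUMN_LENGTH:
        t += 1
        if mobile:
            position += 1.0                 # carried one unit by the mobile phase
            if random.random() < P_ADSORB:
                mobile = False
        elif random.random() < P_DESORB:
            mobile = True
    return t

# Many independent histories approximate the eluted zone profile.
times = [elution_time() for _ in range(5000)]
mean = sum(times) / len(times)
var = sum((t - mean) ** 2 for t in times) / len(times)
print(f"mean elution time {mean:.0f}, variance {var:.0f}")
```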

The history of the phase rule is the subject of a review written for its centenary [4]; that history need not concern our discussion here, since it covers the applications of the rule and the controversies over its derivation. Here we concern ourselves with the deviations from, and apparent violations of, the phase rule, a topic essentially untouched until computer simulations of small systems suggested that the phase rule might, after all, not be so universal. [Pg.222]

The least emphasized interest, but the one that must be adequately explored before the full significance of the more popular (structural) applications can be properly assessed, concerns the investigation by molecular dynamics of the kinetically determined, hence history-dependent, nature of the glass transition itself. It is through this irreversible process that the configurationally frozen material—to which the word glass pertains—is obtained, and in laboratory studies it is well known that the structure and properties of the material obtained depend on the precise manner in which it was formed. Two questions need to be answered here before the usefulness of computer simulation methods to the study of the glassy state of common experience is properly established. [Pg.399]

Fig. 4 (a) Mean bond force in a freely jointed long-chain molecule with N = 10 bonds of equilibrium length r₀ and fixed end-to-end displacement R. (b) Typical time history of the bond force f(t) as determined by computer simulation with N = 10 at room temperature. The mean bond force and its root-mean-square deviations are shown as solid and dashed horizontal lines, respectively. Adapted with permission from [63]... [Pg.7]

To conclude, historical simulation with all its variants is another method that relies on past data to predict the future. It has problems coping with complex instruments, with instruments that have no history, and with cases where the number of observations is limited. In the next section we look at a method that uses numerous computer simulations to overcome this: the Monte Carlo simulation. [Pg.794]
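
For orientation only, since the book treats Monte Carlo simulation in its next section, a bare-bones sketch of the idea follows: instead of replaying a limited price history, draw a large number of returns from an assumed distribution. The normal return model and every parameter here are invented for illustration.

```python
import numpy as np

# Hypothetical single-asset portfolio: simulate many one-day returns from an
# assumed distribution instead of relying on a limited price history.
rng = np.random.default_rng(42)
position = 1_000_000.0      # portfolio value (assumed)
mu, sigma = 0.0002, 0.012   # assumed daily drift and volatility

pnl = position * rng.normal(mu, sigma, size=100_000)   # simulated one-day P&L
var_99 = -np.percentile(pnl, 1)                        # 99% one-day value-at-risk
print(f"99% 1-day VaR: {var_99:,.0f}")
```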

In addition to the two methods already described for finding the MRT (most responsive temperature) point in a distillation column, there are three further methods that can be used without a computer simulation, the first two of which are sketched in code below. First, if there are temperature points in the column that are monitored, look at the history record to see which temperature point moves the most. Second, find the temperature point in the column that is nearest to being halfway between the top temperature and the bottom temperature. Third, look in the area of the column where the vapor concentration would be about 50% light key component. For example, if the feed contains 10% light key component, there is a high probability that the MRT would be above the feed point. Similarly, if the feed contains 80% light key, there is a high probability that the MRT would be below the feed point. [Pg.29]
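
The first two heuristics are straightforward to automate. In the sketch below, the tray-indexed data layout and function names are assumptions, not anything specified in the text.

```python
def mrt_by_history(temps_by_tray):
    """Method 1: the tray whose monitored temperature moved most over the record.

    temps_by_tray maps a tray number to its list of historical readings.
    """
    return max(temps_by_tray,
               key=lambda tray: max(temps_by_tray[tray]) - min(temps_by_tray[tray]))

def mrt_by_midpoint(profile):
    """Method 2: the tray whose temperature is nearest the top/bottom average.

    profile maps a tray number to its current temperature; the lowest- and
    highest-numbered trays are taken as the column top and bottom.
    """
    trays = sorted(profile)
    midpoint = (profile[trays[0]] + profile[trays[-1]]) / 2.0
    return min(trays, key=lambda tray: abs(profile[tray] - midpoint))

# Example with made-up data: tray 3 swings most, tray 2 sits near the midpoint.
history = {1: [80.0, 80.5], 2: [95.0, 96.0], 3: [110.0, 118.0], 4: [130.0, 130.4]}
print(mrt_by_history(history))                              # -> 3
print(mrt_by_midpoint({t: v[-1] for t, v in history.items()}))
```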

W. W. Wood, Early History of Computer Simulation in Statistical Mechanics, in Proceedings of the Enrico Fermi Summer School, Varenna, 1986, p. 3. [Pg.65]

