Big Chemical Encyclopedia


Supercomputer simulation time

To integrate the equations of motion in a stable and reliable way, the fundamental time step must be shorter than the shortest relevant timescale in the problem. The fastest events involving whole atoms are C-H vibrations, so a typical time step is 2 fs (2 × 10⁻¹⁵ s). This means that up to a million time steps are needed to reach (real-time) simulation times in the nanosecond range. The ns range is sufficient for conformational transitions of the lipid molecules and also allows some lateral diffusion of molecules in the box. Since each iteration step is rather expensive, even a supercomputer needs of the order of 10⁶ s (a week) of CPU time to reach the ns domain. [Pg.39]
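The step-count arithmetic in this excerpt can be checked with a short back-of-the-envelope script; the per-step CPU cost used below is an assumed illustrative value, not a figure from the source.

```python
# Back-of-the-envelope check of the MD time-step arithmetic.
timestep_s = 2e-15   # 2 fs, set by the fastest C-H vibrations
target_s = 1e-9      # 1 ns of simulated real time

n_steps = target_s / timestep_s
print(f"steps to reach 1 ns: {n_steps:,.0f}")  # 500,000 steps

# Assumed illustrative cost per step; the true value depends on
# system size, force field, and the hardware of the era.
seconds_per_step = 2.0
cpu_seconds = n_steps * seconds_per_step
print(f"CPU time: {cpu_seconds:.0e} s")  # of the order of 10^6 s
```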

The Need for Supercomputers in Time-Dependent Polymer Simulations... [Pg.138]

However, despite the great importance of quantum mechanical potentials from the purely theoretical point of view, simple effective two-body potential functions for water seem at present to be preferable for extensive simulations of complex aqueous systems of geochemical interest. The very promising and powerful method of Car-Parrinello ab initio molecular dynamics, which completely eliminates the need for a potential interaction model in MD simulations (e.g., Fois et al. 1994; Tuckerman et al. 1995, 1997), still remains computationally extremely demanding and limited to relatively small systems (N < 100 and a total simulation time of a few picoseconds), which also presently limits its application to complex geochemical fluids. On the other hand, it may soon become a method of choice if the current exponential growth of supercomputing power continues in the near future. [Pg.95]

The work described in this paper is an illustration of the potential to be derived from the availability of supercomputers for research in chemistry. The domain of application is the area of new materials which are expected to play a critical role in the future development of molecular electronic and optical devices for information storage and communication. Theoretical simulations of the type presented here lead to detailed understanding of the electronic structure and properties of these systems, information which at times is hard to extract from experimental data or from more approximate theoretical methods. It is clear that the methods of quantum chemistry have reached a point where they constitute tools of semi-quantitative accuracy and have predictive value. Further developments for quantitative accuracy are needed. They involve the application of methods describing electron correlation effects to large molecular systems. The need for supercomputer power to achieve this goal is even more acute. [Pg.160]

Large-scale numerical simulation of samples many times as large as the critical wavelength is perhaps the only way to develop a quantitative understanding of the dynamics of solidification systems. Even for shallow cells, such calculations will be costly because of the fine discretizations needed to ensure that the dynamics associated with the small capillary length scales are adequately approximated. Such calculations may be feasible with the next generation of supercomputers. [Pg.329]

How well has Dill's prediction held up? In 2000, the first microsecond-long molecular dynamics simulation of protein folding was reported. It required 750,000 node-hours (the product of the number of wall-clock hours and the number of processors) of computer time on a Cray T3 supercomputer. According to Dill's prediction, this length of simulation was not to be expected until around 2010. However, as noted above, Dill's analysis does not take large-scale parallelization into account, which (unless the computation is communications-limited) effectively increases the speed of a computation in proportion to the number of processors available. [Pg.81]
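The node-hour bookkeeping in this excerpt can be illustrated with a small sketch; the processor counts below are hypothetical, chosen only to show how wall-clock time scales under near-ideal parallelization.

```python
# Node-hours = wall-clock hours x processors. Under near-ideal scaling,
# the same 750,000 node-hour job finishes faster on more processors.
node_hours = 750_000

def wall_clock_hours(node_hours, processors):
    """Hypothetical helper: wall time assuming ideal parallel speedup."""
    return node_hours / processors

for procs in (256, 1024, 4096):  # illustrative processor counts
    h = wall_clock_hours(node_hours, procs)
    print(f"{procs:5d} processors: {h:8.1f} h ({h / 24:.0f} days)")
```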

Time scales for various motions within biopolymers (upper) and nonbiological polymers (lower). The year scale at the bottom shows estimates of when each such process might become accessible to brute-force molecular simulation on supercomputers, assuming that parallel processing capability on supercomputers increases by about a factor of 1,000 every 10 years (i.e., one order of magnitude more than Moore's law) and neglecting new approaches or breakthroughs. Reprinted with permission from H. S. Chan and K. A. Dill, Physics Today 46(2), 24 (1993). [Pg.81]
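The caption's growth assumption (a factor of 1,000 in parallel capability every 10 years) translates into a simple feasibility estimate. The function below is a hypothetical sketch of that extrapolation, not a formula from the source.

```python
import math

def years_until_feasible(cost_ratio, growth_per_decade=1000.0):
    """Years until capability grows by cost_ratio, assuming the
    caption's ~1000x-per-decade growth in parallel processing."""
    return 10.0 * math.log(cost_ratio) / math.log(growth_per_decade)

# A process 10^6 times beyond today's brute-force limit becomes
# reachable roughly two decades out under this assumption.
print(round(years_until_feasible(1e6)))  # 20
```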

Stocks, Wang, and their colleagues used an extremely powerful computer made by Cray, Inc., called an XT3, to simulate the behavior of a few thousand atoms in the material. This computer, located at the Pittsburgh Supercomputing Center, has 2,048 processors to speed up operations; it is so fast that it is called a supercomputer. Even so, a simulation of 14,400 atoms, including all their important interactions, required 50 hours of computer time. (Since the computer was shared with other users, not all the processors were devoted to this simulation.) [Pg.23]

There has been phenomenal growth of interest in theoretical simulations over the past decade. The concomitant advances in computing power and software development have changed the way computational chemistry research is undertaken. No longer is it the exclusive realm of specialized theoreticians and supercomputers; rather, computational chemistry is now accessible via user-friendly programs on moderately priced workstations. State-of-the-art calculations on the fastest, massively parallel machines are continually enlarging the scope of what is possible with these methods. These reasons, coupled with the continuing importance of solid acid catalysis within the world's petrochemical and petroleum industries, make it timely to review recent work on the theoretical study of zeolite catalysis. [Pg.1]

We acknowledge financial support from the National Science Foundation (CHE-0111629 and CHE-9800184) and the R. A. Welch Foundation (A-0648 and A-0924). We also thank the Supercomputing Facility at Texas A&M University for computer time and the Laboratory for Molecular Simulation at Texas A&M University for computer time and software. [Pg.24]

These two techniques have several features in common. Accurate results can be expected, provided that the simulation runs are long enough and that the number of molecules is large enough. In practice, the results are limited by the speed and storage capacity of current supercomputers. Typically, the number of molecules in the simulated sample can range up to a few thousand or tens of thousands for small molecules; the real time simulated in MD is of the order of a nanosecond. [Pg.132]

