Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Computers, large-memory

For large CI calculations, the full Hamiltonian matrix is not formed and stored in the computer's memory or on disk; rather, direct CI methods [ ] identify and compute the non-zero matrix elements and immediately add up their contributions to the sum Σ_j H_ij C_j. Iterative methods [ ], in which approximate values for the C_j coefficients are refined through sequential application of the Hamiltonian to the preceding estimate of the vector, are employed to solve... [Pg.2177]
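The accumulate-on-the-fly idea can be sketched with a toy symmetric matrix (purely illustrative, not a real CI Hamiltonian): only the non-zero elements are stored, the sum Σ_j H_ij c_j is built directly from them without ever forming the matrix, and a shifted power iteration stands in for the iterative refinement of the coefficient vector.

```python
# Matrix-free "direct" diagonalization sketch. A toy symmetric
# "Hamiltonian" is stored only as its non-zero elements (i, j, value);
# sigma() accumulates s_i = sum_j H_ij * c_j on the fly.
nonzero = [(0, 0, -2.0), (1, 1, -1.0), (2, 2, 0.5),
           (0, 1, 0.3), (1, 2, 0.2)]

def sigma(c):
    """Accumulate s_i = sum_j H_ij c_j without forming H."""
    s = [0.0] * len(c)
    for i, j, h in nonzero:
        s[i] += h * c[j]
        if i != j:               # symmetry: H_ji = H_ij
            s[j] += h * c[i]
    return s

def lowest_eigenpair(n, shift=10.0, iters=200):
    """Power iteration on (shift*I - H): the dominant eigenvector of
    the shifted operator is the lowest eigenvector of H."""
    c = [1.0] * n
    for _ in range(iters):
        s = sigma(c)
        c = [shift * ci - si for ci, si in zip(c, s)]
        norm = sum(x * x for x in c) ** 0.5
        c = [x / norm for x in c]
    s = sigma(c)
    e = sum(ci * si for ci, si in zip(c, s))   # Rayleigh quotient
    return e, c

e, c = lowest_eigenpair(3)
```

Production direct-CI codes use far better iterative schemes (Davidson-type subspace methods), but the memory pattern is the same: the only large object ever touched is the coefficient vector.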

Importantly for direct dynamics calculations, analytic gradients for MCSCF methods [124-126] are available in many standard quantum chemistry packages. This is a big advantage, as numerical gradients require many evaluations of the wave function. The evaluation of the non-Hellmann-Feynman forces is the major effort, and requires the solution of what are termed the coupled-perturbed MCSCF (CP-MCSCF) equations. The large memory requirements of these equations can be bypassed if a direct method is used [233]. Modern computer architectures and codes then make the evaluation of first and second derivatives relatively straightforward in this theoretical framework. [Pg.301]

The only ingredient in the proof of Life's universality that we have not yet discussed is memory storage. While a finite memory is fairly easy to implement with wires and logic gates - for example, glider-stream-encoded information can be made to circulate around a memory circuit contained within the computer - the construction of an arbitrarily large memory requires a bit more work,... [Pg.149]
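As a concrete illustration of the substrate such a memory is built on, here is a minimal update rule for Life (the standard B3/S23 rule) acting on a sparse set of live cells; the coordinates below form a standard glider, which translates one cell diagonally every four generations.

```python
# Sparse Conway's Life step (standard B3/S23 rule). Live cells are a
# set of (x, y) coordinates; everything else is dead.
from collections import Counter

def step(live):
    """One generation: a cell is alive next step iff it has exactly 3
    live neighbours, or is alive now and has exactly 2."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

# A standard glider; its period is 4, displacement (+1, +1).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
g = glider
for _ in range(4):
    g = step(g)
```

The sparse-set representation is what makes "arbitrarily large" memory natural here: storage grows with the number of live cells, not with the area of the grid.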

All this requires a high-speed computer with a large memory storage capacity and RAM (something not possible until recently). Image analysis software... [Pg.236]

Compared to other methods (molecular mechanics, semiempirical calculations, density functional calculations - Chapters 3, 6 and 7, respectively) ab initio calculations are slow, and they are relatively demanding of computer resources (memory and disk space, depending on the program and the particular calculation). These disadvantages, which increase with the level of the calculation, have been to a very large extent overcome by the tremendous increase in computer power, accompanied by decreases in price, that have taken place since the invention of electronic computers. In 1959 Coulson doubted the possibility (he also questioned the desirability, but in this regard visualization has been of enormous help) of calculations on molecules with more than 20 electrons, but 30 years later computer speed had increased by a factor of 100,000 [329], and ab initio calculations on molecules with 100 electrons (about 15 heavy atoms) are common. [Pg.372]

MS-Windows is an improved operating system for more powerful personal computers that have a large memory (4 to 8 MB) and disk (200 to 400 MB). It provides a very good Graphical User Interface (GUI), which simplifies use of the computer. It also allows multiple programs to be stored in memory and executed simultaneously. [Pg.52]

Further improvement in S/N had to await the development of faster computer microprocessors, which is exactly what happened during the 1980s. Armed with very fast and efficient microcomputers with large memories, chemists discovered they could now generate NMR signals in an entirely new way. [Pg.32]

This arrangement saves computer main memory, allowing the generation of synthon precursors/successors of large structures. All information transferred between modules is stored in external files. [Pg.161]

Given a flexible computer program and a machine with a large memory-storage capacity, what can one do? In the ideal case one would like to use an internally consistent force field which embodies all the following features ... [Pg.6]

Several methods have been published to simulate the time evolution of an ionization track in water. Monte Carlo programs (using the IRT method or a step-by-step approach) and deterministic programs including spur diffusion are the main approaches. With the large memories and powerful computers now available, simulation has become more efficient. The modeling of track structure and reactivity is increasingly precise, and these concepts can now be embedded in complex simulation programs. Accordingly, corrections of rate constants for high concentrations of solutes in the tracks, together with the concept of multiple ionizations, have improved the calculation of G-values and their time dependence. [Pg.247]
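The step-by-step propagation can be sketched as a plain random walk (a toy model, not any of the published track codes; the diffusion coefficient below is merely a typical aqueous value in m²/s): after n steps of duration dt, the mean square displacement of the ensemble should approach 6·D·t in three dimensions.

```python
# Step-by-step Monte Carlo sketch of species diffusing from a spur
# centre. Each time step adds a Gaussian displacement with per-axis
# standard deviation sqrt(2*D*dt); the ensemble mean square
# displacement is then compared against the Einstein relation 6*D*t.
import random

def diffuse(n_particles=2000, D=2.3e-9, dt=1e-12, n_steps=100, seed=1):
    """Return the ensemble mean square displacement after n_steps."""
    rng = random.Random(seed)
    s = (2 * D * dt) ** 0.5          # per-axis step std deviation
    msd = 0.0
    for _ in range(n_particles):
        x = y = z = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, s)
            y += rng.gauss(0.0, s)
            z += rng.gauss(0.0, s)
        msd += x * x + y * y + z * z
    return msd / n_particles

msd = diffuse()
expected = 6 * 2.3e-9 * 1e-12 * 100   # 6 * D * t for t = n_steps * dt
```

Real track codes add the chemistry on top of this transport step: at each iteration nearby species are tested for reaction, which is where the corrected rate constants mentioned above enter.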

With the development of solid-state detectors, relative x-ray intensities, such as Kβ/Kα ratios, have been measured and compiled in tabulated or graphical form [2-4]. However, these values are still treated as an atomic property and compared with theoretical calculations for free atoms [5]. This is because the calculation of x-ray emission rates in molecules requires multi-center integration over molecular wave functions. Such calculations are tedious and demand a great deal of computation time and a large memory capacity. [Pg.298]
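Why multi-center integrals are costly can be illustrated with the simplest possible case: the two-center overlap of 1s orbitals (ζ = 1, atomic units), estimated here by brute-force uniform Monte Carlo sampling (a sketch, not a production quadrature scheme). For this case the analytic result S = e^(-R)(1 + R + R²/3) is known, so the estimate can be checked.

```python
# Brute-force Monte Carlo estimate of the two-center overlap integral
# S = ∫ psi_A(r) psi_B(r) d3r for normalized 1s orbitals psi(r) =
# (1/sqrt(pi)) * exp(-r) with nuclei at (0,0,0) and (R,0,0).
import math
import random

def overlap_mc(R=2.0, n=200000, seed=7):
    """Uniform sampling over a box enclosing both decaying orbitals."""
    rng = random.Random(seed)
    x_lo, x_hi = -8.0, 8.0 + R            # box: tails beyond are negligible
    V = (x_hi - x_lo) * 16.0 * 16.0       # y, z each span [-8, 8]
    total = 0.0
    for _ in range(n):
        x = rng.uniform(x_lo, x_hi)
        y = rng.uniform(-8.0, 8.0)
        z = rng.uniform(-8.0, 8.0)
        rA = (x * x + y * y + z * z) ** 0.5
        rB = ((x - R) ** 2 + y * y + z * z) ** 0.5
        total += math.exp(-rA - rB) / math.pi
    return total / n * V

S_mc = overlap_mc()
S_exact = math.exp(-2.0) * (1 + 2.0 + 4.0 / 3)   # e^-R (1 + R + R^2/3)
```

Even this one integral needs hundreds of thousands of samples for modest accuracy; a molecular emission-rate calculation involves very many such multi-center integrals, hence the computation-time and memory demands noted above.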

The experimental study of the chemical effect has been greatly promoted by the recent development of high-energy-resolution x-ray spectrometers. On the other hand, theoretical calculations of x-ray emission rates for molecules have become possible through the use of modern high-speed, large-memory computers. X-ray spectroscopy of molecules is now a powerful tool for studying molecular electronic structures. [Pg.298]

The heart of the monitor can be a personal computer: widely available, inexpensive, and with relatively large memory and computational ability. Although relatively slow, its speed far exceeds the demands of most analyses. Its software will control automatic sample collection; sample manipulation and transport to the measuring chamber; measurement; collection and correlation of other needed environmental data; real-time display; and data storage. [Pg.63]

The computer algorithms that carry out the discrete Fourier transform calculation work most efficiently if the number of data points (np) is an integral power of 2. Generally, for basic ¹H and ¹³C spectra, at least 16,384 (referred to as "16K") data points and 32,768 ("32K") points should be collected for full ¹H and ¹³C spectral windows, respectively. With today's higher-field instruments and large-memory computers, data sets of 64K for ¹H and 64-128K for ¹³C and other nuclei are now commonly used. [Pg.41]
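The power-of-two preference comes from radix-2 FFT algorithms, which split the transform recursively into halves; a data set of arbitrary length np is therefore typically zero-filled up to the next power of 2 before transforming. A minimal sketch (the "fid" below is synthetic, not spectrometer data):

```python
# Zero-fill a time-domain signal to the next power of two, then apply
# a recursive radix-2 Cooley-Tukey FFT.
import cmath

def next_pow2(n):
    """Smallest power of 2 that is >= n."""
    p = 1
    while p < n:
        p *= 2
    return p

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of 2."""
    n = len(x)
    if n == 1:
        return x
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

fid = [complex(i % 3, 0) for i in range(1000)]     # synthetic, np = 1000
n = next_pow2(len(fid))                            # zero-fill target: 1024
spectrum = fft(fid + [0j] * (n - len(fid)))
```

Zero-filling costs nothing spectroscopically (it only interpolates the frequency-domain points), which is why acquisition sizes like 16K, 32K, and 64K are quoted as powers of 2.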


See other pages where Computers, large-memory is mentioned: [Pg.265]    [Pg.265]    [Pg.94]    [Pg.8]    [Pg.545]    [Pg.415]    [Pg.128]    [Pg.16]    [Pg.533]    [Pg.259]    [Pg.159]    [Pg.370]    [Pg.395]    [Pg.180]    [Pg.198]    [Pg.69]    [Pg.27]    [Pg.628]    [Pg.328]    [Pg.473]    [Pg.36]    [Pg.69]    [Pg.545]    [Pg.2298]    [Pg.696]    [Pg.545]    [Pg.46]    [Pg.164]    [Pg.944]    [Pg.27]    [Pg.27]    [Pg.163]    [Pg.51]    [Pg.175]   
See also in sourсe #XX -- [ Pg.113 ]





© 2024 chempedia.info