Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Memory intensity

Whether, to what extent, and in which areas optical mass storage would replace magnetic systems (disk, tape) was controversially discussed in the 1980s. In spite of all predictions of an imminent substitution, as of late 1994 magnetic hard disks are still the system of choice for computer-dedicated mass storage due to their speed (access time, transfer rate), physical size, and energy consumption; this is especially true when memory-intensive applications are run which use the hard disk as virtual memory. [Pg.164]

Eggels and Somers (1995) used an LB scheme for simulating species transport in a cavity flow. Such an LB scheme, however, is more memory intensive than an FV formulation of the convective-diffusion equation: in the LB discretization, typically 18 single-precision concentrations (associated with the 18 velocity directions of the usual lattice) need to be stored, while in the FV formulation just 2 or 3 (double-precision) variables are needed. Scalar species transport is therefore better simulated with an FV solver. [Pg.176]
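The per-node memory gap can be estimated directly. A minimal sketch, using the storage figures quoted above (18 single-precision LB concentrations versus 3 double-precision FV variables; the 3-variable case is the upper end of the "2 or 3" stated in the text):

```python
# Per-node storage: lattice-Boltzmann (LB) scheme vs. a finite-volume
# (FV) solver of the convective-diffusion equation, per the text above.
lb_bytes = 18 * 4   # 18 single-precision (4-byte) concentrations
fv_bytes = 3 * 8    # 3 double-precision (8-byte) variables
print(lb_bytes, fv_bytes, lb_bytes / fv_bytes)  # 72 24 3.0
```

So even against the larger FV variant, the LB scheme needs roughly three times the memory per grid node.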

Although the use of a pseudopotential formalism helps significantly in reducing the size of a plane wave basis set, typical expansions still comprise impressive numbers of 10,000-100,000 plane wave coefficients. All of these have to be propagated simultaneously during the dynamics; this makes AIMD approaches highly memory intensive. [Pg.14]
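A rough memory estimate illustrates the point. The coefficient counts come from the text; the orbital count (100) and the 16-byte complex-double storage per coefficient are illustrative assumptions, not figures from the source:

```python
# Rough wavefunction memory footprint for plane-wave AIMD.
# Assumptions (not from the source): 100 occupied orbitals,
# complex double-precision coefficients (16 bytes each).
n_orbitals = 100
for n_pw in (10_000, 100_000):   # coefficient range quoted in the text
    mb = n_pw * n_orbitals * 16 / 1e6
    print(f"{n_pw} coefficients/orbital -> {mb:.0f} MB")
```

Even this modest system lands in the tens to hundreds of megabytes for the wavefunction alone, all of which must stay resident during the dynamics.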

As already mentioned, two-dimensional simulations can be very CPU and memory intensive, so techniques which can improve the efficiency (i.e. accuracy for a given number of nodes) are important. Using higher-order Taylor expansions to represent derivatives is a possibility, but the only area where a significant improvement in accuracy may be achieved is in the evaluation of the flux when calculating the current (Britz, 1988; Gavaghan, 1997). [Pg.95]

Another 3D demonstration that used adhesive bonding included In-Au microbumps for interwafer interconnection and was reported by Lee et al. in 2000 [45]. A large shared cache memory was stacked above a processor to enable memory-intensive applications such as multiple processor computing, using 1.5-μm gate-length technology and doped polysilicon. [Pg.439]

Note: The ICMs are high-memory-use programs. Because of their memory-intensive nature, there have been intermittent problems (on about 5% of Windows computers) with the modules. You can usually solve the problem by trying the ICM on a different computer. In the Heatfx 2 ICM, only the first three reactors can be solved, and users cannot continue on to part 2 because of a bug currently in the program. [Pg.1044]

The major problem with exact diagonalization methods is the exponential increase in the dimensionality of the Hilbert space with the system size. Thus, the study of larger systems becomes not only CPU intensive but also memory intensive, as the number of nonzero elements of the matrix also increases with system size. With the increasing power of computers, slightly larger problems have been solved every few years. To illustrate this trend, we consider the case... [Pg.135]
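The exponential wall is easy to make concrete. A sketch for the common case of a lattice of N spin-1/2 sites (a specific model chosen here for illustration; the excerpt speaks of system size generically), where the Hilbert-space dimension is 2**N:

```python
# Hilbert-space dimension for N spin-1/2 sites grows as 2**N --
# the exponential wall facing exact diagonalization.
for n in (10, 20, 30, 40):
    print(n, 2**n)
```

Doubling the system size from 20 to 40 sites multiplies the dimension by about a million, which is why hardware advances buy only a few extra sites per generation.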

The resolution of a SIMION model is limited by memory constraints. The accuracy of ion trajectory simulations is highly dependent on the spatial resolution of the PA: naturally, higher-resolution models provide better approximations of smooth electrode surfaces and hence a better description of the electric field. Unfortunately, high-resolution models are memory-intensive: each point in the PA requires 8-10 bytes of dynamic memory. SIMION v. 8.0 has an upper limit of 2×10⁸ points, which corresponds to ca. 1.8 GB of RAM. Thus, the maximum cubic PA allowed is approximately 580×580×580 points. However, in order to run efficiently simulations of dynamic... [Pg.265]
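The figures quoted above are consistent with simple arithmetic. A sketch (taking 9 bytes/point as a midpoint of the stated 8-10 byte range):

```python
# Memory budget behind the SIMION v.8.0 potential-array (PA) limit:
# 2e8 points at ~9 bytes/point, and the largest cubic PA that fits.
max_points = 2 * 10**8
print(max_points * 9 / 1e9)          # 1.8  (GB, at 9 bytes/point)
print(580**3, 580**3 <= max_points)  # 195112000 True
```

A 580-point cube uses about 1.95×10⁸ points, just under the 2×10⁸ ceiling, which matches the "approximately 580×580×580" figure in the excerpt.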

Most statistical clustering methods are memory intensive and become simply unwieldy if the data set is too large. Also, some methods rely on assumptions (normal distribution, etc.) about the data in forming clusters. So, if your data set is large, or if it does not meet the necessary assumptions, you may be better off using an ANN. [Pg.70]

This is a memory-intensive event counter and, as such, often holds the smallest amount of information, at least with respect to total duration of monitoring. In a time-based event counter, events are logged as to pacing state and rate with respect to time. When retrieved and displayed, they identify pacemaker behavior relatively precisely with respect to time. This facilitates an assessment of the pacemaker's response to specific activities, and even to symptoms if there is a way of storing the pacing system behavior associated with specific events... [Pg.671]

Gives information on self-similarity (periodic structure) and similarity between two waveforms, respectively. Computationally intensive if long samples are used. Also memory intensive (Kondoz, 2004). [Pg.89]

This is the version of X2C as it has been implemented in NWChem [14]. Due to the RKB condition, for simplicity the code currently requires a fully uncontracted basis set. It has been demonstrated that local decoupling schemes are suitable, whereby an atomic and nearest-neighbor partitioning is employed in order to render the construction of the X2C Hamiltonian matrix less CPU and memory intensive [53,85,86]. [Pg.315]









© 2024 chempedia.info