
Computer reliability

The success of primal-dual interior-point methods is due to their ability to compute reliable and highly precise solutions within a guaranteed time framework, although their computational cost can become prohibitively expensive for large-scale SDP problems. [Pg.115]

It is easy to appreciate why electronic structure calculations accurate enough to compute reliably the geometries, spectra, and relative energies of RIs, and to predict their reactivities, would be a very important adjunct to experiment. Fortunately, during the last three decades of the twentieth century, advances in both computer hardware and software began to make it possible to perform electronic structure calculations on a wide variety of RIs with the required accuracy. [Pg.962]

The measured electronic structure, occupied or unoccupied, provides the fullest information when also combined with theory. Electronic structure calculations in surface chemistry have advanced immensely in the past decades and have now reached a level of accuracy and predictive power that makes them a very strong complement to experiment. Indeed, the type of theoretical modeling that will be employed and presented here can be likened to computer experiments: if spectra can be computed reliably, then computed spectra for different models of surface adsorption can be used to determine which structural model is most likely. In the present chapter, we will thus consistently use the interplay between experiment and theory in our analysis of the interaction between adsorbate and substrate. Before discussing what quantities are of interest to compute in the analysis of the surface chemical bond, we will briefly discuss and justify our choice of Density Functional Theory (DFT) as the approach to spectrum and chemisorption calculations. [Pg.61]

The thermochemistry of molecules is of major importance in the chemical sciences and is essential to many technologies. Recent advances in theoretical methodology, computer algorithms, and computer power now allow us to compute reliable thermochemical data for a fairly wide variety of molecules. In this chapter we have reviewed the current state of the art in computational thermochemistry. [Pg.201]

Shooman, M. (1973). Operational testing and software reliability estimation during program development. Proc. Int. Symp. Computer Reliability, pp. 51-57. [Pg.30]

Process data can be correlated or de-trended using a variety of functions. A computationally reliable approach is the use of orthogonal functions. A least-squares fit of the data with a simple reduced-order function can provide valuable information about process performance in terms of the dominant contributions to variability. [Pg.258]
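As a minimal sketch of this idea, the snippet below de-trends a synthetic process signal by least-squares fitting a low-order orthogonal (Legendre) basis with NumPy; the data, basis order, and variable names are illustrative assumptions, not taken from the source.

```python
# De-trending a synthetic process signal with an orthogonal (Legendre) basis.
# Data, basis order, and names are illustrative, not from the source text.
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)
t = np.linspace(-1.0, 1.0, 200)            # abscissa scaled to [-1, 1]
signal = 2.0 + 1.5 * t - 0.8 * t**2 + 0.1 * rng.standard_normal(t.size)

# Least-squares fit with a low-order Legendre series; the orthogonality of the
# basis keeps the normal equations well conditioned (computationally reliable).
coeffs = legendre.legfit(t, signal, deg=2)
trend = legendre.legval(t, coeffs)
residual = signal - trend                  # variability left after the dominant trend

print("Legendre coefficients:", np.round(coeffs, 3))
print("residual std:", round(float(residual.std()), 4))
```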

It is not necessary to explain the significance of being able to compute reliable transition probabilities. These quantities are at the heart of spectroscopy. [Pg.55]

In the case where the PES can be computed reliably only with expensive calculations, it may not be possible to run many trajectories. One is thus forced to assume that the most interesting part of the PES is close to the MEP and start the dynamics at the FC structure with no initial kinetic energy at all. In the literature such a strategy is sometimes called reaction path dynamics. It is also possible to start the dynamics with a geometry that is slightly distorted. [Pg.100]

As noted above, the non-ground state properties of these materials (e.g. the optical band gap and electrical conductivity) are both difficult to compute reliably and of primary importance to our understanding of these systems. These properties remain an area of much current research. [Pg.216]

These two facts, namely weak binding and resonances with extremely small widths, constitute serious tests for the theory and the methods of locating resonances, and of identifying and computing reliably the main... [Pg.219]

Finally, needless to add, I point out that the same CESE approach can be applied to the calculation of the details of the resonance spectra corresponding to various thresholds of He, Li+, etc. In fact, such calculations would be much simpler than the ones in H-, since the presence of the Coulomb attraction would reduce the conceptual and computational requirements. Indeed, except for very high-lying DESs (see Sections 9.4 and 9.5), a percentage of the resonances in He, Li+, etc., below the low-lying thresholds can also be computed reliably by methods such as the CCR and standard basis sets. [Pg.221]

With the case for computer use made, some cautions are in order. Computers are still bulkier than simple pencil-and-paper checklists. Computer reliability is not perfect, so inadvertent data loss is still a real possibility. Finally, software and hardware date much more rapidly than hard copy, so results safely stored on the latest media may be unreadable 10 years later. How many of us can still read punched cards or eight-inch floppy disks? In contrast, hard-copy records are still available from before the start of the computer era. [Pg.1137]

Answer by Author: A criterion for NPSH computation reliability is its repeatability when using a wealth of data. Three-thermocouple redundancy per measurement and the multiplicity of test runs pinpointed the temperatures selected. Then, by using equivalent vapor pressures, and by judiciously establishing pump inlet head losses, no erroneously negative values of NPSH were obtained. Since the related suction specific speed values were neither imaginary nor infinite, and because no hypothetical approaches were needed to calculate their real values, the NPSH values were deemed reliable. [Pg.528]
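The following sketch illustrates the kind of computation described in the answer, using the standard US-unit definitions of available NPSH and suction specific speed (Nss = N sqrt(Q) / NPSH^0.75); all numbers and helper names are illustrative assumptions, and a non-positive NPSH is flagged as the "imaginary or infinite" case mentioned above.

```python
# Sketch of the NPSH bookkeeping described above, with standard US-unit
# definitions. All numbers and helper names are illustrative assumptions.
import math

def npsh_available(p_suction_psia, p_vapor_psia, inlet_head_loss_ft, sg):
    """NPSH (ft): pressure head above vapor pressure, less inlet head losses."""
    head_ft = (p_suction_psia - p_vapor_psia) * 144.0 / (62.4 * sg)
    return head_ft - inlet_head_loss_ft

def suction_specific_speed(rpm, flow_gpm, npsh_ft):
    """Nss = N * sqrt(Q) / NPSH**0.75; real and finite only if NPSH > 0."""
    if npsh_ft <= 0.0:
        raise ValueError("non-positive NPSH gives an imaginary/infinite Nss")
    return rpm * math.sqrt(flow_gpm) / npsh_ft ** 0.75

npsh = npsh_available(p_suction_psia=25.0, p_vapor_psia=14.2,
                      inlet_head_loss_ft=2.0, sg=1.0)
nss = suction_specific_speed(rpm=3560, flow_gpm=500.0, npsh_ft=npsh)
print(f"NPSH = {npsh:.1f} ft, Nss = {nss:.0f}")
```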

AboElFotoh, H.M.F., Iyengar, S.S., and Chakrabarty, K. (2005). Computing reliability and message delay for cooperative wireless distributed sensor networks subject to random failures. IEEE Transactions on Reliability, vol. 54, no. 1, pp. 145-155. [Pg.1568]

The X-29 test vehicle demonstrated that an aircraft could be built to be statically unstable and yet maintain stable flight. The flight control computer reliably provides rapid updates of control surface actuators to compensate for disturbances before they amplify. Modern fighter aircraft are marginally unstable but use control systems to augment stability, thereby increasing maneuver performance. [Pg.12]

Examples of probabilistic response analysis using the mean-centred First-Order Second-Moment (FOSM) approximation, time-invariant (First- and Second-Order Reliability Methods, FORM and SORM) and time-variant (mean outcrossing rate computation) reliability analyses are provided to illustrate the methodology presented and its current capabilities and limitations. [Pg.22]
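As a rough illustration of the mean-centred FOSM approximation mentioned above, the sketch below linearizes a placeholder response function at the mean of the input variables, so that mean(g) is approximated by g(mu) and var(g) by grad' Sigma grad; the limit-state function and input statistics are invented for illustration only.

```python
# Mean-centred FOSM: linearize the response at the mean point, so that
# mean(g) ~ g(mu) and var(g) ~ grad(mu)' Sigma grad(mu).
# The response function and input statistics are placeholders.
import numpy as np

def fosm(g, mu, cov, h=1e-6):
    mu = np.asarray(mu, dtype=float)
    grad = np.empty_like(mu)
    for i in range(mu.size):               # central finite-difference gradient
        e = np.zeros_like(mu)
        e[i] = h
        grad[i] = (g(mu + e) - g(mu - e)) / (2.0 * h)
    return g(mu), grad @ np.asarray(cov, dtype=float) @ grad

g = lambda x: x[0] * x[1] - 2.0            # placeholder limit-state function
mean_g, var_g = fosm(g, mu=[3.0, 1.0], cov=[[0.04, 0.0], [0.0, 0.01]])
beta = mean_g / np.sqrt(var_g)             # second-moment reliability index
print(f"mean = {mean_g:.3f}, std = {np.sqrt(var_g):.3f}, beta = {beta:.2f}")
```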

In the case of the hydrogenation of 26 the transition state for the oxidative addition of H2 in the unsaturated pathway was computed to be 7.2 kcal/mol less stable than the transition state of the double bond coordination. This value only slightly decreases at 173 K, hence the unsaturated pathway is not interfering with the dihydride route in this case at any temperature. Therefore, the optical yield of the catalytic hydrogenation of 2a catalyzed by 5 is determined solely by the value of AG, which was computed to be virtually the same at 298 K and at 173 K. Although the AG = P in this case is notably higher than computed AG, the computations reliably reproduce R-enantioselective reaction with the optical yield of about 96% ee which is not affected by the temperature changes (see also below). [Pg.45]
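For readers who want the quantitative link between the transition-state free-energy gap and the optical yield, the sketch below applies the standard Boltzmann/transition-state relation ee = tanh(ddG / 2RT); this is a generic textbook relation, not the authors' computation, and the numbers are chosen only to match the roughly 96% ee figure quoted above.

```python
# Standard transition-state relation between the free-energy gap of the two
# diastereomeric transition states and the optical yield:
#   k_R / k_S = exp(ddG / RT)  =>  ee = tanh(ddG / (2 R T)).
# A generic textbook relation, not the authors' computation.
import math

R = 1.987e-3  # gas constant, kcal/(mol K)

def ee_from_ddg(ddg_kcal_mol, temp_k):
    return math.tanh(ddg_kcal_mol / (2.0 * R * temp_k))

def ddg_from_ee(ee, temp_k):
    return 2.0 * R * temp_k * math.atanh(ee)

# ~96% ee at 298 K corresponds to a TS free-energy gap of about 2.3 kcal/mol:
print(f"ddG for 96% ee at 298 K: {ddg_from_ee(0.96, 298.0):.2f} kcal/mol")
print(f"ee at 298 K for ddG = 2.3 kcal/mol: {100 * ee_from_ddg(2.3, 298.0):.1f}%")
```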

This standardization approach consists of transferring the calibration model from the calibration step to the prediction step. This transferred model can be applied to new spectra collected in the prediction step in order to compute reliable predictions. An important remark is that the standardization parameters used to transfer calibration models are exactly the same as the ones used to transfer NIR spectra. Some standardization methods based on transferring spectra yield a set of transfer parameters. For instance, the two-block PLS algorithm yields a transfer matrix, and each new spectrum collected in the prediction step is transferred by simply multiplying it by the transfer matrix. For these standardization methods, the calibration model can be transferred from the calibration step to the prediction step using the same transfer matrix. It should be pointed out that all standardization methods yielding a transfer matrix (direct standardization, PDS, etc.) could be used in order to transfer the model from the calibration to the prediction step. For instrument standardization, the transfer of a calibration model from the master instrument to the slave instruments enables each slave instrument to compute its own predictions without systematically transferring the data back to the master instrument. [Pg.239]
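A minimal sketch of the point made above, that the same transfer matrix can transfer either the spectra or the calibration model itself: below, a direct-standardization transfer matrix F is estimated by least squares from paired master/slave spectra, and a new slave spectrum gives the same prediction whether the spectrum or the regression vector is transferred. All matrices, dimensions, and data are synthetic placeholders.

```python
# Direct-standardization sketch: a transfer matrix F maps slave spectra onto
# the master domain, and the SAME F also transfers the calibration model.
# All matrices, dimensions, and data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_wl = 20, 50
S_master = rng.standard_normal((n_samples, n_wl))        # master spectra
distortion = np.eye(n_wl) + 0.05 * rng.standard_normal((n_wl, n_wl))
S_slave = S_master @ distortion                          # paired slave spectra

# Least-squares estimate of F such that S_slave @ F ~ S_master:
F = np.linalg.pinv(S_slave) @ S_master

b_master = rng.standard_normal(n_wl)       # master regression vector (placeholder)
b_slave = F @ b_master                     # transferred model for the slave

x_new = rng.standard_normal(n_wl)          # new spectrum measured on the slave
y_via_spectrum = (x_new @ F) @ b_master    # route 1: transfer the spectrum
y_via_model = x_new @ b_slave              # route 2: transfer the model
print(np.isclose(y_via_spectrum, y_via_model))           # True: routes coincide
```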

An important feature of the first-order reliability method is the facility to compute reliability sensitivity measures with respect to any set of desired parameters. The simplest such measure is the unit vector α = −∇yG/‖∇yG‖ computed at the linearization point y, which represents the sensitivity of β with respect to variations in the linearization point [1,22], i.e.,... [Pg.88]
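A minimal numerical sketch of this sensitivity measure, assuming a placeholder linear limit-state function in standard normal space and a design point lying on G = 0; the gradient is taken by finite differences, and for a linear G the reliability index satisfies β = α·y*.

```python
# Sensitivity vector alpha = -grad_y G / ||grad_y G|| at the linearization
# (design) point y*. The limit-state G and the point y_star are placeholders:
# a linear G in standard normal space, with y_star chosen on G = 0.
import numpy as np

def grad_fd(G, y, h=1e-6):
    y = np.asarray(y, dtype=float)
    g = np.empty_like(y)
    for i in range(y.size):                # central finite differences
        e = np.zeros_like(y)
        e[i] = h
        g[i] = (G(y + e) - G(y - e)) / (2.0 * h)
    return g

G = lambda y: 3.0 - y[0] - 0.5 * y[1]      # placeholder limit-state function
y_star = np.array([2.4, 1.2])              # design point: G(y_star) = 0

grad = grad_fd(G, y_star)
alpha = -grad / np.linalg.norm(grad)       # unit sensitivity vector
beta = alpha @ y_star                      # for linear G, beta = alpha . y*
print("alpha =", np.round(alpha, 3), "beta =", round(float(beta), 3))
```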

It will become evident in later sections that the nature of the weak noncovalent interactions in a cluster dictates which computational methods will produce accurate results. In particular, it is far more difficult to compute reliable properties for weakly bound clusters in which dispersion is the dominant attractive component of the interaction. For example, Hartree-Fock supermolecule computations are able to provide qualitatively correct data for hydrogen-bonded systems like (H2O)2 even with very small basis sets, but this approach does not even bind Ne2. What is the origin of this inconsistency? Dispersion is the dominant attractive force in rare-gas clusters, while the electrostatic component tends to be the most important attractive contribution near the equilibrium structure of (H2O)2. As London's work demonstrated, dispersion interactions are inherently an electron correlation problem and, consequently, cannot be described by Hartree-Fock computations. To this day, dispersion interactions continue to pose a significant challenge in the field of computational chemistry, particularly those involving systems of delocalized π electrons. [Pg.45]

However, according to Harrington and Zimm (40, 42), it is difficult to compute reliable degradation kinetics because of the process complexity. [Pg.46]

In this paper we propose to use the Probabilistic Continuous Constraints framework to deal with reliability assessment problems. Given its grounding in continuous constraint solving, this framework computes safe bounds for the reliability of series and parallel systems, in contrast to classical approaches. The various kinds of approximations used by those approaches may render the computed reliability value of little practical use, since they provide no bounds on the errors incurred. This is particularly significant in systems modeled by means of nonlinear constraints. [Pg.2276]
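To make the notion of safe bounds concrete, the sketch below encloses the reliability of series and parallel systems when each component reliability is known only as an interval; because both formulas are monotone in every component reliability, endpoint evaluation yields guaranteed bounds. This illustrates the idea of guaranteed enclosures only; it is not the paper's constraint-based framework.

```python
# Safe bounds for series/parallel reliability when each component reliability
# is only known as an interval [lo, hi]. Both formulas are monotone increasing
# in every r_i, so evaluating at the interval endpoints yields a guaranteed
# enclosure. An illustration of the idea only, not the paper's framework.
from math import prod

def series_bounds(intervals):
    # R_series = prod(r_i)
    return (prod(lo for lo, _ in intervals),
            prod(hi for _, hi in intervals))

def parallel_bounds(intervals):
    # R_parallel = 1 - prod(1 - r_i)
    return (1.0 - prod(1.0 - lo for lo, _ in intervals),
            1.0 - prod(1.0 - hi for _, hi in intervals))

components = [(0.90, 0.95), (0.85, 0.92), (0.88, 0.97)]
print("series   R in [%.4f, %.4f]" % series_bounds(components))
print("parallel R in [%.4f, %.4f]" % parallel_bounds(components))
```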

