
Information-intensive systems operations

An expert system shell developed in the MYCIN project is EMYCIN, which was used to develop other expert systems. One of these systems is PUFF, designed for the domain of pulmonary disorders. Another outcome was the ventilator manager (VM) program, developed as a collaborative research project between Stanford University and Pacific Medical Center in San Francisco within the scope of a Ph.D. thesis by Lawrence M. Fagan [6]. VM was designed to interpret on-line quantitative data in the intensive care unit. The system measures the patient's heart rate, blood pressure, and the operating status of a mechanical ventilator that assists the patient's breathing. Based on this information, the system controls the ventilator and makes the necessary adjustments. [Pg.175]
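To make the interpretive step concrete, here is a minimal rule-based sketch of the kind of monitoring logic such a system performs; the thresholds, parameter names, and suggested actions are hypothetical illustrations, not the actual VM knowledge base.

```python
# A minimal rule-based sketch of on-line ICU data interpretation of the kind VM
# performs. Thresholds, parameter names, and suggested actions are hypothetical
# illustrations, not the actual VM knowledge base.

def interpret(measurements):
    """Map one set of on-line measurements to qualitative findings and advice."""
    advice = []
    if measurements["heart_rate"] > 120:
        advice.append("tachycardia: review patient status before adjusting the ventilator")
    if measurements["systolic_bp"] < 90:
        advice.append("hypotension: verify the pressure transducer and patient volume status")
    if measurements["vent_mode"] == "assist" and measurements["spontaneous_rate"] < 4:
        advice.append("low spontaneous breathing rate: consider a controlled ventilation mode")
    return advice or ["measurements within expected ranges: no adjustment suggested"]

print(interpret({"heart_rate": 130, "systolic_bp": 85,
                 "vent_mode": "assist", "spontaneous_rate": 3}))
```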

The continuous enrichment monitor (CEMO) monitors the absence of HEU production in selected gas centrifuge facilities and delivers qualitative Go-No-Go information to the inspectorate (Packer 1991; Packer et al. 1997). The CEMO determines the U-235 content from the intensity of the 186 keV peak, using NaI detectors fixed on the product header pipes, and infers the pressure of the gaseous UF6 from a transmission measurement (radioactive X-ray transmission source, Cd-109, approximately 20-50 keV). The enrichment is then calculated from these two parameters. The system operates continuously and transmits remotely (twice a day) state-of-health and alarm messages in the event that LEU is not confirmed or a system fault occurs. [Pg.2931]
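The two-parameter calculation can be sketched as follows; all calibration constants, names, and numerical values here are hypothetical placeholders rather than actual CEMO parameters. The 186 keV count rate gives the amount of U-235 in the pipe, the X-ray transmission gives the amount of UF6 (hence total uranium), and their ratio gives the enrichment.

```python
import math

# Illustrative sketch of the two-parameter enrichment estimate described above.
# All calibration constants and variable names are hypothetical placeholders,
# not actual CEMO parameters.

K_PEAK = 1.0e-3       # hypothetical calibration: 186 keV net count rate -> U-235 areal density (g/cm^2)
MU_UF6 = 0.25         # hypothetical mass attenuation coefficient of UF6 at the transmission X-ray energy (cm^2/g)

def u235_areal_density(net_count_rate_186keV):
    """U-235 areal density (g/cm^2) inferred from the 186 keV peak intensity."""
    return K_PEAK * net_count_rate_186keV

def total_u_areal_density(i_transmitted, i_source):
    """Total uranium areal density (g/cm^2) from the X-ray transmission measurement."""
    uf6_areal_density = -math.log(i_transmitted / i_source) / MU_UF6
    return uf6_areal_density * (238.0 / 352.0)   # mass fraction of U in UF6

def enrichment(net_count_rate_186keV, i_transmitted, i_source):
    """Approximate U-235 enrichment (weight fraction) from the two measurements."""
    return u235_areal_density(net_count_rate_186keV) / total_u_areal_density(i_transmitted, i_source)

# A reading would trigger an alarm if enrichment(...) exceeded the LEU threshold.
print(enrichment(50.0, 0.85, 1.0))
```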

To minimize the time needed to decide on a response in a detection-based defensive strategy, it is necessary to have contingency plans in place for responding appropriately to the alarm situations likely to be encountered. These plans should include an array of options of graduated intensity, keyed to the quality of the information available. They should include emergency changes to the operation of the HVAC system, evacuation of potentially...
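As an illustration only, a graduated plan of this kind might be encoded as a simple lookup from alarm confidence to pre-approved actions; the tiers and actions below are invented examples, not recommendations.

```python
# Hypothetical illustration of graduated response options keyed to the quality
# of the available detection information; the tiers and actions are invented
# examples, not a recommended plan.

CONTINGENCY_PLAN = {
    "unconfirmed_sensor_alarm": ["verify the sensor", "increase sampling rate"],
    "single_confirmed_detection": ["switch HVAC to full outside air or shut down supply",
                                   "isolate the affected zone"],
    "multiple_confirmed_detections": ["initiate evacuation of potentially affected areas",
                                      "notify emergency responders"],
}

def planned_actions(information_quality):
    """Return the pre-planned actions for a given level of alarm confidence."""
    return CONTINGENCY_PLAN.get(information_quality, ["escalate to the incident commander"])

print(planned_actions("single_confirmed_detection"))
```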

The information in this chapter applies specifically to the first element, sample preparation. The sample preparation steps are usually the most tedious and labor-intensive part of an analysis. By automating the sample preparation, a significant improvement in efficiency can be achieved. It is important to make sure that (1) suitable instrument qualification has been concluded successfully before initiation of automated sample preparation validation [2], (2) the operational reliability of the automated workstation is acceptable, (3) the analyte measurement procedure has been optimized (e.g., LC run conditions), and (4) appropriate training in the use of the instrument has been provided to the operator(s). The equipment used to perform automated sample preparation can be purchased as off-the-shelf units that are precustomized, or it can be built by the laboratory in conjunction with a vendor (custom-designed system). Off-the-shelf workstations for fully automated dissolution testing, automated assay, and content uniformity testing are available from a variety of suppliers, such as Zymark (www.zymark.com) and Sotax (www.sotax.com). These workstations are well represented in the pharmaceutical industry and are all based on the same functional requirements and basic principles. [Pg.68]
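The four prerequisites can be captured in a simple readiness checklist to be completed before validation of the automated method begins; the field names below are hypothetical and merely mirror items (1)-(4) above.

```python
# A minimal readiness checklist for the four prerequisites listed above.
# The field names are hypothetical; they simply mirror items (1)-(4) in the text.

from dataclasses import dataclass

@dataclass
class ValidationReadiness:
    instrument_qualification_complete: bool   # (1) instrument qualification concluded successfully
    workstation_reliability_acceptable: bool  # (2) operational reliability of the workstation
    analyte_method_optimized: bool            # (3) e.g., LC run conditions finalized
    operators_trained: bool                   # (4) operators trained on the instrument

    def ready_to_validate(self) -> bool:
        return all((self.instrument_qualification_complete,
                    self.workstation_reliability_acceptable,
                    self.analyte_method_optimized,
                    self.operators_trained))

print(ValidationReadiness(True, True, True, False).ready_to_validate())  # False
```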

LC-ISP-MS has also been successfully applied to the assay of 21 sulfonamides in salmon flesh (121). Separation was achieved in a reversed-phase LC system with gradient elution. In the full-scan acquisition and SIM modes, all analytes displayed simple positive-ion spectra with an intense protonated molecule and no fragment ions of relevant abundance. Further application of tandem MS using SRM for increased sensitivity could overcome the lack of structural information presented by the ISP mass spectra. [Pg.736]

Second, any CIDNP-based assignments concerning the sign of hfcs are valid only if the radical pair mechanism (RPM) [93-96] is operative; they become invalid if the alternative triplet-Overhauser mechanism (TOM), based on electron-nuclear cross-relaxation [97-100], is the source of the observed effects. For effects induced via the TOM the signal directions depend on the mechanism of cross-relaxation, and the polarization intensities are proportional to the square of the hfc. Thus, they do not contain any information related to the signs of the hfcs. However, the TOM requires the precise timing of four consecutive reactions and is therefore not very likely. In fact, this mechanism has been positively established in only two systems [98-100]. [Pg.147]
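The sign-blindness of the TOM can be stated schematically as follows (the symbol a for the hyperfine coupling constant is assumed here, not taken from the source):

```latex
% Schematic statement only: a denotes the hyperfine coupling constant.
% The TOM polarization intensity depends only on a^2, so reversing the sign
% of the coupling leaves the observed effect unchanged:
\[
  I_{\mathrm{TOM}} \propto a^{2}
  \quad\Longrightarrow\quad
  I_{\mathrm{TOM}}(-a) = I_{\mathrm{TOM}}(+a).
\]
```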

The manufacturer provides specifications for the performance of the OMA, including linearity of counts as a function of intensity and the geometric distortion and channel-to-channel crosstalk of the vidicon. However, the user needs methods for verifying the performance of the OMA operating within his or her own system. This information is needed both for designing correction schemes if distortion is found and for determining whether further improvements in the system are needed. Ideally the methods should be simple, so that the testing can be performed on a routine basis. [Pg.324]
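One such routine check, verifying that recorded counts scale linearly with incident intensity, can be sketched as below; the data points, attenuation steps, and acceptance limit are hypothetical illustrations.

```python
# A simple sketch of one routine check mentioned above: verifying that recorded
# counts are linear in incident intensity. The data and the tolerance are
# hypothetical illustrations.

import numpy as np

relative_intensity = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])   # e.g., set with calibrated attenuators
measured_counts    = np.array([1020, 2050, 4090, 6200, 8150, 10180])

slope, intercept = np.polyfit(relative_intensity, measured_counts, 1)
predicted = slope * relative_intensity + intercept
max_deviation = np.max(np.abs(measured_counts - predicted) / predicted)

print(f"max relative deviation from linearity: {max_deviation:.3%}")
if max_deviation > 0.02:          # hypothetical 2% acceptance limit
    print("nonlinearity exceeds tolerance: a correction scheme may be needed")
```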

The study of elementary reactions for a specific requirement such as hydrocarbon oxidation occupies an interesting position in the overall process. At a simplistic level, it could be argued that it lies at one extreme. Once the basic mechanism has been formulated as in Chapter 1, the rate data are measured, evaluated and incorporated in a database (Chapter 3), embedded in numerical models (Chapter 4) and finally used in the study of hydrocarbon oxidation from a range of viewpoints (Chapters 5-7). Such a mode of operation would fail to benefit from what is ideally an intensely cooperative and collaborative activity. Feedback is as central to research as it is to hydrocarbon oxidation: laboratory measurements must be informed by the sensitivity analysis performed on numerical models (Chapter 4), so that the key reactions to be studied in the laboratory can be identified, together with the appropriate conditions. A realistic assessment of the error associated with a particular rate parameter should be supplied to enable the overall uncertainty to be estimated in the simulation of a combustion process. Finally, the model must be validated against data for real systems. Such a validation, especially if combined with sensitivity analysis, provides a test of both the chemical mechanism and the rate parameters on which it is based. Therefore, it is important that laboratory determinations of rate parameters are performed collaboratively with both modelling and validation experiments. [Pg.130]
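The sensitivity analysis referred to here can be illustrated by a brute-force finite-difference estimate of normalized sensitivity coefficients; the toy model, rate constants, and target quantity below are hypothetical placeholders for a real kinetic simulation.

```python
# A minimal sketch of local sensitivity analysis: estimating how a model
# prediction responds to each rate parameter by finite differences.
# The model, rate constants, and target quantity are hypothetical placeholders.

def model_prediction(rate_constants):
    """Stand-in for a kinetic simulation returning a target quantity, e.g. an ignition delay."""
    k1, k2, k3 = rate_constants
    return 1.0 / (0.5 * k1 + 0.3 * k2 + 0.1 * k3)   # toy surrogate, not real chemistry

def normalized_sensitivities(rate_constants, rel_step=0.01):
    """S_j = (k_j / y) * dy/dk_j, estimated by perturbing each k_j by rel_step."""
    base = model_prediction(rate_constants)
    sensitivities = []
    for j, kj in enumerate(rate_constants):
        perturbed = list(rate_constants)
        perturbed[j] = kj * (1.0 + rel_step)
        sensitivities.append((model_prediction(perturbed) - base) / (base * rel_step))
    return sensitivities

# Large-magnitude coefficients flag the rate parameters most worth measuring accurately.
print(normalized_sensitivities([1.0e3, 5.0e2, 2.0e2]))
```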

Data-dependent acquisition (DDA) is a mode of operation where the MS experiment performed in a particular scan is based on the data acquired in a previous scan. In a simple form, a DDA experiment switches the instrument from full-scan MS acquisition to full-scan product-ion MS-MS when the total-ion intensity or a selected-ion intensity exceeds a preset threshold. This avoids the need to perform two consecutive injections for the identification of unknowns in a mixture: first to obtain the m/z values for the intact protonated molecules of the unknowns, and second to acquire the product-ion MS-MS spectra of these unknowns in a time-scheduled procedure, switching between various preselected precursor ions as a function of the chromatographic retention time. DDA was promoted by Thermo Finnigan upon the introduction of the API-ion-trap combinations [44-46]. Similar procedures are available for other commercial ion-trap systems, as well as for triple quadrupoles, e.g., Information Dependent Acquisition (IDA) from Applied Biosystems/MDS Sciex, Data-directed Analysis (DDA) from Waters, and Smart Select from Bruker. [Pg.39]
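The survey-to-MS-MS switching logic can be sketched as follows; the threshold value and the simple function interface are hypothetical and do not represent any vendor's acquisition software.

```python
# A minimal sketch of the DDA switching logic described above: stay in full-scan
# MS until an ion exceeds a threshold, then acquire a product-ion MS-MS scan of
# the most intense precursor. The threshold and interface are hypothetical.

INTENSITY_THRESHOLD = 1.0e5   # hypothetical trigger level

def next_scan(full_scan):
    """Decide the next experiment from the current survey (full-scan MS) spectrum.

    full_scan: list of (m/z, intensity) pairs.
    """
    mz, intensity = max(full_scan, key=lambda peak: peak[1])
    if intensity >= INTENSITY_THRESHOLD:
        return {"mode": "product-ion MS-MS", "precursor_mz": mz}
    return {"mode": "full-scan MS"}

survey = [(445.2, 3.2e4), (612.8, 2.4e5), (733.4, 9.1e3)]
print(next_scan(survey))   # switches to MS-MS on m/z 612.8
```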

These systems are mounted and operated from an unmodified UH-60 Blackhawk helicopter platform and ground vehicles. The LR-BSDS system provides information about cloud configuration (size, shape, and relative intensity) and cloud location (range, width, height, height above ground, and drift rate). A two-operator crew allows for human discrimination between manmade and naturally occurring aerosol clouds. [Pg.177]

One of the first functions supported by laboratory information systems (LISs) was the printing of summary laboratory data for inclusion in the patient's chart. This replaced the labor-intensive and error-prone process of manually writing or typing results on individual slips of paper. Other important intralaboratory functions included assigning accession numbers and tracking specimens and workload. Information system functions have expanded to include virtually every area of laboratory operation. ... [Pg.478]

Since the expectation values of the field operators vanish for the photon-number states, all the information about the state of the system is contained in the intensities of the corresponding fields... [Pg.87]
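For a single mode this statement amounts to the standard relations below (notation assumed here, not reproduced from the source: \(\hat a\) is the annihilation operator and \(|n\rangle\) a photon-number, i.e. Fock, state).

```latex
% Standard single-mode relations for a photon-number (Fock) state |n>:
\[
  \langle n|\hat a|n\rangle = \langle n|\hat a^{\dagger}|n\rangle = 0 ,
  \qquad
  \langle n|\hat a^{\dagger}\hat a|n\rangle = n ,
\]
% so the mean field vanishes and the state is characterized by its intensity.
```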

