Big Chemical Encyclopedia



General Data

Selection of Solubility Data. Solubility values determine the liquid rate necessary for complete or economic solute recovery and so are essential to design. Equilibrium data generally will be found in one of three forms: (1) solubility data expressed either as solubility in weight or mole percent or as Henry's-law coefficients, (2) pure-component vapor pressures, or (3) equilibrium distribution coefficients (K values). Data for specific systems may be found in Sec. 2; additional references to sources of data are presented in this section. [Pg.1351]
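As a minimal sketch of the first data form, here is how a Henry's-law coefficient translates into a mole-fraction solubility; the coefficient shown is an illustrative round figure, not a design datum:

```python
# Minimal sketch: mole-fraction solubility from a Henry's-law
# coefficient H defined by p = H * x (H in atm per unit mole fraction).
def mole_fraction_solubility(partial_pressure_atm, henry_coeff_atm):
    """x = p / H; valid only in the dilute, Henry's-law region."""
    return partial_pressure_atm / henry_coeff_atm

# Illustrative value of H for CO2 in water near 25 C (~1.64e3 atm):
x = mole_fraction_solubility(1.0, 1.64e3)
print(f"x ~ {x:.2e} mole fraction")   # ~ 6.1e-04
```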

This can also be derived for a combined R, L and C circuit to obtain more accurate data. Generally, the figure obtained through equation (18.4) is simpler, quicker and provides almost correct information for the purpose of surge analysis, and is used more in practice. Where, however, more accurate data are necessary, such as for academic interest, the more relevant formulae may be used. [Pg.597]
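Equation (18.4) itself is not reproduced in this excerpt; assuming it is the usual quick L-and-C estimate, a lossless surge-impedance sketch with illustrative overhead-line constants would look like:

```python
import math

def surge_impedance(L_per_m, C_per_m):
    """Lossless-line surge impedance Z0 = sqrt(L/C); neglecting R is
    the usual simplification for quick surge estimates."""
    return math.sqrt(L_per_m / C_per_m)

# Illustrative overhead-line constants: L ~ 1.3 uH/m, C ~ 9 pF/m
print(f"Z0 ~ {surge_impedance(1.3e-6, 9e-12):.0f} ohm")   # ~ 380 ohm
```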

Productivity is addressed under Continuous improvement, and in order to improve productivity you will need to collect data, generally in the form of resource per part produced. Resource can be hours, costs, weight, or volume of material consumed. Graphs showing the productivity trend over time for plants, products, and processes would satisfy this requirement. [Pg.144]
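A minimal sketch of such a trend calculation, with made-up figures, using labour hours per part produced as the resource measure:

```python
# Hypothetical records: (period, resource consumed in hours, parts produced)
records = [
    ("2023-Q1", 5_000, 12_000),
    ("2023-Q2", 5_100, 13_500),
    ("2023-Q3", 5_050, 14_200),
]

# A falling hours-per-part figure over time indicates improving productivity.
for period, hours, parts in records:
    print(f"{period}: {hours / parts:.3f} hours/part")
```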

Because most research effort in the human reliability domain has focused on the quantification of error probabilities, a large number of techniques exist. However, a relatively small number of these techniques have actually been applied in practical risk assessments, and even fewer have been used in the CPI. For this reason, in this section only three techniques will be described in detail. More extensive reviews are available from other sources (e.g., Kirwan et al., 1988; Kirwan, 1990; Meister, 1984). Following a brief description of each technique, a case study will be provided to illustrate the application of the technique in practice. As emphasized in the early part of this chapter, quantification has to be preceded by a rigorous qualitative analysis in order to ensure that all errors with significant consequences are identified. If the qualitative analysis is incomplete, then quantification will be inaccurate. It is also important to be aware of the limitations of the accuracy of the data generally available... [Pg.222]

The calculation of the proton affinities (PA) for a pair of tautomers and the comparison with experimental data [generally from ICR measurements (Section VII,F)] has been the subject of a series of publications with increasing sophistication (Table IV). Such calculations concerning the annular tautomerism of azoles and benzazoles have been reviewed [87AHC(41)187]. [Pg.19]
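As a hedged sketch of the underlying arithmetic (electronic energies only; real PA calculations add zero-point and thermal corrections, and the energy values below are made up):

```python
HARTREE_TO_KCAL = 627.509

def proton_affinity_kcal(e_base_hartree, e_protonated_hartree):
    """Electronic-energy estimate of PA for B + H+ -> BH+:
    PA ~ E(B) - E(BH+), since the bare proton has zero electronic energy."""
    return (e_base_hartree - e_protonated_hartree) * HARTREE_TO_KCAL

# Two hypothetical annular tautomers protonating to the same cation:
pa_t1 = proton_affinity_kcal(-225.000000, -225.360000)   # ~ 225.9 kcal/mol
pa_t2 = proton_affinity_kcal(-224.995000, -225.360000)   # ~ 229.0 kcal/mol
print(pa_t1, pa_t2)   # the more stable tautomer shows the lower PA
```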

Kessler and Wankat [101] have examined several column performance parameters, and for O'Connell's [49] data presented in Figure 8-29 they propose equations that reportedly fit the data generally within about 10% limits... [Pg.44]

This concept has not gained commercial popularity because the proprietary Fractionation Research, Inc. (FRI) data are limited to member organizations, and the public literature does not contain much independent research and application data. General industrial and commercial proprietary designs available are listed in Table 8-12, but the listing may not be all-inclusive... [Pg.122]

The effect of media viscosity on polymerization rates and polymer properties is well known. Analysis of kinetic rate data generally is constrained to a propagation rate constant invariant of media viscosity. The current research develops an experimental design that allows for the evaluation of the viscosity dependence of uncoupled rate constants, including initiation, propagation and macromolecular association. The styrene/toluene/n-butyllithium system is utilized. [Pg.375]

The σ parameter scales of Table V appear to be applicable to substituent effects at the ortho and meta positions. For the latter position, however, the data generally are not capable of discriminating between σ scales (cf. earlier results and discussion). For the ortho position, the great difficulty is in obtaining a data set covering the full range of electronic properties without the incursion of substantial proximity effect (2a) contributions. In a following section are reported some of the results of treatment of ortho data sets by eq. (1). [Pg.58]
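Eq. (1) is not reproduced in this excerpt; correlations of this kind are typically of the Hammett form or, in dual-substituent-parameter treatments, split into inductive and resonance terms (an assumption about the form intended here):

```latex
\log (k/k_0) = \rho\,\sigma
\qquad\text{or}\qquad
\log (k/k_0) = \rho_I \sigma_I + \rho_R \sigma_R
```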

All calculations reported here were made by using the Time Series Modules of the MINITAB Statistical Package, running on a Data General MV8000 computer under the AOS/VS operating system. [Pg.92]

Data General MV/4000 + fpa, AOS/VS f77, optimized
CDC Cyber 205, scalar program, VSOS/FTN-200, implicit vectorization, optimized
CDC Cyber 205, minimal explicit vectorization [Pg.173]

Data processing was accomplished with the aid of a Data General ECLIPSE Minicomputer interfaced to a Data General NOVA 2 Minicomputer which, in turn, was interfaced to the GPCs. [Pg.150]

The HPGPC has been interfaced to a Data General NOVA Model... [Pg.208]

The determination and analysis of sensory properties play an important role in the development of new consumer products. Particularly in the food industry, sensory analysis has become an indispensable tool in research, development, marketing and quality control. The discipline of sensory analysis covers a wide spectrum of subjects: physiology of sensory perception, psychology of human behaviour, flavour chemistry, physics of emulsion break-up and flavour release, testing methodology, consumer research, and statistical data analysis. Not all of these aspects are of direct interest for the chemometrician. In this chapter we will cover a few topics in the analysis of sensory data. General introductory books are e.g. Refs. [1-3]. [Pg.421]

Adsorption-desorption (partly): Mechanisms for adsorption on similar materials will be similar. Soil adsorption data generally do not reflect the saturated conditions of the deep-well environment. Organic-matter content is a major factor affecting adsorption in the near-surface; its significance in the deep-well environment is less clear. Fate studies involving artificial recharge are probably useful, but differences between fresh waters and deep brines may reduce relevance. [Pg.793]

In 1979, the joint center for three international organizations (UNEP, FAO, and WHO) monitoring food product contamination started collecting relevant data. Generalized information covered the period from 1971 onward; it was collected widely, from Australia, New Zealand and Japan to Great Britain, Canada, and the USA [79]. The Soviet Union did not feature on this list... [Pg.76]

The Engineer and Formulator programs were written in FORTRAN 77 on a Data General MV/8000 running under AOS/VS. The Operator program was written in FORTRAN V on a Data General Model 10/SP using RDOS. [Pg.184]

Powder diffraction data, generally similar to those of Table 1, are also given. [Pg.291]

A theoretical model for the adsorption of metals onto clay particles (<0.5 μm) of sodium montmorillonite has been proposed, and experimental data on the adsorption of nickel and zinc have been discussed in terms of fitting the model and comparison with the Gouy-Chapman theory [10]. In clays, two processes occur. The first is a pH-independent process involving cation exchange in the interlayers and electrostatic interactions. The second is a pH-dependent process involving the formation of surface complexes. The data generally fitted the clay model and were seen as an extension of the Gouy-Chapman model from the surface reactivity to the interior of the hydrated clay particle. [Pg.362]
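For reference, a minimal sketch of the Gouy-Chapman relation between surface potential and diffuse-layer charge for a symmetric electrolyte; the parameter values are illustrative only:

```python
import math

# Physical constants (SI)
E_CHARGE = 1.602176634e-19    # elementary charge, C
K_B      = 1.380649e-23       # Boltzmann constant, J/K
EPS0     = 8.8541878128e-12   # vacuum permittivity, F/m
N_A      = 6.02214076e23      # Avogadro constant, 1/mol

def gouy_chapman_charge(psi0_volt, c_mol_per_liter, z=1, eps_r=78.5, T=298.15):
    """Diffuse-layer surface charge density (C/m^2) at surface potential
    psi0 for a symmetric z:z electrolyte (Gouy-Chapman theory)."""
    n0 = 1000.0 * N_A * c_mol_per_liter             # bulk ion density, 1/m^3
    prefactor = math.sqrt(8.0 * eps_r * EPS0 * K_B * T * n0)
    return prefactor * math.sinh(z * E_CHARGE * psi0_volt / (2.0 * K_B * T))

# e.g. a -50 mV clay surface in 0.01 M 1:1 electrolyte:
print(gouy_chapman_charge(-0.050, 0.01))   # ~ -0.013 C/m^2
```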

Provost, A. Allegre, C. J. (1979). Process identification and search for optimal parameters from major-element data. General presentation with emphasis on the fractional crystallization process. Geochim. Cosmochim. Acta, 43, 487-501. [Pg.534]

The required data generally are obtained by administering a measured dose of the candidate compound -- often isotopically labelled -- to the rat or mouse either by injection or per os. The animal is housed in a glass metabolism "cage" where it receives food, water, and clean air, and its urine, feces, and respired gases are collected and examined for the parent chemical and its metabolites. Eventual postmortem tissue analysis and calculation of material balance complete the measurements necessary to satisfy the above purposes of metabolism and pharmacokinetic experiments. While in vitro biochemical studies are important adjuncts, it is also apparent that only experiments with intact, healthy, living animals will suffice to meet EPA criteria. [Pg.218]
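A minimal sketch of the material-balance arithmetic, with made-up recoveries:

```python
# Hypothetical recoveries of a 10 mg labelled dose across compartments:
dose_mg = 10.0
recovered_mg = {"urine": 6.2, "feces": 2.9, "respired gases": 0.3,
                "tissues (postmortem)": 0.4}

total = sum(recovered_mg.values())
print(f"Material balance: {100.0 * total / dose_mg:.1f}% of dose")  # 98.0%
```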

Conceptually, SPMD data fill a gap between exposure assessments based on direct analytical measurement of total residues in water and air, and the analysis of residues present in biomonitoring organisms. SPMDs provide a biomimetic approach (i.e., processes in simple media that mimic more complex biological processes) for determining ambient HOC concentrations, sources, and gradients. Residues accumulated in SPMDs are representative of their environmental bioavailability (see Section 1.1) in water and air, and the encounter-volume rate as defined by Landrum et al. (1994) is expected to be proportional to the uptake rate. SPMD-based estimates of water concentrations can be readily compared to aquatic toxicity data (generally based on dissolved phase concentrations), and SPMD extracts can be used to screen for toxic concentrations of HOCs using bioassays or biomarker tests. [Pg.32]
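A minimal sketch of the usual linear-uptake estimate of water concentration from an SPMD residue, Cw = N / (Rs t); the numbers are illustrative:

```python
def water_concentration_ng_per_L(residue_ng, sampling_rate_L_per_day, days):
    """Ambient water concentration from SPMD residue N, assuming the
    sampler is still in its linear (integrative) uptake phase."""
    return residue_ng / (sampling_rate_L_per_day * days)

# e.g. 150 ng of an HOC accumulated over 28 days at Rs = 4 L/day:
print(water_concentration_ng_per_L(150.0, 4.0, 28.0))   # ~ 1.34 ng/L
```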

Figure 3 provides a sample session with MR3LJOQ ESP. The controlling code is written in Prolog, while numerically oriented, analytical procedures for classification are written in Fortran-77. The system has been written to run under Data General's AOS/VS operating system (MV series computers) but is expected to be easily ported to the Digital Equipment VAX VMS environment. [Pg.342]

I would like to thank Mr. R. H. Rabiner of Data General Corp. for the invaluable help and advice about the computer during the course of these experiments. [Pg.194]

For linear extrapolation, a line is drawn from the POD (from observed data), generally as a default an LED01 (the 95% lower confidence limit on a dose associated with an extra tumor risk) chosen to be representative of the lower end of the observed range, to the origin (zero dose/zero response), corrected for background incidences. This implies a proportional (linear) relationship between risk and dose at low doses (note that the dose-response curve generally is not linear at higher doses). The slope of this line, known as the slope factor, is an upper-bound estimate of risk per increment of dose that can be used to estimate risk probabilities for different exposure levels. The slope factor is equal to 0.01/LED01 if the LED01 is used as the POD. [Pg.309]
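A minimal sketch of the slope-factor arithmetic described above; the LED01 value is hypothetical:

```python
def slope_factor(led01_mg_per_kg_day):
    """Slope of the line from the POD to the origin: 0.01 extra risk
    per LED01, when the LED01 is used as the POD."""
    return 0.01 / led01_mg_per_kg_day

def extra_risk(dose_mg_per_kg_day, sf):
    """Upper-bound extra tumor risk in the low-dose (linear) region."""
    return sf * dose_mg_per_kg_day

sf = slope_factor(2.5)           # hypothetical LED01 = 2.5 mg/kg-day
print(extra_risk(0.001, sf))     # 4e-06 extra risk at 1 ug/kg-day
```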

