Values assessments data sources

The principal tool for performance-based quality assessment is the control chart. In a control chart the results from the analysis of quality assessment samples are plotted in the order in which they are collected, providing a continuous record of the statistical state of the analytical system. Quality assessment data collected over time can be summarized by a mean value and a standard deviation. The fundamental assumption behind the use of a control chart is that quality assessment data will show only random variations around the mean value when the analytical system is in statistical control. When an analytical system moves out of statistical control, the quality assessment data is influenced by additional sources of error, increasing the standard deviation or changing the mean value. [Pg.714]
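
As a minimal sketch of the underlying arithmetic (hypothetical data; Python), the fragment below summarizes quality assessment results by their mean and sample standard deviation and flags any point outside the usual ±3 standard deviation control limits:

```python
import numpy as np

def control_limits(results, n_sigma=3.0):
    """Center line and control limits for a Shewhart-type chart.

    results: quality assessment values in collection order.
    Returns (mean, lower_limit, upper_limit).
    """
    x = np.asarray(results, dtype=float)
    mean = x.mean()
    sd = x.std(ddof=1)  # sample standard deviation
    return mean, mean - n_sigma * sd, mean + n_sigma * sd

def out_of_control(results, n_sigma=3.0):
    """Indices of points falling outside the control limits."""
    mean, lo, hi = control_limits(results, n_sigma)
    x = np.asarray(results, dtype=float)
    return np.where((x < lo) | (x > hi))[0]

# Illustrative data: 20 in-control results plus one drifted value.
rng = np.random.default_rng(0)
data = list(rng.normal(100.0, 2.0, 20)) + [112.0]
print(out_of_control(data))  # flags the final, drifted result
```

In practice the limits would be fixed from data collected while the system was known to be in statistical control, then applied to subsequent results.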

The data in Tables 4.9 to 4.12 are comprehensive estimates of five classes of flavonoids in commonly available foods in the United Kingdom. Moreover, these estimates are derived from critically assessed published sources, and the evaluation procedures adopted ensured the inclusion of content values for edible parts of plant materials available to the UK consumer. A USDA-compiled database (http://www.nal.usda.gov/fnic/foodcomp/Data/Flav/flav.pdf) aimed primarily at the North American diet is also available. These databases contrast with another literature-derived flavonoid database in which data quality was not formally assessed and values determined by semiquantitative methods were also included. [Pg.240]

The expression "Not Pertinent means that the data item either has no real meaning (such as the flash point of a inflammable chemical) or is not required for assessing a hazardous situation. The expression "Data Not Available" means that the information sought was not found in the general data sources consulted during the preparation of this handbook. In a few cases where important data were not available, values were estimated by usually reliable procedures all such values are labeled "(est.) . If more accurate values for those items are found, they will be included in later revisions. [Pg.3]

Reliability assessment (data quality) is based on information in the original source describing the method used (including evidence for calibration of pH meters and exclusion of CO2 in determining pKa values above 6.5), whether pKa values for standard compounds were measured, the presence of organic cosolvents, the presence or absence of corrections for [H+] and [OH-] in potentiometric titrations, and the use of mean ionic activity coefficients in the calculations. Considerable effort has been made to locate the original source for each measured value. Where only secondary sources have been located, data reliability cannot be assessed with confidence. [Pg.50]

The physical chemical parameters of organic compounds are presently available from a variety of sources. These include comprehensive publications and online web sites that compile data for thousands of compounds. Other papers in the literature are restricted to either specific physical chemical parameters and/or chemical classes. Some of these databases are peer reviewed and one can depend on the values cited, while others merely cite references and the investigator has to draw his/her own conclusions. Some recent publications have chosen to summarize all the data reported for, say, the solubility of a compound, and critique the values cited before indicating a preferred value. This approach is most useful to those interested in assessing the environmental behavior of compounds. Examples of these different data sources are given, but this list is by no means complete. [Pg.67]

The rate parameter is related to the expected failure time and can be established from reliability data. The value of the shape parameter can also be established from reliability data, but with stricter requirements on the form of the data, which must include information about component age and preferably cover the whole component history. As this information is often not available in generic reliability data sources such as OREDA or EIReDA, the shape parameter is difficult to assess. In order to do so we need to know something about the maintenance program involving the ageing component of interest, the nature of the possible failure modes and... [Pg.1455]
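
The excerpt breaks off, but its rate/shape vocabulary is characteristic of a Weibull lifetime model. Under that assumption, the following is a minimal sketch of estimating both parameters from lifetime data with scipy; the lifetimes and the interpretation comments are illustrative, not drawn from the source.

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical complete component lifetimes in hours (illustrative only).
lifetimes = np.array([1200.0, 2300.0, 3100.0, 4400.0, 5200.0, 6100.0, 7500.0])

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = weibull_min.fit(lifetimes, floc=0)
print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} h")

# shape > 1 indicates an increasing (ageing) failure rate;
# shape = 1 reduces to the constant-rate exponential model.
```

Note that this sketch assumes complete, non-censored lifetimes with known component ages; generic data sources rarely provide these, which is exactly the difficulty the excerpt describes.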

During the mathematical specification of the simulation model there is no need to worry about the correct parameter values to be used during the Monte Carlo simulation. Of course, this is addressed prior to running the simulations. In principle there are three kinds of sources for parameter values. The ideal source would consist of sufficient statistical data gathered under the various contextual conditions for which the risk assessment has to be performed. In practice such ideal sources almost never exist. Instead one typically has to work with limited statistical data gathered under different conditions. Fortunately there are often two complementary sources: domain expertise and scientific expertise (on safety and human factors). In the context of Monte Carlo... [Pg.60]
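
To make the role of such parameter sources concrete, here is a minimal Monte Carlo sketch in Python. The toy model and every distribution (a lognormal spread standing in for limited data combined with expert judgement) are hypothetical illustrations, not the simulation model of the source.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo trials

# Illustrative input distributions: limited field data might fix the
# median, while domain/scientific expertise sets the spread.
p_fail = rng.lognormal(mean=np.log(1e-3), sigma=0.7, size=N)  # failure prob. per demand
demands = rng.poisson(lam=50, size=N)                         # demands per year

# Toy risk model: expected failures per year in each sampled trial.
failures_per_year = p_fail * demands

print(f"mean = {failures_per_year.mean():.3e} per year")
print(f"95th percentile = {np.percentile(failures_per_year, 95):.3e} per year")
```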

Up to this point, the detection limit and sensitivity comparisons of the different sources have focused primarily on compounds that ionize efficiently with all the techniques. It is important to understand the coverage or scope of an ionization technique across the chemical space of general interest, particularly when confronted with unknown compounds, or compounds whose structures are known but whose ionization properties have not been tested, and there is no time to assess a variety of options. This situation occurs in many drug discovery laboratories measuring in vitro ADME properties (absorption, distribution, metabolism, excretion), where many different chemical species need to be assayed quickly. The data used to generate the relative efficiency values within a source in Tables 1-3 were used to calculate the relative MRM efficiency between the three sources, shown in Table 13.4. The MALDI data were acquired in the most practical fashion to obtain a quantitative measurement, where only a small percentage of the sample spot was ablated with a single raster. The ESI and APCI data were obtained by flow injection analysis at 200 and 1000 µL/min, respectively. Electrospray is the most sensitive ion source in nearly all... [Pg.461]

Analytical information taken from a chromatogram has almost exclusively involved either retention data (retention times, capacity factors, etc.) for peak identification or peak heights and peak areas for quantitative assessment. The width of the peak has rarely been used for analytical purposes, except occasionally to obtain approximate values for peak areas. Nevertheless, as seen from the Rate Theory, the peak width is inversely proportional to the solute diffusivity which, in turn, is a function of the solute molecular weight. It follows that for high molecular weight materials, particularly those that cannot be volatilized in the ionization source of a mass spectrometer, peak width measurement offers an approximate source of molecular weight data for very intractable solutes. [Pg.335]
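
Where the width is used for an approximate area, the usual shortcut assumes a Gaussian profile, for which area ≈ 1.064 × height × width at half height. A small sketch with invented numbers:

```python
import math

def gaussian_peak_area(height, fwhm):
    """Approximate area of a Gaussian peak from its height and
    full width at half maximum: A = 1.064 * h * w_half."""
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    return height * sigma * math.sqrt(2.0 * math.pi)

# Illustrative peak: height 0.85 AU, half-height width 0.12 min.
print(f"{gaussian_peak_area(0.85, 0.12):.4f} AU min")
```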

CCPS, 1989b, Process Equipment Reliability Data (Table 4.1-1) is a compilation of chemical and nuclear data. It assesses failure rates for 75 types of chemical process equipment. A taxonomic classification is established, and data such as the mean, median, upper and lower (95% and 5%) values, source of information, failures by time, and failures by demand are presented. [Pg.153]
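
A hypothetical record structure for one entry of such a compilation might look as follows; the field names and the numbers are illustrative, not the CCPS schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FailureRateRecord:
    taxonomy_code: str      # taxonomic classification of the equipment
    mean: float             # time-based failure rate, e.g. per 1e6 hours
    median: float
    lower_5pct: float       # 5% value
    upper_95pct: float      # 95% value
    source: str             # origin of the underlying data
    per_demand: Optional[float] = None  # demand-based failure probability

rec = FailureRateRecord("PUMP.CENTRIFUGAL", mean=85.0, median=60.0,
                        lower_5pct=10.0, upper_95pct=310.0,
                        source="chemical + nuclear plant records")
print(rec)
```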

The target level procedure was applied to 16 common air contaminants (Table 6.19). These are common contaminants in the industrial environment, and in many cases are the most critical compounds from the viewpoint of the need for control measures. The prevailing concentration data as well as the benchmark levels were taken from Nordic databases, mainly Finnish sources, and are described elsewhere. In addition, a general model for assessing target values for other contaminants is presented in the table. [Pg.402]

The component failure rate data used as input to the fault tree model came from four basic sources: plant records from Peach Bottom (a plant of similar design to Limerick); actual nuclear plant operating experience data as reported in LERs (to produce demand failure rates evaluated for pumps, diesels, and valves); General Electric BWR operating experience data on a wide variety of components (e.g., safety relief valves (SRVs), level sensors, containment pressure sensors); and WASH-1400 assessed median values. [Pg.120]

Thus, the focus of this subsection is on qualitative/semiquantitative approaches that can yield useful information to decision-makers for a limited resource investment. There are several categories of uncertainties associated with site risk assessments. One is the initial selection of substances used to characterize exposures and risk on the basis of the sampling data and available toxicity information. Other sources of uncertainty are inherent in the toxicity values for each substance used to characterize risk. Additional uncertainties are inherent in the exposure assessment for individual substances and individual exposures. These uncertainties are usually driven by uncertainty in the chemical monitoring data and the models used to estimate exposure concentrations in the absence of monitoring data, but can also be driven by population intake parameters. As described earlier, additional uncertainties are incorporated in the risk assessment when exposures to several substances across multiple pathways are summed. [Pg.407]

The main sources of error in charge density studies based on high-resolution X-ray diffraction data are of an experimental nature; when special care is taken to minimise them, charge density studies can achieve an accuracy better than 1% in the values of the structure factor amplitudes of the simplest structures [1, 2]. The errors for small molecular crystals, although more difficult to assess, are reckoned to be of the same order of magnitude. [Pg.12]

The process of field validation and testing of models was presented at the Pellston conference as a systematic analysis of errors (6). In any model calibration, verification or validation effort, the model user is continually faced with the need to analyze and explain differences (i.e., errors, in this discussion) between observed data and model predictions. This requires assessments of the accuracy and validity of observed model input data, parameter values, system representation, and observed output data. Figure 2 schematically compares the model and the natural system with regard to inputs, outputs, and sources of error. Clearly there are possible errors associated with each of the categories noted above, i.e., input, parameters, system representation, output. Differences in each of these categories can have dramatic impacts on the conclusions of the model validation process. [Pg.157]
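
A minimal sketch of this kind of error analysis, comparing observed data with model predictions through simple residual statistics (all numbers invented for illustration):

```python
import numpy as np

def error_summary(observed, predicted):
    """Basic error statistics for confronting model output with data."""
    obs = np.asarray(observed, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    resid = pred - obs
    return {
        "bias": resid.mean(),                  # systematic over/under-prediction
        "rmse": np.sqrt(np.mean(resid ** 2)),  # overall mismatch
        "max_abs_error": np.abs(resid).max(),
    }

# Illustrative observed vs. simulated concentrations.
observed = [1.2, 0.9, 1.5, 2.1, 1.8]
predicted = [1.0, 1.1, 1.4, 2.4, 1.7]
print(error_summary(observed, predicted))
```

Such summary statistics only quantify the total discrepancy; attributing it to input, parameter, representation, or output error still requires the category-by-category assessment described above.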

Most of the force fields described in the literature and of interest to us involve potential constants derived more or less by trial-and-error techniques. Starting values for the constants were taken from various sources: vibrational spectra, structural data of strain-free compounds (for reference parameters), microwave spectra (32) (rotational barriers), and thermodynamic measurements (rotational barriers (33), nonbonded interactions (1)). As a consequence of the incomplete adjustment of force field parameters by trial-and-error methods, a multitude of force fields has emerged whose virtues and shortcomings are difficult to assess, and which depend on the demands of the various authors. In view of this, we shall not discuss numerical values of potential constants derived by trial-and-error methods but rather describe in some detail a least-squares procedure for the systematic optimisation of potential constants, developed some time ago by Lifson and Warshel (7 7). Other authors (34, 35) have used least-squares techniques for the optimisation of the parameters of nonbonded interactions from crystal data. Overend and Scherer had previously applied procedures of this kind for determining optimal force constants from vibrational spectroscopic data (36). [Pg.173]

Although basic scientific research deals with variation and its sources, the results of research work are mainly described and compared in terms of mean values, supplemented with information about whether specific factors have a significant impact or not. Most research studies focus on individual factors in isolation, and there are limited data in the literature on the interaction of a number of factors, particularly in relation to on-farm production practice. Consequently, the meaningfulness of previous results is often limited, and general conclusions frequently cannot be drawn. As the relevance of the various factors changes between different production systems, it is even more difficult to assess the ranking of each factor within each production system in relation to the variation of product and process quality traits. [Pg.147]

It is hoped that this new edition of the handbook will be of value to environmental scientists and engineers and to students and teachers of environmental science. Its aim is to contribute to better assessments of chemical fate in our multimedia environment by serving as a reference source for environmentally relevant physical-chemical property data of classes of chemicals and by illustrating the likely behavior of these chemicals as they migrate throughout our biosphere. [Pg.923]

The incident flux used in the NBS smoke chamber is only a single value, 2.5 W/cm², which is a relatively mild flux for a fire and thus cannot represent all the facets of a fire. The light source is polychromatic, which causes problems of soot deposits and optics cleaning compared to measurements made using a monochromatic (laser) beam. Finally, the units of the normal output of this smoke chamber are fairly arbitrary, and the data are of little use in fire hazard assessment. [Pg.524]

Mass spectrometric measurements coupled with solution thermochemical results are the sources of solvation enthalpy values for anions and cations. These data are related to the lattice energy, which is a parameter used to assess the ionic character of solids and predict their standard enthalpies of formation. An introduction to that... [Pg.26]
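
As a worked illustration of that connection, the Born-Haber cycle arithmetic below recovers the lattice energy of NaCl from standard textbook enthalpies; the example is generic, not taken from the source text.

```python
# Born-Haber cycle for NaCl, standard textbook values in kJ/mol.
dH_f = -411.0      # enthalpy of formation of NaCl(s)
dH_sub_Na = 107.0  # sublimation of Na(s)
IE_Na = 496.0      # first ionization energy of Na(g)
dH_diss = 122.0    # 1/2 dissociation enthalpy of Cl2(g)
EA_Cl = -349.0     # electron affinity of Cl(g)

# dH_f = dH_sub + IE + dH_diss + EA + U_lattice  =>  solve for U_lattice
U_lattice = dH_f - (dH_sub_Na + IE_Na + dH_diss + EA_Cl)
print(f"U_lattice(NaCl) = {U_lattice:.0f} kJ/mol")  # about -787 kJ/mol
```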

Finally, the MOS should also take into account the uncertainties in the estimated exposure. For predicted exposure estimates, this requires an uncertainty analysis (Section 8.2.3) involving the determination of the uncertainty in the model output value, based on the collective uncertainty of the model input parameters. General sources of variability and uncertainty in exposure assessments are measurement errors, sampling errors, variability in natural systems and human behavior, limitations in model description, limitations in generic or indirect data, and professional judgment. [Pg.348]
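
A minimal sketch of such an uncertainty analysis: propagate hypothetical input distributions through a toy exposure model by Monte Carlo and summarize the resulting margin of safety (MOS). The model, the distributions, and the NOAEL are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Toy exposure model: dose = C * IR / BW, with invented input distributions.
C = rng.lognormal(np.log(0.05), 0.5, N)    # concentration, mg/L
IR = rng.normal(2.0, 0.4, N).clip(0.5)     # intake rate, L/day
BW = rng.normal(70.0, 12.0, N).clip(40.0)  # body weight, kg

dose = C * IR / BW  # mg/kg/day
noael = 0.1         # hypothetical NOAEL, mg/kg/day
mos = noael / dose  # margin of safety, trial by trial

print(f"median MOS = {np.median(mos):.1f}")
print(f"5th-percentile MOS = {np.percentile(mos, 5):.1f}")
```

The spread of the resulting MOS distribution, rather than a single point value, is what the uncertainty analysis contributes to the assessment.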

