
Data sources, accuracy

Since the accuracy of experimental data is frequently not high, and since experimental data are hardly ever plentiful, it is important to reduce the available data with care using a suitable statistical method and using a model for the excess Gibbs energy which contains only a minimum of binary parameters. Rarely are experimental data of sufficient quality and quantity to justify more than three binary parameters and, all too often, the data justify no more than two such parameters. When data sources (5) or (6) or (7) are used alone, it is not possible to use a three- (or more)-parameter model without making additional arbitrary assumptions. For typical engineering calculations, therefore, it is desirable to use a two-parameter model such as UNIQUAC. [Pg.43]
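UNIQUAC itself also requires pure-component structural parameters, so as a minimal sketch of what a two-binary-parameter excess Gibbs energy model looks like, the example below uses the simpler two-parameter Margules equations instead; the parameter values a12 and a21 are invented for illustration, not fitted to any data set.

```python
import math

def margules_gamma(x1, a12, a21):
    """Activity coefficients from the two-parameter Margules model.

    G^E/RT = x1*x2*(a21*x1 + a12*x2); a12 and a21 are the two binary
    parameters that would be fitted to experimental data.
    """
    x2 = 1.0 - x1
    ln_g1 = x2 * x2 * (a12 + 2.0 * (a21 - a12) * x1)
    ln_g2 = x1 * x1 * (a21 + 2.0 * (a12 - a21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

# At infinite dilution the parameters are recovered directly:
# ln(gamma1) -> a12 as x1 -> 0, ln(gamma2) -> a21 as x2 -> 0.
g1_inf, _ = margules_gamma(0.0, 0.5, 0.8)
print(g1_inf)  # exp(0.5) ~ 1.6487
```

The infinite-dilution limits show why two parameters are often all the data can support: two well-measured dilute-region activity coefficients fix both parameters, and additional parameters would fit scatter rather than behavior.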

The quality of data entering the LCA study is to be determined in view of temporal, spatial, and technological coverage, the data sources (it must be determined whether primary data are required or whether secondary data can be used), their accuracy, etc. It concerns the determination of all requirements for the input data [5]. [Pg.268]

In the data collection, a literature search was conducted to identify data source publications (1-40). The publications were screened and copies of appropriate data were made. These data were then keyed into the computer to provide a data base of critical properties for compounds for which experimental data are available. The data base also served as a basis to check the accuracy of the estimation methods. [Pg.1]
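The last sentence describes using the experimental data base to check estimation methods. A minimal sketch of such a check is below; the compounds and numerical values are invented for illustration, not taken from the cited data base.

```python
# Score an estimation method against experimental critical properties
# by mean absolute error. All values here are illustrative only.
experimental = {"compound_a": 190.6, "compound_b": 305.3}  # e.g., Tc in K
estimated = {"compound_a": 192.1, "compound_b": 303.9}     # method output

errors = [abs(estimated[c] - experimental[c]) for c in experimental]
mean_abs_error = sum(errors) / len(errors)
print(round(mean_abs_error, 2))  # 1.45
```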

The reliance that can be placed in a computer system is fundamentally determined by the integrity of the data it processes. It must be recognized that data accuracy is absolutely vital in the business context. However well an application works, it will be fundamentally undermined if the data it processes is dubious. Data load is a key task that must be adequately managed to satisfy business and regulatory needs. Loading of data can be broken down into five basic steps: data sourcing, data mapping, data collection, data entry, and data verification. [Pg.261]
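The five steps can be sketched as a small pipeline. The field names, target schema, and the range check used for verification below are illustrative assumptions, not the cited procedure.

```python
# Minimal data-load sketch: sourced records are mapped to a target
# schema, then verified before loading. Fields and limits are invented.

RAW_RECORDS = [                              # data sourcing
    {"id": "S-001", "purity_pct": "99.5"},
    {"id": "S-002", "purity_pct": "101.2"},  # out-of-range value
]

def map_record(raw):
    """Data mapping: rename source fields to the target schema."""
    return {"sample_id": raw["id"], "purity": float(raw["purity_pct"])}

def verify(record):
    """Data verification: reject values a purity cannot take."""
    return 0.0 <= record["purity"] <= 100.0

loaded = [r for r in map(map_record, RAW_RECORDS) if verify(r)]
print(loaded)  # only S-001 survives verification
```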

The accuracy of calculated equilibrium states depends critically on the data sources used. Accurate predictions of heat capacities are often... [Pg.10]

The sea floor is represented by model cells, where the normal horizontal and vertical flux from the bottom into the cells is set to zero. This requires a data set with information on the Baltic Sea bathymetry. In the late 1980s, at the outset of three-dimensional modeling of the Baltic Sea, gridded topographic data with sufficient accuracy and resolution were constructed from nautical maps, data from research cruises, and other data sources. One of the most complete data sets, used today for many Baltic Sea model projects as a standard, was compiled and regularly updated by Seifert et al. (2001), see Section 20.2.5. [Pg.590]
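Constructing gridded topography from scattered observations can, in its simplest form, be done by averaging all soundings that fall in each model cell. The sketch below uses invented coordinates and depths, not the Seifert et al. (2001) data.

```python
# Bin scattered (lon, lat, depth) soundings into a regular grid by
# cell averaging; cells with no data are None. Values are illustrative.

def grid_depths(soundings, lon0, lat0, dlon, dlat, nx, ny):
    """Average all soundings falling in each cell of an nx-by-ny grid."""
    sums = [[0.0] * nx for _ in range(ny)]
    counts = [[0] * nx for _ in range(ny)]
    for lon, lat, depth in soundings:
        i = int((lon - lon0) / dlon)   # column index from longitude
        j = int((lat - lat0) / dlat)   # row index from latitude
        if 0 <= i < nx and 0 <= j < ny:
            sums[j][i] += depth
            counts[j][i] += 1
    return [[sums[j][i] / counts[j][i] if counts[j][i] else None
             for i in range(nx)] for j in range(ny)]

grid = grid_depths([(10.1, 54.1, 20.0), (10.15, 54.12, 22.0)],
                   10.0, 54.0, 0.5, 0.5, 2, 2)
print(grid[0][0])  # 21.0, the mean of the two soundings in that cell
```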

Authoritative Databases. Efforts to ensure the accuracy of the information in Risk Assistant databases involve both the selection of data sources and Quality Assurance procedures for data entry. As noted below, an effort was made to locate the most authoritative source for each database. Database entries are repeatedly checked against original sources. Finally, the user is supplied with a citation of the original literature source, and so is able to confirm database contents if necessary. [Pg.192]

The data sets available at the outset of an impact assessment are mostly of the first type. However, the environmental impact assessor will be guided to a certain extent in the selection of data sets by knowledge of the physical, biological, social and/or economic systems they are studying. Conversely, however, the data sources available within a region will influence the nature of the perceptual models used in the assessment. Where there are few data, the analysis will not include much detail. Supplementary data collected during the impact assessment should preferably be of the second type. The data should be sufficient to enable the prediction of an impact to be made within specified confidence limits. The amount to be collected, the frequency, precision, accuracy, and type are dependent upon the known variability of the element in space and time. Where the variability is unknown, it must be determined by a pilot study. [Pg.8]
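The link between pilot-study variability and the amount of data to collect can be made concrete with the standard normal-theory sample-size formula n = (z*sigma/E)^2, which gives the number of samples needed to estimate a mean within a margin E at a chosen confidence level. The numerical values below are illustrative, not from the source.

```python
import math

def required_samples(sigma, margin, z=1.96):
    """Samples needed to estimate a mean within +/-margin at ~95%
    confidence, given a pilot-study standard deviation sigma."""
    return math.ceil((z * sigma / margin) ** 2)

# A pilot study estimating sigma = 4.0 (illustrative units) implies:
print(required_samples(sigma=4.0, margin=1.0))  # 62
```

Halving the acceptable margin quadruples the required sample count, which is why the known variability of the element drives the sampling plan.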

Monitor Indicator. Once the indicator is defined, the data-acquisition process identifies the data sources and data elements. As these data are gathered, they must be validated for accuracy and completeness. Multiple indicators can be used for data validation and cross-checking. The use of a computerized database allows rapid access to these data. A database management tool allows quick sorting and organization of these data. Once gathered, these data must be presented in a format suitable for evaluation. Graphic presentation of data allows rapid visual analysis for thresholds, trends, and patterns. [Pg.806]
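The validation and threshold-checking steps can be sketched as follows; the indicator names, the cross-check rule, and the threshold are hypothetical, chosen only to show the pattern of cross-checking one indicator against another.

```python
# Validate indicator data by cross-checking paired indicators and
# flagging threshold exceedances. All names and limits are invented.
readings = [
    {"day": 1, "incidents": 0, "near_misses": 2},
    {"day": 2, "incidents": 5, "near_misses": 0},  # suspect pair
]

THRESHOLD = 3  # illustrative alert level for the primary indicator

def cross_check(r):
    """Many incidents with zero near-misses suggests missing data."""
    return not (r["incidents"] > 0 and r["near_misses"] == 0)

flagged = [r["day"] for r in readings
           if not cross_check(r) or r["incidents"] > THRESHOLD]
print(flagged)  # [2]
```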

The regression procedure was applied simultaneously to all the data of each material, regardless of the data sources. Thus, the results are not based on the data of only one author and, consequently, they are of higher accuracy and general applicability. [Pg.102]
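Pooling all sources into one simultaneous regression can be sketched with an ordinary least-squares line fit: every (x, y) pair enters the same fit regardless of which author reported it. The two "sources" and their values below are invented for illustration.

```python
# Fit one least-squares line to data pooled from several sources,
# rather than fitting each author's data separately.
def fit_line(points):
    """Ordinary least-squares slope and intercept for (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

source_a = [(1.0, 2.1), (2.0, 3.9)]   # illustrative author 1
source_b = [(3.0, 6.1), (4.0, 8.0)]   # illustrative author 2
slope, intercept = fit_line(source_a + source_b)  # simultaneous fit
print(round(slope, 2))  # 1.99
```

Because the pooled fit is constrained by every source at once, a systematic error in one author's data shows up as residuals against the common line instead of silently biasing a single-source fit.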

