
Data Quality and Quantity

Qualitative and quantitative acceptance criteria for the PARCC parameters are derived in the planning phase. Whether they are specific statistical values or represent accepted standards and practices, they must always be selected based on the project objectives and be appropriate for the intended use of the data. The DQI acceptance criteria are documented in the SAP and serve as standards for evaluating data quality and quantity in the assessment phase of the data collection process. The primary DQIs are established through the analysis of field and laboratory QC samples and by adhering to accepted standards for sampling and analysis. [Pg.39]
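
To make the DQI evaluation concrete, the sketch below computes two common primary indicators from QC sample results: precision as the relative percent difference (RPD) between field duplicates, and accuracy as the percent recovery of a matrix spike. The measurement values and the acceptance limits in the comments are hypothetical placeholders, not values from any particular SAP.

```python
# Minimal sketch: evaluating two primary DQIs (precision and accuracy)
# from QC sample results. The acceptance limits are hypothetical
# placeholders; real limits come from the project SAP.

def relative_percent_difference(x1: float, x2: float) -> float:
    """Precision DQI: RPD between a field duplicate pair, in percent."""
    return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

def percent_recovery(spiked: float, unspiked: float, added: float) -> float:
    """Accuracy DQI: recovery of a matrix spike, in percent."""
    return (spiked - unspiked) / added * 100.0

rpd = relative_percent_difference(12.1, 13.4)   # duplicate results, ug/L
rec = percent_recovery(55.0, 12.1, 40.0)        # spike results, ug/L

print(f"RPD = {rpd:.1f}%  (acceptable if <= 30%, hypothetical limit)")
print(f"Recovery = {rec:.1f}%  (acceptable if 75-125%, hypothetical limit)")
```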

The number and quality of the data sets available per scenario vary greatly. The PHED Surrogate Exposure Guide (1998) characterizes the data set for each scenario from low to high confidence, based on data quality and quantity. [Pg.177]

Data quality and quantity are important issues when addressing the limitations of the existing calculated log P models. The body of experimental data available for log P prediction is one of the largest in the field. The MedChem database contains the largest commercially available collection, with over 60 000 measurements of log P and... [Pg.244]

The extent of the survey is based on regional studies, with the requirements for data quality and quantity set forth in Refs [1,6]. The investigation is site-specific and covers an area within an approximately 50 km radius of the site. This area may be extended to compensate for lack of data in the time record (see Section 6.2). It may be smaller if the area is not populated and possible causes of events do not exist. The record length to be considered for site-specific evaluation is chosen with reference to the return period selected for the design basis. Appropriate extrapolation techniques have to be applied and validated. The projected growth of population around the site during the lifetime of the facility is evaluated. [Pg.37]

Since the accuracy of experimental data is frequently not high, and since experimental data are hardly ever plentiful, it is important to reduce the available data with care using a suitable statistical method and using a model for the excess Gibbs energy which contains only a minimum of binary parameters. Rarely are experimental data of sufficient quality and quantity to justify more than three binary parameters and, all too often, the data justify no more than two such parameters. When data sources (5) or (6) or (7) are used alone, it is not possible to use a three- (or more)-parameter model without making additional arbitrary assumptions. For typical engineering calculations, therefore, it is desirable to use a two-parameter model such as UNIQUAC. [Pg.43]
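
To illustrate the parameter economy argued for above, here is a minimal sketch of reducing binary activity-coefficient data with a two-parameter excess Gibbs energy model. A two-parameter Margules expression is used as a simpler stand-in for UNIQUAC, and the data points are invented for illustration.

```python
# Minimal sketch: fitting a two-parameter excess Gibbs energy model
# (two-parameter Margules, a simpler stand-in for UNIQUAC) to binary
# activity-coefficient data by least squares.
import numpy as np
from scipy.optimize import least_squares

# Illustrative (invented) data: mole fraction x1 and measured ln(gamma).
x1 = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
ln_g1_obs = np.array([0.60, 0.35, 0.17, 0.06, 0.01])
ln_g2_obs = np.array([0.01, 0.07, 0.20, 0.40, 0.70])

def margules(params, x1):
    """Two-parameter Margules expressions for ln(gamma1), ln(gamma2)."""
    A12, A21 = params
    x2 = 1.0 - x1
    ln_g1 = x2**2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1**2 * (A21 + 2.0 * (A12 - A21) * x2)
    return ln_g1, ln_g2

def residuals(params):
    ln_g1, ln_g2 = margules(params, x1)
    return np.concatenate([ln_g1 - ln_g1_obs, ln_g2 - ln_g2_obs])

fit = least_squares(residuals, x0=[0.5, 0.5])
print("A12 = %.3f, A21 = %.3f" % tuple(fit.x))  # the two binary parameters
```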

Let me hasten to add here that EPA is not encouraging companies to submit PMNs devoid of data. Quite the contrary. The quality of our risk assessment of a new chemical is directly related to the quality and quantity of the health and environmental information we receive from the submitter or are able to obtain from the literature, all under the pressure of a 90-day time limit. Industry understands our position and is responding very well to meet our needs. We are receiving more pertinent data on new chemicals, especially from those companies coming in with... [Pg.17]

The following discussion will focus upon three areas of importance to the successful utilization of an ICP. The three specific areas discussed are nebulizer design, spectral rejection, and computer processing capability, each of which has a significant influence upon the quality and quantity of data obtained from an ICP-AES. These three features will significantly contribute to the analytical sample rate and the accuracy of the data. [Pg.116]

Macromolecular diffraction data are rarely of sufficient quality and quantity to allow construction of atomic models that would obey basic stereochemistry based solely on the optimization of parameters against the data. The observation-to-parameter ratio is a key factor in optimization procedures. For the optimization of a crystallographic model... [Pg.161]
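
A back-of-the-envelope illustration of the observation-to-parameter ratio: an isotropic crystallographic model refines roughly four parameters per atom (x, y, z, and B), so the ratio is simply the number of unique reflections divided by four times the number of atoms. The counts below are invented.

```python
# Minimal sketch: observation-to-parameter ratio for an isotropic
# crystallographic model (x, y, z, B = 4 parameters per atom).
# The counts are invented for illustration.
n_reflections = 25_000   # unique observed reflections
n_atoms = 3_000          # non-hydrogen atoms in the model

n_parameters = 4 * n_atoms
ratio = n_reflections / n_parameters
print(f"observations/parameters = {ratio:.2f}")  # ~2.08: too low to refine
                                                 # without stereochemical restraints
```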

Huang S-M, Lesko LJ, Williams RL. Assessment of the quality and quantity of drug-drug interaction studies in NDA submissions: study design and data analysis issues. J Clin Pharmacol 1999;39:1006-1014. [Pg.272]

The safety factor is a number that reflects the degree or amount of uncertainty that must be considered when experimental data are extrapolated to the human population. When the quality and quantity of dose-response data are high, the uncertainty factor is low; when the data are inadequate or equivocal, the uncertainty factor must be larger... [Pg.681]
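
As a worked illustration of the point above, the sketch below derives a reference dose by dividing a NOAEL by the product of individual uncertainty factors. The 10-fold factors follow a common convention, but the NOAEL value and the particular set of factors are hypothetical.

```python
# Minimal sketch: applying uncertainty factors when extrapolating
# animal dose-response data to humans. The 10-fold factors follow a
# common convention; the NOAEL value is hypothetical.
noael = 50.0  # mg/kg/day, hypothetical NOAEL from an animal study

uncertainty_factors = {
    "interspecies (animal -> human)": 10.0,
    "intraspecies (human variability)": 10.0,
    "database inadequacy (equivocal data)": 10.0,  # dropped if data are strong
}

total_uf = 1.0
for reason, uf in uncertainty_factors.items():
    total_uf *= uf

rfd = noael / total_uf
print(f"total UF = {total_uf:.0f}, RfD = {rfd:.3f} mg/kg/day")
```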

Chemical kinetic models require, as a minimum, thermodynamic and reaction-specific information. If problems involve transport, proper transport coefficients are also necessary. Since the accuracy of a kinetic model is often associated specifically with the chemical reaction mechanism, it is important to note that the thermodynamic data are also essential for the reliability of predictions. Fortunately, the quality and quantity of data on the thermochemistry of species and on the kinetics and mechanisms of individual elementary reactions have improved significantly over the past two decades because of advances made in experimental methods. This has considerably facilitated our ability to develop detailed chemical kinetic models [356],... [Pg.568]
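
One concrete way in which thermodynamic data enter a kinetic model is through reverse rate constants: for an elementary reaction, the reverse rate constant follows from the forward one and the equilibrium constant, which is computed from the species thermochemistry. A minimal sketch with invented numbers, assuming a reaction with no change in mole numbers:

```python
# Minimal sketch: why species thermochemistry matters in a kinetic model.
# For an elementary reaction, the reverse rate constant is tied to the
# forward one through the equilibrium constant, K_eq = exp(-dG/RT).
# Assumes no change in mole numbers (dn = 0); all values are invented.
import math

R = 8.314           # J/(mol K)
T = 1500.0          # K
k_fwd = 2.0e9       # forward rate constant (units per the rate law)
dG_rxn = -40_000.0  # J/mol, reaction Gibbs energy from species thermochemistry

K_eq = math.exp(-dG_rxn / (R * T))  # equilibrium constant
k_rev = k_fwd / K_eq                # thermodynamically consistent reverse rate

print(f"K_eq = {K_eq:.3e}, k_rev = {k_rev:.3e}")
```

An error in the thermochemistry of even one species thus propagates directly into the reverse rates, which is why reliable kinetic predictions need reliable thermodynamic data.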

We find the answers to the four questions in the course of the data quality assessment, which is the scientific and statistical evaluation of data to determine if data obtained from environmental data operations are of the right type, quality, and quantity to support their intended use (EPA, 1997a). Part of DQA is data evaluation that enables us to find out whether the collected data are valid. Another part of the DQA, the reconciliation of the collected data with the DQOs, allows us to establish data relevancy. Thus, the application of the entire DQA process to collected data enables us to determine the effect of total error on data usability. [Pg.8]

The EPA first introduced the DQO process in 1986 (EPA, 1986) and finalized it in 2000 (EPA, 2000a). The purpose of the DQO process is to provide a planning tool for determining the type, quality, and quantity of data collected in support of the EPA's decisions. Although developed specifically for projects under the EPA's oversight, the DQO process, being a systematic planning tool, is applicable to any projects that require environmental chemical data collection. [Pg.11]

Group D elements describe the procedures that will be used for the assessment of data quality and usability. Properly conducted laboratory data review, verification, and data validation establish whether the obtained data are of the right type, quality, and quantity to support their intended use. [Pg.79]

Project implementation cannot be carried out without planning, and it can be only as good as the planning itself. Only thorough and systematic planning provides a firm foundation for successful implementation. Poor planning typically leads to ineffectual implementation, which in turn does not produce data of the type, quality, and quantity required for reliable decision-making. [Pg.89]

The answer to the first question will establish the appropriateness of the collected data type and quantity or data relevancy, whereas the remaining three answers will establish the data quality or data validity. If the answers to all four questions are positive, the data may be confidently used for project decisions. A negative answer to any of them will reduce the level of confidence with which the data may be used or even make a data set unusable. [Pg.265]

Why do we need to perform DQA? The need for DQA arises from the very existence of total error. The collection of planned data may go astray due to unforeseen field conditions, human errors, or analytical deficiencies that may alter the type, quality, and quantity of the data compared to what was planned. We use DQA as a tool that enables us to evaluate the various components of total error and to establish their effect on the amount of valid and relevant data collected for each intended use. [Pg.265]

Why do we need to establish the data quality? Why can we not simply trust the field sampling crew and the laboratory to produce data of the required quality according to the SAP's specifications? Unfortunately, the world of field and laboratory operations is not ideal, and errors may go unnoticed and uncorrected. As various sampling and non-sampling errors erode the collected data quality, the quantity of usable data is... [Pg.266]

The EPA developed a document titled Guidance for Data Quality Assessment, Practical Methods for Data Analysis, EPA QA/G-9 (EPA, 1997a) as a tool for project teams to assess the type, quality, and quantity of data collected for projects under EPA oversight. This document summarizes a variety of statistical analysis techniques and is used primarily by statisticians. DQA, however, is not just a statistical evaluation of the collected data. It is a broad assessment of the data in the context of the project DQOs and the intended use of the data, which requires a... [Pg.282]
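
As an example of the kind of statistical technique summarized in EPA QA/G-9, the sketch below applies a one-sample t-test to compare a site mean concentration against an action level. The concentration data and the action level are invented for illustration.

```python
# Minimal sketch: one-sample t-test of a site mean against an action
# level, one of the techniques summarized in EPA QA/G-9. The sample
# concentrations and the action level are invented.
import numpy as np
from scipy import stats

conc = np.array([4.2, 5.1, 3.8, 6.0, 4.7, 5.5, 4.9, 5.3])  # mg/kg
action_level = 6.0                                          # mg/kg

# H0: mean >= action level; H1: mean < action level (site is clean).
t_stat, p_two_sided = stats.ttest_1samp(conc, action_level)
p_one_sided = p_two_sided / 2.0 if t_stat < 0 else 1.0 - p_two_sided / 2.0

print(f"t = {t_stat:.2f}, one-sided p = {p_one_sided:.4f}")
# Reject H0 (conclude the mean is below the action level) if p < 0.05.
```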

After the collected data have been evaluated and their quality and quantity established, the chemist will address the issue of data relevancy and usability. During data evaluation, the chemist relied on the DQI acceptance criteria for direction on... [Pg.289]

Summary of deviations from the SAP and their effects on the collected data type, quality, and quantity... [Pg.295]

In general, the greatest resolution can be obtained in estimates of current or recent historical (within the last 10 to 15 years) emissions. This is because reliable data on fuel use (both quality and quantity) and other activity levels are available, and good estimates of emission coefficients and control efficiencies are available. As one goes further back in time, the data needed for detailed emission estimates are either not available or are less reliable. Recently, SO2 and nitrogen oxide emissions were estimated for EPA for the period 1900 to 1980 at the state level by fuel and source sector (4J1). It was particularly difficult to obtain reliable estimates of pre-1940 fuel use and quality, control efficiency, and emission coefficients. Obviously, the less data that are available, the simpler the methodology that must be used. A discussion of a data set required for detailed analysis of emissions and deposition is beyond the scope of this paper, but is available elsewhere (6). [Pg.366]
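
The arithmetic behind such estimates can be summarized as emissions = activity level × emission coefficient × (1 − control efficiency), summed over fuels and source sectors. A minimal sketch with invented numbers:

```python
# Minimal sketch of the emission-estimation arithmetic described above:
# emissions = activity * emission coefficient * (1 - control efficiency),
# summed over fuels/sectors. All numbers are invented for illustration.
sources = [
    # fuel use (tons/yr), emission factor (kg SO2/ton), control efficiency
    {"fuel_use_tons": 1.0e6, "kg_SO2_per_ton": 19.0, "control_eff": 0.90},
    {"fuel_use_tons": 4.0e5, "kg_SO2_per_ton": 8.0,  "control_eff": 0.00},
]

total_kg = sum(
    s["fuel_use_tons"] * s["kg_SO2_per_ton"] * (1.0 - s["control_eff"])
    for s in sources
)
print(f"estimated SO2 emissions = {total_kg / 1000.0:,.0f} t/yr")
```

The historical-data problem described above enters through every factor: going back in time degrades the fuel-use figures, the emission coefficients, and the control efficiencies alike.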

Various pitfalls can occur when using a tiered system. In the ideal case, the estimated risk indeed becomes less and less conservative as one moves up the tiers. However, the quality and quantity of the available data influence the outcome of a tier, so it may not always work out this way. Care should be taken in applying tiered systems. One option is to develop and apply methods that report the confidence interval of the outcomes alongside the extrapolated values themselves. In the case of highly... [Pg.320]
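
One simple way to report a confidence interval alongside the extrapolated value, as suggested above, is bootstrap resampling of the underlying data. The sketch below computes a percentile interval around a crude 5th-percentile (HC5-style) estimate; the toxicity values are invented.

```python
# Minimal sketch: reporting a bootstrap confidence interval next to the
# extrapolated point estimate, as suggested above. The toxicity data
# and the 5th-percentile (HC5-style) target are invented.
import numpy as np

rng = np.random.default_rng(0)
log_noec = np.array([1.2, 0.8, 1.9, 1.5, 0.6, 2.1, 1.1, 1.4])  # log10 ug/L

def hc5(sample):
    """Crude HC5: 5th percentile of the species sensitivity data."""
    return np.percentile(sample, 5)

boot = np.array([
    hc5(rng.choice(log_noec, size=log_noec.size, replace=True))
    for _ in range(5000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"HC5 = {hc5(log_noec):.2f}, 95% CI = ({lo:.2f}, {hi:.2f}) log10 ug/L")
```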

Though there is a clear gain in the quality and quantity of information when going from two- to three-way data sets, the mathematical complexity associated with the treatment of three-way data sets can seem, at first sight, a drawback. To avoid this problem, most of the three-way data analysis methods transform the original cube of data into a stack of matrices, where simpler mathematical methods can be applied. This process is often known as unfolding (see Figure 11.10). [Pg.441]
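
In code, unfolding amounts to reshaping the three-way array into a matrix whose rows concatenate the slices along one mode. A minimal sketch with invented dimensions:

```python
# Minimal sketch of "unfolding": a three-way data cube (e.g. samples x
# excitation x emission) is rearranged into a matrix so that ordinary
# two-way methods can be applied. The dimensions are invented.
import numpy as np

n_samples, n_excitation, n_emission = 10, 20, 50
cube = np.random.rand(n_samples, n_excitation, n_emission)

# Unfold along the sample mode: each sample's slice becomes one row.
unfolded = cube.reshape(n_samples, n_excitation * n_emission)
print(cube.shape, "->", unfolded.shape)   # (10, 20, 50) -> (10, 1000)
```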

