
Verifying the Data

The acquisition sequence is as follows: a first acquisition calibration enables the acquisition operator to verify the data before storage. The raw data, together with the calibration files, are transferred to the analysis program. The program transforms the raw data into calibrated data, which are then analysed. [Pg.1008]
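
As a minimal sketch of this raw-to-calibrated flow (the linear gain/offset model and all numbers are assumptions for illustration; the excerpt does not specify the actual calibration used):

```python
# Sketch: transform raw acquisition data into calibrated data before analysis.
# The linear gain/offset calibration model is assumed for illustration.

def calibrate(raw_values, gain, offset):
    """Apply a calibration file's gain/offset to raw acquisition data."""
    return [gain * v + offset for v in raw_values]

raw = [0.98, 1.02, 1.05, 0.97]            # raw data from the acquisition
calibrated = calibrate(raw, gain=2.0, offset=-0.5)
print(calibrated)                          # calibrated data, ready for analysis
```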

Knowledge acquisition subsystem: The task of the knowledge acquisition subsystem is to assemble and upgrade the knowledge base. A major task is to verify the data and check them for consistency. [Pg.479]

Actual emission data are available from many handbooks, government publications, and literature searches of appropriate research papers and journals. It is always wise, where possible, to verify the data by checking the validity of the source and the reasonableness of the final number. Some emission factors that have been in use for years were only rough estimates, proposed long ago to establish the order of magnitude of the particular source. [Pg.94]

Low-limit alert The first alert (i.e., low-limit alert) should be set at the lowest vibration amplitude that will be encountered from a normally operating machine-train. This value is needed to ensure that valid data are taken with the microprocessor. If this minimum amplitude is not reached, the system alerts the operator, who can retake or verify the data point. Low-limit selection is arbitrary, but should be set slightly above the noise floor of the specific microprocessor used to acquire data. [Pg.718]
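
A sketch of this low-limit logic might look as follows; the noise-floor value, the 20% margin, and the units are assumptions for illustration, not vendor settings:

```python
# Sketch of the low-limit alert logic described above. Noise floor and
# margin are illustrative assumptions, not settings from a real system.

NOISE_FLOOR = 0.02               # smallest amplitude the hardware resolves (assumed, in/s)
LOW_LIMIT = NOISE_FLOOR * 1.2    # set the alert slightly above the noise floor

def check_low_limit(amplitude):
    """Flag a reading below the low-limit alert for retake/verification."""
    if amplitude < LOW_LIMIT:
        return "ALERT: amplitude below low limit - retake or verify the data point"
    return "OK: valid reading"

print(check_low_limit(0.015))    # below limit -> operator should retake
print(check_low_limit(0.10))     # normal machine-train vibration
```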

Application of the test substance to the test system is without doubt the most critical step of the residue field trial. Under-application may be corrected, if possible and if approved by the Study Director, by making a follow-up application if the error becomes known shortly after the application has been made. Over-application errors can usually only be corrected by starting the trial again. The Study Director must be contacted as soon as an error of this nature is detected. Immediate communication allows for the most feasible options to be considered in resolving the error. If application errors are not detected at the time of the application, the samples from such a trial can easily become the source of undesirable variability when the final analysis results are known. Because the application is critical, the PI must calculate and verify the data that will constitute the application information for the trial. If the test substance weight, the spray volume, the delivery rate, the size of the plot, and the travel speed for the application are carefully determined and then validated prior to the application, problems will seldom arise. With the advent of new tools such as computers and hand-held calculators, the errors traditionally associated with applications to small plot trials should be minimized in the future. The following paragraphs outline some of the important considerations for each of the phases of the application. [Pg.155]
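
As a hedged sketch of the pre-application cross-check described above (the target rate, plot size, weighed amount, and 5% tolerance are all invented example values, not protocol figures):

```python
# Sketch of a pre-application cross-check like the one described above.
# All numbers and the tolerance are illustrative assumptions.

target_rate_g_per_ha = 500.0     # intended test-substance rate
plot_area_m2 = 100.0             # measured plot size
substance_weight_g = 5.0         # weighed amount for this plot

# Rate actually delivered if the whole weighed amount goes onto the plot
actual_rate = substance_weight_g / (plot_area_m2 / 10_000.0)   # g/ha

error_pct = 100.0 * (actual_rate - target_rate_g_per_ha) / target_rate_g_per_ha
print(f"planned {target_rate_g_per_ha} g/ha, actual {actual_rate:.1f} g/ha "
      f"({error_pct:+.1f}%)")

# Tolerance check before spraying; on failure, contact the Study Director.
assert abs(error_pct) <= 5.0, "application error - contact the Study Director"
```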

The sponsor should utilize appropriately qualified individuals to supervise the overall conduct of the trial, to handle the data, to verify the data, to conduct the statistical analyses, and to prepare the trial reports. ... [Pg.7]

In addition to testing the system components, a test of software functionality would be performed to verify system software operation and electronic records and electronic signatures (ERES) compliance (security, data integrity, data backup, and archiving). To test the software functionality, a predetermined set of instructions can be entered step by step into the system. The system responses are then compared with the expected outcomes of each instruction, and any problems with the execution are noted. Some vendors provide a standard data set that can be processed by the system to verify its data-handling capability. [Pg.802]
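
A minimal sketch of this compare-to-expected approach is shown below; the instruction strings, canned responses, and the system_under_test stand-in are invented for illustration and would be replaced by the real system interface:

```python
# Sketch of a step-by-step functional test that compares system responses
# to predetermined expected outcomes. All strings are invented examples.

test_script = [
    # (instruction, expected response)
    ("login operator1",     "access granted"),
    ("load method M-101",   "method loaded"),
    ("run sample BLANK-01", "run complete"),
]

def system_under_test(instruction):
    """Stand-in for the real system; replace with the actual interface."""
    canned = {"login operator1": "access granted",
              "load method M-101": "method loaded",
              "run sample BLANK-01": "run complete"}
    return canned.get(instruction, "error")

for step, (instruction, expected) in enumerate(test_script, start=1):
    actual = system_under_test(instruction)
    status = "PASS" if actual == expected else f"FAIL (got {actual!r})"
    print(f"step {step}: {instruction!r} -> {status}")
```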

A stepwise approach to DQA identifies different tasks that may be performed by individuals with different expertise. For example, a less experienced chemist may verify the data package content (Step 2), whereas a more experienced chemist may perform data evaluation (Step 3). For a statistical data collection design, a statistician may be involved in the assessment of data relevancy (Step 6). A database manager may be involved at several steps if the EDDs are part of laboratory deliverables and if completeness is calculated. [Pg.284]
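
As an illustration of the completeness calculation a database manager might perform on the EDDs, here is a small sketch; the counts and the 90% acceptance criterion are assumed example values:

```python
# Sketch of a completeness calculation over laboratory EDDs.
# The counts and the 90% criterion are assumed example values.

planned_results = 120      # results called for by the sampling design
valid_results = 111        # usable results present in the EDDs

completeness = 100.0 * valid_results / planned_results
print(f"completeness: {completeness:.1f}%")

if completeness < 90.0:    # project-specific criterion (assumed)
    print("completeness criterion not met - flag for the DQA report")
```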

A digital computer network allows users to place an impressive amount of computing power anywhere. In the research environment, this allows the researcher to control the experiment and to verify the data, extracting and recording them at the experiment site. The dramatic reduction in the cost of data manipulation allows one to put many computers wherever they are needed. [Pg.47]

- Maintain a confidential list identifying the number/code and names of all subjects entered into the study
- Allow authorized representatives of the sponsor/CRO and regulatory authorities direct access to study subject clinical notes (source documents) in order to verify the data recorded on CRFs
- Ensure CRFs are complete and accurate...

Many compilations of thermodynamic data partially covering the materials of interest were published in the nineties. Table 2.2 is based on data from these compilations. Most compilations also report thermodynamic data at elevated temperatures (usually every 100°C). No attempt has been made to update the thermodynamic data using recent publications or to verify the data in the compilations against the original publications. The thermodynamic functions are interrelated. Each compilation has its own style of presentation: some functions are given explicitly, and the others must be calculated. Some values reported in Table 2.2 were calculated by means of the procedures explained below. [Pg.52]
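
As one example of the kind of derivation implied here, when a compilation tabulates enthalpy and entropy but not Gibbs energy, the missing function follows from the standard relation ΔG = ΔH − TΔS; a small sketch with illustrative numbers:

```python
# Sketch: deriving a missing thermodynamic function from tabulated ones
# via dG = dH - T*dS. The numerical values are illustrative only.

def gibbs_energy(delta_h_kj, delta_s_j_per_k, temperature_k):
    """Return delta-G (kJ/mol) from delta-H (kJ/mol) and delta-S (J/(mol*K))."""
    return delta_h_kj - temperature_k * delta_s_j_per_k / 1000.0

# e.g. a reaction with dH = -100 kJ/mol and dS = -120 J/(mol*K) at 298.15 K
print(f"dG = {gibbs_energy(-100.0, -120.0, 298.15):.1f} kJ/mol")
```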

Electrical resistivity measurements were also taken at 28 locations selected across the site to verify the data collected by electromagnetic methods. The data obtained were in agreement with the results of the electromagnetic conductivity survey. [Pg.135]
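
Purely as an illustration of such an agreement check (the conductivity values, the conversion of resistivity readings to conductivity, and the 15% tolerance are assumptions, not the survey's actual data or criteria):

```python
# Sketch of an agreement check between resistivity-derived verification
# points and an electromagnetic survey. Data and tolerance are assumed.

em_conductivity = [12.1, 8.4, 15.0, 9.7]     # mS/m, EM survey values
res_conductivity = [11.8, 8.9, 14.2, 10.1]   # mS/m, from resistivity readings

agree = 0
for em, res in zip(em_conductivity, res_conductivity):
    if abs(em - res) / em <= 0.15:           # within 15% of the EM value
        agree += 1
print(f"{agree}/{len(em_conductivity)} verification points in agreement")
```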

Under the first U.S.-Russian bilateral agreement, signed in 1989, the two countries are to exchange data on chemical weapon stockpiles and facilities and verify the data. The second agreement (the Bilateral Destruction Agreement-BDA), signed in 1990, calls for the destruction... [Pg.11]

This work contains information obtained from authentic and highly regarded sources. Reprinted material is quoted with permission, and sources are indicated. A wide variety of references are listed. Best efforts have been made to select and verify the data on the basis of sound scientific judgment, but the author and the publisher cannot accept responsibility for the validity of all materials or for the consequences of their use. [Pg.4]

In the dissociation of ethanal-2,2-diol, CH(OH)2CH=O, to C•(OH)2CH=O + H, we observe a very small C—H bond energy (only 75.4 kcal mol⁻¹) at the dihydroxy carbon. This can be explained by resonance of the carbon radical with the carbonyl group. We have calculated this energy with the G3 method to verify the data and find a similar value, 75.1 kcal mol⁻¹. We have not observed this effect in species containing only one hydroxy (OH) group along with the carbonyl, and we plan to study simple dihydroxy alkanes to further understand this effect of a second OH. [Pg.83]

It is clear that this type of data may not be readily available in the company database. Even if the data are accessible, they may not be in the required format, particularly if the DSS involves geographic display and analysis. As one might expect, collecting, tabulating, and verifying the data can take some time. [Pg.2013]

Su et al. [24] studied the polymorphic transformation of D-mannitol by in situ Raman spectroscopy coupled with FBRM (focused beam reflectance measurement) and PVM (particle vision measurement). In this way, relationships between fine particles and metastable-form dissolution, and also between coarse particles and stable-form crystallization, could be defined. The different polymorphs were identified by Raman spectroscopy. FBRM provided a method for independently verifying these observations. PVM, in turn, verified the data interpretation strategy employed for FBRM. [Pg.45]

Identify a comprehensive set of critical benchmark data and, to the extent possible, verify the data by reviewing original and subsequently revised documentation, and by talking with the experimenters or individuals who are familiar with the experimenters or the experimental facility. [Pg.717]

Safety arguments for data provision should address data origination and the tools used in the data supply chain and in the creation of the dataset. The use of a data supply chain requires that changes to the data architecture be rolled out across the data supply chain, the tools it employs, and the criteria used to verify the data. [Pg.273]

Baumann (154) collected data from other sources for comparison with the data of Alexander plotted in Figure 1.6; these data seem to confirm that the solubility reaches a minimum at pH 7-8, but the reason for the slightly higher solubility at lower pH is not known. However, Cherkinskii and Knyaz'kova (160) verified the data (see Figure 1.6) and proposed that silica is amphoteric and is cationic below pH 7. They give equations assuming that all the soluble silica is actually polymeric and cationic at low pH. There is no experimental basis for such a theory: all of the soluble silica has been shown to be monomeric, and there is no evidence for cationic silica above the isoelectric point of pH 2. [Pg.43]

Designers and editors, forced to verify the data after output for correctness and manufacturability, began to work with CAD and computer-aided manufacturing (CAM) tool vendors and industry consortia to develop better formats and methods for data exchange. They sought a data exchange format that was explicit, intelligent, optimized, and bidirectional... [Pg.378]

It is important that factual information be carefully recorded. The individual collecting data must take detailed notes to accurately record the employee's account. Verify the data with the employee to ensure accuracy and to confirm that the impact is fully understood. Employees should not be interrupted with questions when describing the incident. Ask questions after the employee has finished. Do not attempt to explore the incident causes at this point. The focus should be exclusively on understanding what happened. [Pg.189]

During the operation of technological equipment, it is necessary to collect data that characterize the operational conditions in order to evaluate its actual technical condition, that is, to apply technical diagnostic methods in alignment with the latest methods of experimental measurement. Furthermore, it is important to verify both the measured data and the data-processing methodology, to eliminate anything irrelevant to the monitored subject. [Pg.162]

Our modelling allows us to formally define and verify the data freshness and integrity properties. We define them as model invariants as follows ... [Pg.63]
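
The invariants themselves are truncated in this excerpt, so the following is not the authors' model; it is only an informal sketch of what freshness (data no older than a bound) and integrity (payload unmodified since publication) checks can look like:

```python
# Illustrative only: the source's formal model invariants are truncated
# above, so this sketch merely restates the two properties informally.

import hashlib
import time

MAX_AGE_S = 60.0   # assumed freshness bound

def fresh(timestamp, now=None):
    """Freshness invariant: the data item is no older than MAX_AGE_S."""
    now = time.time() if now is None else now
    return now - timestamp <= MAX_AGE_S

def intact(payload: bytes, published_digest: str):
    """Integrity invariant: the payload still matches its published hash."""
    return hashlib.sha256(payload).hexdigest() == published_digest

data = b"sensor reading 42"
digest = hashlib.sha256(data).hexdigest()
assert intact(data, digest)
assert fresh(time.time() - 10.0)
```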

