Big Chemical Encyclopedia

Verification of data

Verification of data format entered into the computer; worksheet generation... [Pg.516]

Two sources for obtaining this necessary information are databases and experimental determinations. Enthalpies of reaction, for example, can be estimated by computer programs such as CHETAH [26, 27] as outlined in Chapter 2. The required cooling capacity for the desired reactor can depend on the reactant addition rate. The effect of the addition rate can be calculated by using models assuming different reaction orders and reaction rates. However, in practice, reactions do not generally follow the optimum route, which makes experimental verification of data and the determination of potential constraints necessary. [Pg.116]
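
As a hedged illustration of the addition-rate dependence mentioned above, the sketch below estimates the peak cooling duty of a semi-batch reactor under the simplifying assumption that the reaction is dosing-controlled (every mole added reacts immediately). The enthalpy and dosing rate are invented numbers, not values from the cited text.

```python
# Illustrative sketch only: peak cooling duty when heat release tracks the
# reactant addition rate (dosing-controlled semi-batch reaction).
# All numeric values below are assumptions, not data from the source.

def required_cooling_duty_kw(addition_rate_mol_s: float,
                             reaction_enthalpy_kj_mol: float) -> float:
    """Heat release rate (kW) if every mole added reacts immediately."""
    return addition_rate_mol_s * abs(reaction_enthalpy_kj_mol)

if __name__ == "__main__":
    dH_r = -120.0        # kJ/mol, e.g. estimated with a tool such as CHETAH
    dosing_rate = 0.05   # mol/s, chosen addition rate
    q = required_cooling_duty_kw(dosing_rate, dH_r)
    print(f"Peak cooling duty for the dosing-controlled case: {q:.1f} kW")
```

Halving the assumed dosing rate halves the peak duty, which is why the addition rate is a practical handle on the required cooling capacity.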

A subtle shift has taken place, however. Ten years ago a laboratory might have been (appropriately) cited for poor data protection practices. Now a citation might be issued for lack of an audit trail, even with evidence that the data have never been compromised. In effect the regulatory attention has shifted back a step, from verification of data to verification of data protection devices (audit trails). The FDA now requires an automated regulatory tool as a QA monitor in all but the most unautomated laboratory environments. [Pg.225]
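
The sketch below is only an assumption about what such an automated audit-trail tool records, not any vendor's or the FDA's implementation: each change to a data value is logged with the user, timestamp, and old and new values, so the protection mechanism itself can be inspected even when the data were never compromised.

```python
# Hypothetical minimal audit-trail record: every change is captured alongside
# the data change itself, so the trail can be reviewed independently.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    user: str
    record_id: str
    field_name: str
    old_value: str
    new_value: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

audit_trail: list[AuditEntry] = []

def update_value(record: dict, record_id: str, field_name: str,
                 new_value: str, user: str) -> None:
    """Apply a change and append an audit record describing it."""
    audit_trail.append(AuditEntry(user, record_id, field_name,
                                  str(record.get(field_name)), str(new_value)))
    record[field_name] = new_value
```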

The role of Performance Qualification, which entails challenging the application within the scope of business processes, is harder to distinguish. As a consequence, PQ for databases may be combined with User Acceptance Testing. Items for consideration within a PQ include verification of data management within the application (actually checking manipulated data sets to determine that they are correct), and examining the role of the application within the wider process flow. [Pg.758]

Thus most of the data work consisted of checking already existing data, rather than collecting whole new sets of data. The relevant companies and installations were typically contacted twice (using a questionnaire for potentially affected installations) and asked to check their data. No legal objections to the use and verification of data have been raised. [Pg.118]

The inspection should cover the evaluation and assessment of the documentation, premises, equipment, utilities and materials. It should also cover verification of data and documentation such as results, batch records, compliance with SOP and information submitted on the manufacturing method, equipment and aspects including (but not limited to) validation of the manufacturing process, validation of utilities and support systems, and validation of equipment. [Pg.242]

A person appointed by, and responsible to, the sponsor or CRO for the monitoring and reporting of progress of the trial and for verification of data. [Pg.443]

DCD to Tooele County and Utah County shall follow the Chemical Notification Form. (See Attachment A.) A facsimile copy of the completed Chemical Notification Form will be provided by DCD as verification of data communicated verbally. A facsimile for a heads up call will be provided only after the final disposition of the agent detection is determined. DCD will provide Tooele County and Utah County updated information as soon as it becomes available. [Pg.127]

Verification of data to establish purity and assay of the standard... [Pg.441]

Störrle, H.: Semantics and verification of data flow in UML 2.0 activities. Electronic Notes in Theoretical Computer Science 127(4), 35-52 (2005). Stuckenschmidt, H., Klein, M.: Integrity and change in modular ontologies. In: Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI'03), Acapulco, Mexico, pp. 900-905. Morgan Kaufmann, San Francisco (2003)... [Pg.844]

Statistical criteria are more correct for verification of the adequacy of the model [67]. First of all, as to the question of equality, the average values A1 and A2 (index 1 corresponds to the model, and index 2 to the experimental data) should be considered during the verification of data. It is possible to compare the average values of the dispersions S1 and S2 only if the sets of results are homogeneous. [Pg.85]
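
A minimal sketch of the comparison described above, assuming the usual Fisher and Student criteria: the dispersions of the model (index 1) and experimental (index 2) results are first checked for homogeneity with an F-test, and only then are the average values A1 and A2 compared with a t-test. The sample values are illustrative, not data from reference [67].

```python
import numpy as np
from scipy import stats

model_results = np.array([10.2, 10.5, 9.9, 10.1, 10.4])   # sample giving A1
experimental  = np.array([10.6, 10.3, 10.8, 10.5, 10.7])   # sample giving A2

# Fisher criterion: ratio of the larger to the smaller sample variance.
s1, s2 = model_results.var(ddof=1), experimental.var(ddof=1)
F = max(s1, s2) / min(s1, s2)
F_crit = stats.f.ppf(0.95, len(model_results) - 1, len(experimental) - 1)
variances_homogeneous = F < F_crit

# Student criterion: compare the average values only if dispersions are homogeneous.
if variances_homogeneous:
    t_stat, p_value = stats.ttest_ind(model_results, experimental, equal_var=True)
    print(f"F = {F:.2f} (< {F_crit:.2f}), t = {t_stat:.2f}, p = {p_value:.3f}")
else:
    print("Dispersions are not homogeneous; the means should not be compared directly.")
```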

It is very likely that incomplete or missing records would prevent the verification of data integrity. Source records should be complete to facilitate an understanding of actual study conduct for critical phases of method development, method validation, and subject sample analysis. The records should confirm whether the testing was conducted in an appropriate manner, with well-designed and optimally controlled experiments. The documentation of actual laboratory events should demonstrate that the quantitative measures are suitable to achieve the objectives of the clinical or nonclinical protocol. The records should confirm that the reported results accurately reflect the actual concentration of the analyte in the biological matrix. It should be noted that the failure to adequately document critical details of study conduct has resulted in rejection of bioanalytical data for regulatory purposes. [Pg.328]

To comply with the new legislation the sponsor needs to develop a set of standard operating procedures (SOPs) to cover all areas of trial activities. A quality system should be in place to ensure record-keeping and verification of data entry or extraction of data from the case report form (CRF); the capture of adverse events (AEs), serious adverse events (SAEs) and suspected unexpected serious adverse reactions (SUSARs), with expedited reporting where required; data transfer from the source data to the database; and archiving of the source data for audit purposes. GCP and trial-specific training should be carried out and recorded in a timely manner. [Pg.91]

Following connection of the analytical equipment to the core LIMS, formal verification of data values within the core LIMS database, screen displays and reports may be performed. The vehicle for this testing will be the second stage OQ Protocol. [Pg.282]
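
Purely as an illustration (the record names and tolerance are assumptions, not the actual LIMS schema), a scripted check of this kind might read each value back from the core LIMS database and compare it field by field against the value reported by the analytical equipment:

```python
def verify_transfer(source_records: dict[str, float],
                    lims_records: dict[str, float],
                    tolerance: float = 1e-9) -> list[str]:
    """Return the sample IDs whose stored value differs from the source value."""
    mismatches = []
    for sample_id, source_value in source_records.items():
        stored = lims_records.get(sample_id)
        if stored is None or abs(stored - source_value) > tolerance:
            mismatches.append(sample_id)
    return mismatches

if __name__ == "__main__":
    instrument = {"S-001": 12.34, "S-002": 8.01, "S-003": 5.55}
    lims       = {"S-001": 12.34, "S-002": 8.10, "S-003": 5.55}  # S-002 was mis-keyed
    print("Mismatched samples:", verify_transfer(instrument, lims))
```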

The controls to be applied to the possible propagation of data errors across the system hierarchy. These controls might include specific requirements for the verification of data at the system boundary. [Pg.269]
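
A minimal sketch, under assumed field names and ranges, of what such verification of data at the system boundary could look like: incoming records are checked for type and range before they are allowed to propagate further into the system.

```python
def validate_boundary_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    if not isinstance(record.get("sensor_id"), str) or not record["sensor_id"]:
        errors.append("sensor_id missing or not a non-empty string")
    temperature = record.get("temperature_c")
    if not isinstance(temperature, (int, float)):
        errors.append("temperature_c missing or not numeric")
    elif not (-50.0 <= temperature <= 150.0):
        errors.append(f"temperature_c out of range: {temperature}")
    return errors

if __name__ == "__main__":
    print(validate_boundary_record({"sensor_id": "T-17", "temperature_c": 481.0}))
```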

Systems that make extensive use of data will require additional safety arguments to address the use made of data, the verification of the data and the means used to control the propagation of data errors across the systems hierarchy. The use of data presents additional safety requirements where data may be updated or passed across the system boundary during its normal operation. One such requirement is the verification of data at the system boundary... [Pg.271]

Verification of Data Transformations (Using Event-B Contexts)... [Pg.101]

RAID level 2 is rarely used in commercial applications, but is another means of ensuring data is protected in the event that drives in the subsystem incur problems or otherwise fail. This level builds fault tolerance around Hamming error correction code (ECC), which is often used in modems and solid-state memory devices as a means of maintaining data integrity. ECC tabulates the numerical values of data stored on specific blocks in the virtual drive using a formula that yields a checksum. The checksum is then appended to the end of the data block for verification of data integrity when needed. [Pg.1588]
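
As a hedged illustration of the checksum idea (using a simple CRC32 rather than the Hamming ECC a real RAID level 2 controller applies), the sketch below appends a checksum to a data block on write and recomputes it on read to verify data integrity.

```python
import struct
import zlib

def write_block(data: bytes) -> bytes:
    """Append a 4-byte CRC32 checksum to the end of the data block."""
    return data + struct.pack(">I", zlib.crc32(data))

def read_block(block: bytes) -> bytes:
    """Recompute the checksum and return the payload, or raise on corruption."""
    payload, stored = block[:-4], struct.unpack(">I", block[-4:])[0]
    if zlib.crc32(payload) != stored:
        raise ValueError("checksum mismatch: data block is corrupted")
    return payload

if __name__ == "__main__":
    block = write_block(b"example sector contents")
    assert read_block(block) == b"example sector contents"
    try:
        read_block(b"X" + block[1:])   # simulate a corrupted first byte
    except ValueError as exc:
        print(exc)
```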

When the contamination of foodstuffs and beverages has been assessed by means of models, calculation of the source-related doses is straightforward. Modification of the contamination of foodstuffs due to food processing and cooking practices may be taken into account at this stage of the dose calculation, but cautious verification of data is necessary. [Pg.79]
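
The arithmetic behind such a source-related dose calculation can be sketched as follows; all intakes, concentrations, processing factors and dose coefficients below are assumed illustrative values, not data from the source.

```python
# Ingestion dose = sum over foodstuffs of
#   annual intake x modelled concentration x processing factor x dose coefficient.
foods = {
    #                     intake   conc.    processing  dose coefficient
    #                     (kg/a)   (Bq/kg)  factor (-)  (Sv/Bq)
    "leafy vegetables": (  40.0,   12.0,    0.8,        1.3e-8),
    "milk":             ( 120.0,    3.5,    1.0,        1.3e-8),
}

dose_sv = sum(intake * conc * proc * coeff
              for intake, conc, proc, coeff in foods.values())
print(f"Source-related ingestion dose: {dose_sv * 1e6:.1f} microsievert per year")
```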

However, regardless of how much validation and verification of data is done, in practice the use of the absolute-value predictions of common QRA models (the technique most often used within the railway industry is the fault tree) in safety reasoning is very arduous to assure and justify. [Pg.178]
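
To make the point concrete, the sketch below shows where the absolute-value prediction of a fault tree comes from, using invented basic-event probabilities: AND gates multiply probabilities and OR gates combine them as one minus the product of the complements. It is precisely these absolute numbers that are hard to assure and justify.

```python
def and_gate(*probabilities: float) -> float:
    """All inputs must occur: multiply the basic-event probabilities."""
    result = 1.0
    for p in probabilities:
        result *= p
    return result

def or_gate(*probabilities: float) -> float:
    """Any input suffices: 1 minus the product of the complements."""
    survive = 1.0
    for p in probabilities:
        survive *= (1.0 - p)
    return 1.0 - survive

# Hypothetical tree: top event if (detector fails AND backup fails) OR operator error.
p_top = or_gate(and_gate(1e-3, 5e-2), 1e-4)
print(f"Predicted top-event probability: {p_top:.2e}")
```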

Re-verification of data and recovery of documents confirming the execution of the service. Re-labeling and specification of appropriate dimensions; entry of correct data into the system... [Pg.2421]

Checking and verification of data that is fed into the computer. [Pg.168]

Sequence number; time-out in reception; identification of the sender and receiver; verification of data consistency. [Pg.407]
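
A hedged sketch of a message carrying the fields listed above (the exact layout is an assumption, not the protocol from the source): a sequence number, sender and receiver identification, and a digest used to verify data consistency, with a reception timeout applied while waiting for the message.

```python
import hashlib
from dataclasses import dataclass

RECEPTION_TIMEOUT_S = 0.5   # assumed time-out in reception before giving up

@dataclass
class Message:
    sequence_number: int
    sender_id: str
    receiver_id: str
    payload: bytes
    digest: bytes = b""

    def seal(self) -> "Message":
        """Compute and store the digest before transmission."""
        self.digest = self._compute_digest()
        return self

    def is_consistent(self) -> bool:
        """Verify data consistency on reception."""
        return self.digest == self._compute_digest()

    def _compute_digest(self) -> bytes:
        h = hashlib.sha256()
        h.update(str(self.sequence_number).encode())
        h.update(self.sender_id.encode())
        h.update(self.receiver_id.encode())
        h.update(self.payload)
        return h.digest()

if __name__ == "__main__":
    msg = Message(42, "NODE-A", "NODE-B", b"status: ok").seal()
    print("Consistent on reception:", msg.is_consistent())
```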

