Big Chemical Encyclopedia


Data generation methods

The purpose of this article is to clarify the assessment of residue analytical methods in the context of Directive 91/414/EEC. After discussing the legal and historical background, requirements for enforcement methods as well as data generation methods are reviewed. Finally, an outlook on further developments in the assessment and validation of analytical methods is provided. [Pg.15]

Commission Directive 96/46/EC of 16 July 1996, amending Annex II to Directive 91/414/EEC, is the basis for the assessment of residue analytical methods for crops, food, feed, and environmental samples. Provisions of this Directive cover methods required for post-registration control and monitoring purposes but not data generation methods. Because it is necessary to provide applicants as precisely as possible with details on the required information, the guidance document SANCO/825/00 rev. 6, dated 20 June 2000 (formerly 8064/VI/97 rev. 4, dated 5 December 1998), was elaborated by the Commission Services in cooperation with the Member States. [Pg.20]

Guidance document SANCO/3029/99 (2000): details concerning data generation methods... [Pg.20]

In many cases, the methods used to solve identification problems are based on iterative minimization of some performance criterion measuring the dissimilarity between the experimental data and the synthetic data (generated by the current estimate of the direct model). In our case, direct quantitative comparison of two Bscan images at the pixel level is a very difficult task and involves the solution of a very difficult optimization problem, which can also be ill-behaved. Moreover, it would impose a tremendous computational burden. Segmented Bscan images may be used as concentrated representations of the useful... [Pg.172]
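The identification loop described above can be sketched in a few lines: a direct model generates synthetic data from the current parameter estimate, a criterion measures the dissimilarity against the experimental data, and the parameters are updated iteratively to reduce it. The forward model, starting values, and data below are hypothetical illustrations, not the Bscan problem from the text.

```python
def forward_model(params, xs):
    """Synthetic data generated by the current estimate of the direct model."""
    a, b = params
    return [a * x + b for x in xs]

def dissimilarity(params, xs, observed):
    """Performance criterion: sum of squared differences to the experimental data."""
    synthetic = forward_model(params, xs)
    return sum((s - o) ** 2 for s, o in zip(synthetic, observed))

def identify(xs, observed, start=(0.0, 0.0), step=1.0, iters=200):
    """Simple derivative-free coordinate search that minimizes the criterion."""
    params = list(start)
    best = dissimilarity(params, xs, observed)
    for _ in range(iters):
        improved = False
        for i in range(len(params)):
            for delta in (step, -step):
                trial = params[:]
                trial[i] += delta
                cost = dissimilarity(trial, xs, observed)
                if cost < best:
                    params, best, improved = trial, cost, True
        if not improved:
            step /= 2.0  # refine the search once no move helps
    return params, best

xs = [0, 1, 2, 3, 4]
observed = [1.0, 3.0, 5.0, 7.0, 9.0]  # generated by a = 2, b = 1
params, cost = identify(xs, observed)
```

Real identification problems replace the toy linear model with an expensive physical simulation, which is why the text stresses the computational burden of each criterion evaluation.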

Measurements have been made in a static laboratory set-up. A simulation model for generating supplementary data has been developed and verified. A statistical data treatment method has been applied to estimate tracer concentration from detector measurements. Accuracy in parameter estimation in the range of 5-10% has been obtained. [Pg.1057]

Real time. A data-acquisition method in which the mass spectra are generated within the same time frame as the original experiment. [Pg.431]

The simulation models of the flow-sheeting system must make frequent requests for properties at specific temperatures, pressures, and compositions. Computer-program calls for such data are usually made in a rigorously defined manner, which is independent of both the point data generation models and the particular components. These point generation routines provide the property values, using selected methods that base their calculations on a set of parameters for each component. [Pg.76]
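A point property-generation routine of the kind described above can be sketched as a function that, given a component and state conditions, computes a property value from that component's stored parameter set. The Antoine coefficients below are commonly tabulated illustrative values, not part of the original text, and a real flowsheeting system would dispatch among many property methods.

```python
import math

# Per-component parameter sets: Antoine coefficients (A, B, C) for
# log10(P / mmHg) with T in degrees Celsius. Illustrative values only.
ANTOINE = {
    "water":   (8.07131, 1730.63, 233.426),
    "ethanol": (8.20417, 1642.89, 230.300),
}

def vapor_pressure_mmhg(component: str, temp_c: float) -> float:
    """Point generation: a property value computed from the component's parameters."""
    a, b, c = ANTOINE[component]
    return 10.0 ** (a - b / (temp_c + c))

# The simulation model requests a property at a specific temperature,
# independently of how the routine computes it internally.
p = vapor_pressure_mmhg("water", 100.0)  # close to 760 mmHg at the normal boiling point
```

The call signature depends only on the component name and the state point, mirroring the text's point that such calls are rigorously defined and independent of the underlying generation model.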

The intention of this chapter has been to provide an overview of analytical methods for predicting and reducing human error in CPI tasks. The data collection methods and ergonomics checklists are useful in generating operational data about the characteristics of the task, the skills and experience required, and the interaction between the worker and the task. Task analysis methods organize these data into a coherent description or representation of the objectives and work methods required to carry out the task. This task description is subsequently utilized in human error analysis methods to examine the possible errors that can occur during a task. [Pg.200]

If solutions are generated without reference to experimental data, the methods are usually called ab initio (Latin: "from the beginning"), in contrast to semi-empirical models, which are described in Section 3.9. [Pg.53]

The number and complexity of the variables present in these test methods are related to the level of sophistication of the test. The combination of variables that can influence test data defines the test's limitations. Variables are found not only in the test methods themselves, but also in other non-test-related areas affecting data generation; for example, misinterpretation, misuse, or misapplication of the test or any of its integral parts (test setup, test procedure, reporting, etc.) all contribute to these limitations (2 to 11, 64, 208). [Pg.304]

When a diffracted X-ray beam hits a data collection device, only the intensity of the reflection is recorded. The other vital piece of information is the phase of the reflected X-ray beam. It is the combination of the intensity and the phase of the reflections that is needed to unravel the contributions made to the diffraction by the electrons in different parts of the molecule in the crystal. This so-called phase problem has been a challenge for theoretical crystallographers for many decades. For practical crystallography, there are four main methods for phasing the data generated from a particular crystal. [Pg.282]
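The phase problem described above can be made concrete with a small numeric illustration: because the detector records only intensities |F|², two different arrangements can produce identical diffraction records. The pure-Python discrete Fourier transform and the example sequences below are illustrative assumptions, not crystallographic data.

```python
import cmath

def dft(seq):
    """Discrete Fourier transform of a real sequence (pure Python)."""
    n = len(seq)
    return [sum(seq[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def intensities(seq):
    """What the detector records: |F|^2, with the phase discarded."""
    return [abs(f) ** 2 for f in dft(seq)]

a = [1.0, 2.0, 3.0, 4.0]
b = [4.0, 1.0, 2.0, 3.0]  # the same "structure", cyclically shifted
ia, ib = intensities(a), intensities(b)

# A cyclic shift only multiplies each Fourier coefficient by a phase factor,
# so the recorded intensities agree even though the phases differ.
same = all(abs(x - y) < 1e-9 for x, y in zip(ia, ib))
```

Recovering the lost phases, and hence distinguishing such cases, is exactly what the four phasing methods mentioned in the text are designed to do.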

This chapter has provided an overview of the main issues for computing and computational methods to support this work. For the past decade or so, the main limitations that have emerged are not in the amount or type of computational hardware that is available. The real issues are in providing a computational environment for informatics support and streamlining of the calculations. It is here that major efforts are still required to ensure effective integration of the methods and data generated into the drug discovery process. [Pg.296]

In the earliest SFG experiments [Tadjeddine, 2000; Guyot-Sionnest et al., 1987; Hunt et al., 1987; Zhu et al., 1987], a first-generation data acquisition method was used, and, because of the limited signal-to-noise ratios, IR attenuation by the electrolyte solution was a substantial handicap. So, in earlier SFG studies, as in IRAS studies, measurements were performed with the electrode pressed directly against the optical window [Baldelli et al., 1999; Dederichs et al., 2000]. With the in-contact geometry, the electrolyte was a thin film of uncertain and variable depth, probably of the order of 1 μm. However, the thin nonuniform electrolyte layers can strongly distort the potential/coverage relationship and hinder the ability to study fast kinetics. [Pg.378]

The non-NADA method trial process mirrors the NADA process. Methods are developed, reviewed for scientific and technical soundness, and validated in multiple laboratories, and the data generated are analyzed to determine if the method is suitable for its intended use. [Pg.79]

Guidelines for acceptability of NADA and non-NADA methods are the same. For the determinative procedure, the criteria described in Method Criteria for accuracy and precision are used to evaluate data generated at participating laboratories. There are no criteria for accuracy in the analysis of the incurred residue samples; however, the overall data set is reviewed to see if there is general agreement between the results obtained by the contract laboratories and the levels reported in the sponsor's laboratory. [Pg.93]
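An evaluation of the kind described above can be sketched as a summary of replicate recovery data per laboratory: mean recovery for accuracy and relative standard deviation for precision, checked against acceptance limits. The replicate values and the numeric limits below are hypothetical placeholders, not the actual criteria from Method Criteria.

```python
import statistics

def evaluate(recoveries_pct, mean_limits=(80.0, 110.0), max_rsd=20.0):
    """Mean recovery (accuracy) and %RSD (precision), with a pass/fail flag.

    mean_limits and max_rsd are illustrative acceptance limits.
    """
    mean = statistics.mean(recoveries_pct)
    rsd = 100.0 * statistics.stdev(recoveries_pct) / mean
    ok = mean_limits[0] <= mean <= mean_limits[1] and rsd <= max_rsd
    return mean, rsd, ok

# Replicate recoveries (%) reported by one participating laboratory.
mean, rsd, ok = evaluate([92.0, 88.0, 95.0, 90.0, 85.0])
```

For incurred residue samples, where no accuracy criterion applies, the analogous step would simply compare each laboratory's summary against the sponsor's reported levels rather than against fixed limits.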

Dilution of concentration is acceptable if calibration, accuracy, and precision remain acceptable. Data generated during tests of other characteristics may satisfy this requirement. Where the method does not permit recovery to be estimated, accuracy and precision are those of calibration. Storage times should reflect those likely to be required... [Pg.118]

Standard Operating Procedures (160.81): methods in writing that management is satisfied are adequate to ensure the quality and integrity of data generated in the course of a study. [Pg.971]

