Verification data study

A final point concerns the value of earlier (old) validation data for current measurements. In a study of the sources of error in trace analysis, Horwitz et al. showed that systematic errors are rare and that the majority of errors are random. In other words, the performance of a laboratory varies with time, because instruments, staff, chemicals, etc. change over time, and these are the main sources of performance variation. Consequently, current performance verification data must be generated to establish method performance for all analytes and matrices for which results will be reported. [Pg.131]

Specifications for the finished product:
- Two specifications: at release and at end of shelf-life
- General characteristics, specific standards, tests and limits for results for the finished product must be provided
- Analytical test procedures described (physicochemical properties, identity of API)
- Quantitative determination of the active, deviations, purity tests, pharmaceutical tests, colouring, antimicrobial or chemical preservatives, results of validation studies, and comments on the choice of routine tests and standards provided
- Copy of pharmacopoeia monograph and verification data
- Results of batch analysis (incl. date of manufacture, place of manufacture, batch size and use of batch tested) ... [Pg.309]

The trimetallic uranyl cluster (UO2)3(CO3)3 has been the subject of a good deal of study, including NMR spectroscopy (179-182), solution X-ray diffraction (182), potentiometric titration (177,183,184), single-crystal X-ray diffraction (180), and EXAFS spectroscopy in both the solid and solution states (180). The data in this area have consistently led to the proposal and verification of a trimeric (UO2)3(CO3)3 cluster (181,182,185). [Pg.327]

In this study detailed fault trees with probability and failure rate calculations were generated for the events (1) Fatality due to Explosion, Fire, Toxic Release or Asphyxiation at the Process Development Unit (PDU) Coal Gasification Process and (2) Loss of Availability of the PDU. The fault trees for the PDU were synthesized by Design Sciences, Inc., and then subjected to multiple reviews by Combustion Engineering. The steps involved in hazard identification and evaluation, fault tree generation, probability assessment, and design alteration are presented in the main body of this report. The fault trees, cut sets, failure rate data and unavailability calculations are included as attachments to this report. Although both safety and reliability trees have been constructed for the PDU, the verification and analysis of these trees were not completed as a result of the curtailment of the demonstration plant project. Certain items not completed for the PDU risk and reliability assessment are listed. [Pg.50]
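
As a minimal sketch of the cut-set arithmetic behind such fault tree probability and unavailability calculations, the Python snippet below estimates a top-event probability from minimal cut sets of basic-event probabilities using the rare-event approximation; the event names and probability values are invented placeholders, not data from the PDU study.

```python
from math import prod

# Hypothetical basic-event probabilities (per demand); not values from the PDU study.
basic_events = {
    "valve_fails_open": 1e-3,
    "relief_system_fails": 5e-4,
    "gas_detector_fails": 2e-3,
    "operator_misses_alarm": 1e-2,
}

# Minimal cut sets: every basic event in one inner tuple must fail for the top event to occur.
minimal_cut_sets = [
    ("valve_fails_open", "relief_system_fails"),
    ("valve_fails_open", "gas_detector_fails", "operator_misses_alarm"),
]

def cut_set_probability(cut_set):
    """Probability that all basic events in one minimal cut set occur (independence assumed)."""
    return prod(basic_events[event] for event in cut_set)

# Rare-event approximation: top-event probability is roughly the sum over minimal cut sets.
top_event = sum(cut_set_probability(cs) for cs in minimal_cut_sets)
print(f"Estimated top-event probability per demand: {top_event:.2e}")
```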

As this kind of verification of classical J-diffusion theory is crucial, the remarkable agreement obtained sounds rather convincing. From this point of view any additional experimental treatment of nitrogen is very important. A vast body of data was recently obtained by Jameson et al. [270] for pure nitrogen and several buffer solutions. This study repeats the gas measurements of [81] with improved experimental accuracy. Although T1 was measured in [270] instead of T2 as in [81], at 150 amagat and 300 K and at high densities both relaxation times coincide within the limits of experimental accuracy. [Pg.221]

This theoretical result is completely substantiated by experiment. Goldschmidt,31 from a study of crystal structure data, observed that the radius ratio is large for fluorite-type crystals and small for those of the rutile type, and concluded as an empirical rule that this ratio is the determining factor in the choice between these structures. Using Wasastjerna's radii he decided on 0.67 as the transition ratio. He also stated that this can be explained as due to anion contact for a radius ratio smaller than about 0.74. With our radii we are able to show an even more satisfactory verification of the theoretical limit. In Table XVII are given values of the radius ratio for a large number of compounds. It is seen that the max-... [Pg.276]
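
As a rough numerical illustration of the radius-ratio rule described above, the short Python sketch below classifies a few AX2 compounds by comparing the cation/anion radius ratio with the approximately 0.67 transition value quoted in the excerpt; the ionic radii used are approximate Pauling values chosen for demonstration, not values taken from the original Table XVII.

```python
# Rough illustration of the radius-ratio rule for AX2 structures (fluorite vs. rutile).
# Radii below are approximate Pauling ionic radii in angstroms (assumed example values).
RADII = {
    "Ca2+": 0.99, "Sr2+": 1.13, "Ti4+": 0.68, "Sn4+": 0.71,
    "F-": 1.36, "O2-": 1.40,
}

TRANSITION_RATIO = 0.67  # empirical transition ratio cited in the text


def predicted_structure(cation: str, anion: str) -> str:
    """Predict fluorite vs. rutile type from the cation/anion radius ratio."""
    ratio = RADII[cation] / RADII[anion]
    structure = "fluorite type" if ratio > TRANSITION_RATIO else "rutile type"
    return f"{cation}/{anion}: ratio = {ratio:.2f} -> {structure}"


if __name__ == "__main__":
    for pair in [("Ca2+", "F-"), ("Sr2+", "F-"), ("Ti4+", "O2-"), ("Sn4+", "O2-")]:
        print(predicted_structure(*pair))
```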

Audits of each phase of the study should include personnel training, preparation of collection forms, application calibration, each sample collection procedure, sample transport, each type of chemical analysis, data recording, data entry, data verification and data storage. Data collection in the field is often tedious if automated logging devices are not in place. To ensure data integrity, the paper and ink used for field studies should be waterproof. Each data collection form should contain appropriate locations for information detailing the time and location of sample collection, sample transport and sample analysis. Data collection forms should be stored in an orderly fashion in a secure location immediately upon the field teams' return from the field at the end of each day. It is also important for data quality that studies collect the necessary field data seven days per week when required. In our experience, poor study quality is likely when field sample and data collection do not proceed on weekends. [Pg.946]

Each data point must be transferred from data sheets into spreadsheets or databases. Verification of each datum should be performed by an individual who did not enter the data being verified. Audits of each phase of the study should be performed (i.e. preparation of collection forms, application calibration, each type of sample collection, sample transport, each type of chemical analysis, data recording, data entry, data verification and data storage). [Pg.946]
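
As a minimal sketch of the datum-by-datum verification described above, the following Python snippet compares a transcribed spreadsheet against the original data sheets (represented here as two CSV files sharing a sample_id key) and reports every discrepancy for independent review; the file names and column layout are illustrative assumptions, not part of any particular study's records.

```python
import csv


def load_records(path: str, key: str = "sample_id") -> dict[str, dict[str, str]]:
    """Read a CSV file into a dict keyed by sample identifier."""
    with open(path, newline="") as handle:
        return {row[key]: row for row in csv.DictReader(handle)}


def find_discrepancies(original_path: str, transcribed_path: str) -> list[str]:
    """Compare the transcribed spreadsheet against the original data sheets."""
    original = load_records(original_path)
    transcribed = load_records(transcribed_path)
    problems = []
    for sample_id, orig_row in original.items():
        trans_row = transcribed.get(sample_id)
        if trans_row is None:
            problems.append(f"{sample_id}: missing from transcribed file")
            continue
        for field, orig_value in orig_row.items():
            if trans_row.get(field, "").strip() != orig_value.strip():
                problems.append(
                    f"{sample_id}/{field}: sheet='{orig_value}' entry='{trans_row.get(field)}'"
                )
    return problems


if __name__ == "__main__":
    # File names below are placeholders for the study's own records.
    for issue in find_discrepancies("data_sheets.csv", "spreadsheet_entry.csv"):
        print(issue)
```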

GLP regulations require QA personnel to inspect/audit each study conducted, but the extent to which QA personnel are involved in software development and the validation/verification process varies from company to company. In some companies, there is little or no QA involvement in these processes, whereas in others QA personnel are involved. QA personnel can provide assistance in the area of vendor audits for purchased software or can conduct inspections of in-house software development to ensure that internal procedures are being followed. QA personnel, who conduct in-process inspections and review the resulting data and validation report for accuracy, could provide inspection support during the validation and verification process. During system development and validation, properly trained QA personnel can provide the regulatory advice needed to ensure that the system will meet government standards. QA personnel become more familiar with the system(s) that will be used when they are involved early in the validation process. [Pg.1048]

Besides the experimental data mentioned above, the kinetic dependencies of adsorption of various metals on oxides are also of great interest. These dependencies have been evaluated, using the sensor method, from the variation of the conductivity of the sensitive element (a zinc oxide film). The deduced dependencies and their experimental verification proved that, for small occupation of the film surface by metal atoms, Boltzmann statistics can be used in calculations concerning the conduction electrons of the semiconductor, and that the surface charge effect, as well as the effect of aggregation of adsorbed atoms, can be disregarded in the theoretical description of adsorption and ionization of adsorbed metal atoms. Considering the equilibrium vapour method, the study [32] shows that... [Pg.191]

For either of the ternary complex mechanisms described above, titration of one substrate at several fixed concentrations of the second substrate yields a pattern of intersecting lines when presented as a double reciprocal plot. Hence, without knowing the mechanism from prior studies, one cannot distinguish between the two ternary complex mechanisms presented here on the basis of substrate titrations alone. In contrast, the data for a double-displacement reaction yield a series of parallel lines in the double reciprocal plot (Figure 2.15). Hence it is often easy to distinguish a double-displacement mechanism from a ternary complex mechanism in this way. Also, it is often possible to run the first half of the reaction in the absence of the second substrate. Formation of the first product is then evidence in favor of a double-displacement mechanism (however, some caution must be exercised here, because other mechanistic explanations for such data can be invoked; see Segel, 1975, for more information). For some double-displacement mechanisms the intermediate E-X complex is sufficiently stable to be isolated and identified by chemical and/or mass spectrometric methods. In these favorable cases the identification of such a covalent E-X intermediate is verification of the reaction mechanism. [Pg.45]
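
To make the distinction concrete, the short Python sketch below computes double-reciprocal (Lineweaver-Burk) slopes for textbook rate laws of a sequential (ternary complex) mechanism and a ping-pong (double-displacement) mechanism; the kinetic constants are arbitrary illustrative values, not data for any particular enzyme.

```python
import numpy as np

# Illustrative kinetic constants (arbitrary values chosen for demonstration).
VMAX, KA, KB, KIA = 100.0, 2.0, 5.0, 4.0


def v_sequential(a, b):
    """Steady-state rate law for a sequential (ternary complex) mechanism."""
    return VMAX * a * b / (KIA * KB + KB * a + KA * b + a * b)


def v_ping_pong(a, b):
    """Steady-state rate law for a ping-pong (double-displacement) mechanism."""
    return VMAX * a * b / (KB * a + KA * b + a * b)


def reciprocal_slope(rate_law, b_fixed):
    """Slope of 1/v versus 1/[A] at a fixed concentration of the second substrate."""
    a = np.linspace(0.5, 20.0, 50)
    inv_a, inv_v = 1.0 / a, 1.0 / rate_law(a, b_fixed)
    slope, _intercept = np.polyfit(inv_a, inv_v, 1)
    return slope


if __name__ == "__main__":
    for b in (1.0, 2.0, 5.0, 20.0):
        print(f"[B] = {b:5.1f}  sequential slope = {reciprocal_slope(v_sequential, b):.3f}  "
              f"ping-pong slope = {reciprocal_slope(v_ping_pong, b):.3f}")
    # Sequential slopes change with [B] (intersecting lines); ping-pong slopes stay
    # constant at KA/VMAX (parallel lines), matching the pattern described in the text.
```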

With respect to method application, once validation has been satisfactorily completed, there is little question that use of the analytical method in worker safety and re-entry studies falls under the full requirements of the GLP Standards. In addition, there should be an adequate level of quality control measurements taken in conjunction with the specimens so as to provide for a meaningful assessment of accuracy and precision, as well as verification of freedom from artifactual interferences. Along with these measurements, there need to be reasonably rigid data acceptance criteria in place (usually established during validation) which are consistently applied during the course of the specimen analytical phase of the study. [Pg.159]
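
As an illustrative sketch of the kind of quality control assessment mentioned above, the Python snippet below computes mean recovery (accuracy) and relative standard deviation (precision) from fortified QC specimens and checks them against example acceptance criteria; the fortification level, measured values, and the 70-120% recovery / 20% RSD limits are assumptions for demonstration, not criteria from any particular validation.

```python
from statistics import mean, stdev


def qc_assessment(measured, fortified_level, recovery_limits=(70.0, 120.0), max_rsd=20.0):
    """Summarize accuracy (mean recovery) and precision (RSD) of fortified QC specimens."""
    recoveries = [100.0 * m / fortified_level for m in measured]
    mean_recovery = mean(recoveries)
    rsd = 100.0 * stdev(recoveries) / mean_recovery
    accepted = recovery_limits[0] <= mean_recovery <= recovery_limits[1] and rsd <= max_rsd
    return mean_recovery, rsd, accepted


if __name__ == "__main__":
    # Hypothetical QC results: specimens fortified at 0.50 ug/mL.
    measured = [0.46, 0.48, 0.44, 0.51, 0.47]
    recovery, rsd, ok = qc_assessment(measured, fortified_level=0.50)
    print(f"mean recovery = {recovery:.1f}%  RSD = {rsd:.1f}%  acceptable = {ok}")
```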

A dramatic example of this type of error was discussed by Donigian (6) at the Pellston workshop, based on the Iowa study described earlier (8). Figure 3 shows the calibration (top figure, 1978 data) and verification (bottom figure, 1978) results. A simulated alachlor concentration value of greater than 0.1 mg/l occurred on May 27, 1978 (top figure), whereas the observed... [Pg.161]

Frequency domain performance has been analyzed with goodness-of-fit tests such as the Chi-square, Kolmogorov-Smirnov, and Wilcoxon Rank Sum tests. The studies by Young and Alward (14) and Hartigan et al. (13) demonstrate the use of these tests for pesticide runoff and large-scale river basin modeling efforts, respectively, in conjunction with the paired-data tests. James and Burges (16) discuss the use of the above statistics and some additional tests in both the calibration and verification phases of model validation. They also discuss methods of data analysis for detection of errors; this last topic needs additional research in order to consider uncertainties in the data which provide both the model input and the output to which model predictions are compared. [Pg.169]
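
As a minimal sketch of applying such goodness-of-fit tests during model verification, the Python snippet below compares simulated and observed series with the Kolmogorov-Smirnov and Wilcoxon signed-rank tests from scipy.stats; the synthetic data are placeholders standing in for paired model output and field observations.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# Placeholder paired data: observed daily runoff concentrations and model predictions.
observed = rng.lognormal(mean=0.0, sigma=0.5, size=60)
simulated = observed * rng.normal(loc=1.05, scale=0.15, size=60)  # model with mild bias and noise

# Two-sample Kolmogorov-Smirnov test: do the two series share a distribution?
ks_stat, ks_p = stats.ks_2samp(observed, simulated)

# Wilcoxon signed-rank test on the paired differences: is the model systematically biased?
w_stat, w_p = stats.wilcoxon(observed, simulated)

print(f"KS statistic = {ks_stat:.3f}, p = {ks_p:.3f}")
print(f"Wilcoxon statistic = {w_stat:.1f}, p = {w_p:.3f}")
# Small p-values would flag a mismatch worth investigating before the model is accepted
# as verified; the data and any acceptance thresholds here are purely illustrative.
```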

We thus see that the electronic theory of heterogeneous photocatalytic reactions not only attempts to explain, from a unified point of view, a large amount of experimental data, often contradictory at first glance, but also predicts new effects awaiting experimental verification. No doubt, the photocatalytic effect on semiconductors, which has only recently become the subject of scientific research, requires further experimental and theoretical study. [Pg.206]

Data on the NOAELs and LOAELs for death are presented in Table 2-1 for several animal species. The data presented indicate that species differences exist with respect to acute lethal effects. Dogs appear to be the most susceptible species, but this conclusion is based on studies involving only a few animals. The cause of death varied among test species. In guinea pigs, death resulted from pulmonary irritation (see Section 2.2.1.2), while in the other species convulsions and coma occurred (see Section 2.2.1.4) (Dudley and Neal 1942). It should be noted that this study is based on nominal concentrations without analytical verification. [Pg.24]

Figure 12.3. Benchmark of peer-reviewed academic reports of organic semiconductor device field-effect mobility versus time of report. All data points are for spin-coated organic semiconducting transistors. Solid points are derived from the benchmark study completed in 2002 by Brazis and Dyrc at Motorola (unpublished). The curve is a calculated estimation, based on these data, of what the expected mobility values will be in the future. The open points are data derived in 2005 from the public journals for verification of the 2002 prediction (6, 38). [Pg.382]

All mathematical models require some assumed data on the source of release for a material. These assumptions form the input data, which are then easily placed into a mathematical equation. The assumed data are usually the size or rate of the mass released, wind direction, etc. They cannot possibly take into account all the variables that might exist at the time of the incident. Unfortunately, most of the mathematical equations are also still based on empirical studies, laboratory results or, in some cases, TNT explosion equivalents. Therefore they still need considerable verification against tests and simulations before they can be fully accepted as valid. [Pg.53]
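
As a hedged illustration of the TNT-equivalence approach mentioned above, the Python sketch below converts a released fuel mass into an equivalent TNT mass and a Hopkinson-scaled distance; the yield factor, heat of combustion, and release mass are assumed example values, and a real consequence model would read overpressure from validated empirical blast curves rather than stop at the scaled distance.

```python
# Simple TNT-equivalence sketch (assumed example values throughout).
E_TNT = 4.68e6               # blast energy of TNT, J/kg (commonly used value)
YIELD_FACTOR = 0.03          # explosion yield (efficiency) factor, assumed for a vapour cloud
HEAT_OF_COMBUSTION = 45.0e6  # J/kg, typical of a hydrocarbon fuel (assumed)


def tnt_equivalent_mass(released_mass_kg: float) -> float:
    """Equivalent TNT mass for a flammable release: W = eta * M * Hc / E_TNT."""
    return YIELD_FACTOR * released_mass_kg * HEAT_OF_COMBUSTION / E_TNT


def scaled_distance(distance_m: float, tnt_mass_kg: float) -> float:
    """Hopkinson-Cranz scaled distance Z = R / W**(1/3), in m/kg^(1/3)."""
    return distance_m / tnt_mass_kg ** (1.0 / 3.0)


if __name__ == "__main__":
    released = 1000.0  # kg of fuel released (assumed)
    w_tnt = tnt_equivalent_mass(released)
    print(f"Equivalent TNT mass: {w_tnt:.0f} kg")
    for r in (50.0, 100.0, 200.0):
        print(f"R = {r:6.1f} m  ->  Z = {scaled_distance(r, w_tnt):6.2f} m/kg^(1/3)")
    # Overpressure at each Z would be read from an empirical blast curve; as the text
    # notes, such correlations still need verification against tests before being
    # accepted as valid for a given scenario.
```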

