Big Chemical Encyclopedia


Validation of data

The most complicated aspect of the CHIRBASE project effort is the actual incorporation and validation of data. This is largely due to the complexity of the problem and to the difficulty of extracting and interpreting the relevant information, since the vast majority of useful data is scattered throughout published papers rather than available in a user-readable or computer-readable form. [Pg.100]

The new millennium professional pharmacist, with rigorous training in the scientific and clinical properties of drug substances and pharmaceutical formulations, should be well able to dispassionately evaluate bioequivalence data and give an objective, professional opinion as to the validity of data purported to demonstrate generic bioequivalence. [Pg.748]

Several other successful applications of the low-temperature procedure to the thermal control and analysis of multistep enzyme reactions could be described. We prefer to cite appropriate papers (Douzou, 1974, 1977a,b; Fink, 1976a) and to discuss two important problems raised by the present procedure, namely the validity of data obtained in such bizarre media and the necessity of obtaining suitable data on the conformational changes in proteins during their reaction pathways. [Pg.267]

It is widely understood within the industry that risk is defined as the combination of the probability of harm and the severity of that harm. Within the pharmaceutical industry, whenever risk is considered, the equipment or product being assessed must be viewed in the context of the protection of the patient. From our perspective, analytical instruments may impact on the validity of data that determine the safety and efficacy of drug products, or on the quality of the drug product. They may also impact on the identity or potency of the drug product, and it is therefore important to ensure that risk management is performed throughout the complete life cycle of the instrument. [Pg.172]
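The definition above, risk as the combination of the probability of harm and the severity of that harm, can be illustrated with a minimal scoring sketch. The ordinal 1-3 scales and the classification thresholds below are illustrative assumptions, not taken from any regulatory text:

```python
# Hypothetical risk classification: combine probability and severity,
# each rated on an illustrative ordinal scale (1 = low, 3 = high).
def risk_score(probability: int, severity: int) -> str:
    """Classify risk from ordinal probability and severity ratings."""
    score = probability * severity  # combination of the two factors
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

print(risk_score(3, 3))  # -> high
print(risk_score(1, 2))  # -> low
```

In practice such a classification would feed a documented risk assessment covering the whole instrument life cycle; the point here is only that both factors must be rated before a risk level can be assigned.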

Glover DE, Hall RG, Coston AW, Trilling RJ. Validation of data obtained during exposure of human volunteers to air pollutants. Computers and Biomedical Research 15(3):240-249, 1982. [Pg.239]

Software is not a physical entity and, unlike some hardware failures, software failures occur without advance warning. One of the most common sources of software failure is branching, that is, the ability to execute alternative series of commands based on differing inputs. Branching makes software extremely complex and difficult to validate: an error may occur only in response to a specific input, and until that specific input is introduced the error remains undetected. Since software input can be almost any data, and it is impossible to exercise a program with every possible input, validation of data is extremely difficult; at best, results can be assigned a high confidence level rather than proven correct. The majority of software problems occur as a consequence of errors in the software design and development and are not directly related to the software manufacture: it is simple to manufacture several software copies that work exactly like the original. [Pg.834]
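The branching problem described above can be made concrete: with n independent two-way branches, the number of distinct execution paths grows as 2^n, which is why exhaustive input testing quickly becomes infeasible. A minimal sketch:

```python
# Path explosion: n independent two-way branches yield 2**n execution paths.
def path_count(n_branches: int) -> int:
    """Number of distinct execution paths through n independent if/else branches."""
    return 2 ** n_branches

for n in (10, 20, 30):
    print(f"{n} branches -> {path_count(n)} paths")
```

Even a modest program with 30 independent branches has over a billion paths, so validation must rely on structured testing strategies and design controls rather than exhaustive input coverage.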

For the pristine chemistry studies, which include studies such as hydrolysis, soil and water photolysis, soil dissipation, and rotational crop under environmental fate; metabolism studies; residue studies; and product chemistry studies, such as vapor pressure, octanol-water partition coefficient, and water solubility, the total study is audited. This includes the GLP issues, such as adherence to protocols, SOPs, and record accountability; completeness of raw data; the validation of data points; and the overall scientific issues. [Pg.89]

Completeness is a measure of whether all the data necessary to meet the project objectives have been collected. Completeness is calculated only after the rest of the PARCC parameters have been calculated or qualitatively evaluated to determine the validity of data. It is the final and all-inclusive indicator of data usability. The DQI of completeness enables us to determine whether data of acceptable quality have been collected in sufficient quantity to meet the project objectives. [Pg.44]
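The completeness DQI described above is commonly expressed as the percentage of valid results obtained relative to the number planned in the project objectives. A minimal sketch (the ratio is the standard form; the function name is ours):

```python
# Completeness DQI: valid results obtained as a percentage of results planned.
def completeness_percent(valid_results: int, planned_results: int) -> float:
    """Return completeness as a percentage of the planned data set."""
    if planned_results <= 0:
        raise ValueError("planned_results must be positive")
    return 100.0 * valid_results / planned_results

# Example: 47 valid results out of 50 planned samples.
print(completeness_percent(47, 50))  # -> 94.0
```

The computed percentage is then compared against the project's acceptance criterion to decide whether enough usable data were collected.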

The concept of VAM has been promoted since the start of the programme as a way of encouraging laboratories, and their managers, to adopt best practice. The message is clear: by adopting six straightforward principles, organisations can ensure their results are fit for purpose, demonstrate the validity of data to their customers, and achieve consistency with results obtained elsewhere. These six principles,... [Pg.288]

Finally, there is the need for proper documentation, which can be in written or electronic forms. These should cover every step of the measurement process. The sample information (source, batch number, date), sample preparation/analytical methodology (measurements at every step of the process, volumes involved, readings of temperature, etc.), calibration curves, instrument outputs, and data analysis (quantitative calculations, statistical analysis) should all be recorded. Additional QC procedures, such as blanks, matrix recovery, and control charts, also need to be a part of the record keeping. Good documentation is vital to prove the validity of data. Analyt-... [Pg.27]
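As an illustration of one of the QC procedures mentioned above, control chart limits are conventionally set at the historical mean plus or minus three standard deviations of repeated QC measurements. A minimal sketch, assuming simple Shewhart-style limits:

```python
import statistics

# Shewhart-style control limits from historical QC measurements:
# center line at the mean, action limits at mean +/- k standard deviations.
def control_limits(qc_values, k=3.0):
    """Return (lower limit, center line, upper limit) for a control chart."""
    mean = statistics.fmean(qc_values)
    sd = statistics.stdev(qc_values)
    return mean - k * sd, mean, mean + k * sd

# Example: repeated measurements of a QC check sample.
lower, center, upper = control_limits([9.8, 10.1, 10.0, 9.9, 10.2])
print(lower, center, upper)
```

Each new QC result is plotted against these limits; points falling outside them flag the run for investigation before the associated data are accepted, and the record of the chart itself forms part of the documentation.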

Development of the validation process for IR data based on the electronic version as the master. Experience with the MS data so far has shown that validation of data on the basis of the electronic version is much easier and more practical. Efforts are underway to identify software that can be used to manage the electronic IR data in order to facilitate its validation on the basis of the electronic version. [Pg.145]

Zomp of the Science and Technology Center are also thanked for their contributions. Correspondence with Dr. David Jones of the Air Force Materials Laboratory regarding the validation of data was very helpful. [Pg.91]

The validation of data capture should include the following ... [Pg.546]

For stabilizers from the group of HALS, a value of M in the range of 3000 to 5000 was reported as the optimum [316]. The data were obtained with PP tapes (50 μm) doped with poly(1,2,2,6,6-pentamethyl-4-piperidyl acrylate) of different M, and the validity of data thus obtained was extended to all polymeric HALS. It should be mentioned that the application of polymeric stabilizers having lower M assures greater flexibility in application in various polymers. [Pg.148]

Using third-party review and validation of data may be beneficial. [Pg.98]

Batch validation may also be run on demand if immediate validation of data is required. [Pg.555]

Quality control for clinical data within data management includes computerized validation of data in the database and second-pass data entry. These activities are performed to ensure that data are complete. [Pg.556]
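The two activities mentioned above, computerized validation of database entries and second-pass data entry, can be sketched as follows. The field names and acceptance limits are hypothetical, chosen only for illustration:

```python
# Hypothetical sketch of two clinical data-management QC steps.
def validate_record(record: dict) -> list:
    """Rule-based validation of a database record; returns a list of errors."""
    errors = []
    if not record.get("subject_id"):
        errors.append("missing subject_id")
    hr = record.get("heart_rate")
    if hr is not None and not (30 <= hr <= 220):  # illustrative range check
        errors.append(f"heart_rate out of range: {hr}")
    return errors

def second_pass_discrepancies(first: dict, second: dict) -> set:
    """Compare first- and second-pass data entry; return fields that differ."""
    return {k for k in first.keys() | second.keys() if first.get(k) != second.get(k)}

print(validate_record({"subject_id": "S1", "heart_rate": 72}))   # -> []
print(second_pass_discrepancies({"a": 1, "b": 2}, {"a": 1, "b": 3}))  # -> {'b'}
```

Records failing either check would be queried back to the site or re-entered before the database is locked.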

Plutonium and Sr were quantified in mosses and lichens by Testa and associates (83). The validity of the data was supported by the adoption of a quality control system and through participation in international intercomparison trials. [Pg.28]

This includes the archiving and curation of data and, as importantly, the validation of data based on quality assurance guidelines to ensure comparability between data sets collected by different countries. Publication of data should be encouraged and data should be freely exchanged. At present there is no established route by which this might occur since, unlike the CCAMLR Secretariat, the CEP has no support at all. However, with the establishment of Web sites for both the Treaty and the CEP, new possibilities are opening up for posting data in a Web archive. An initiative from the CEP seems to be called for here. [Pg.45]

Out-of-laboratory measurements are undertaken across a broad range of industrial and analytical sectors for a variety of reasons: in clinical and medical diagnostics, for the control of chemical and petrochemical production processes, and to monitor emissions and discharges to the environment. The validity of data derived from such measurements is clearly of vital importance, for example to demonstrate compliance with environmental legislation. However, it is particularly difficult to obtain valid and reliable measurements outside the laboratory. The inability to control the environment in which the measurements are made and the use of untrained operators both have the potential to impact significantly on the reliability of data. The situation is made worse by the lack of adequate QA and QC procedures, the shortage of reference materials and calibration standards, and... [Pg.144]

The current state of practice also varies considerably across sectors. Under the VAM programme, the aims are to increase user confidence in the validity of data derived from portable measuring equipment and to improve the reliability and comparability of measurements made using portable equipment. The possibility of establishing formal QA and QC procedures and proficiency testing for out-of-laboratory measurements will also be explored. [Pg.145]

21 CFR Part 11 validation of data acquisition system and laboratory information management system (LIMS)... [Pg.517]

In manual recording, the entries made on a sheet of paper can be dated and signed to attest to the validity of data and to accept responsibility; corrections to them remain visible unless the erasure of the superseded data has been done very artistically. These safeguards have to be retained in the use of computers for data capture, processing and storage, since, e.g., the bits and bytes in computer memory are invisible, and corrections to them will under normal circumstances leave no trace. GLP therefore wants to ensure that data safety and integrity remain the same for electronically as for manually recorded data, irrespective of how they were recorded, and that reconstruction of the way in which the final results and conclusions were obtained remains fully possible. [Pg.198]
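The GLP requirement described above, that corrections remain visible and the original data remain reconstructable, is what an electronic audit trail provides. A minimal sketch, assuming an append-only history in which every correction records the new value, the user, a timestamp, and a reason (the class design is purely illustrative):

```python
import datetime

# Hypothetical audit-trailed data point: corrections never overwrite the
# original value; every change appends (value, user, timestamp, reason).
class AuditedValue:
    def __init__(self, value, user):
        now = datetime.datetime.now(datetime.timezone.utc)
        self.history = [(value, user, now, "initial entry")]

    @property
    def current(self):
        """The currently valid value (last entry in the history)."""
        return self.history[-1][0]

    def correct(self, new_value, user, reason):
        """Append a correction; a reason is mandatory, mirroring GLP practice."""
        if not reason:
            raise ValueError("a reason is required for every correction")
        now = datetime.datetime.now(datetime.timezone.utc)
        self.history.append((new_value, user, now, reason))

v = AuditedValue(7.2, "analyst1")
v.correct(7.4, "analyst1", "transcription error")
print(v.current)        # -> 7.4
print(v.history[0][0])  # -> 7.2 (superseded value still visible)
```

Because the history only grows, the path from raw entry to final value can always be reconstructed, which is the electronic equivalent of a visible, signed manual correction.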

Intercalibration and Quality Control. The large increase in the number of laboratories measuring or attempting to measure fossil fuel compounds in environmental samples raises a very serious question that has been posed many times previously: How comparable are analyses from different laboratories? The validity of data on incorporation of hydrocarbons into surface sediments at the concentration level of 1 mg/g is of prime importance to fate-and-effect studies in coastal and continental shelf areas. It is not difficult to see how this could also be of importance to legal considerations. The need for intercalibration seemed obvious, and a few efforts to establish this practice were completed (15). [Pg.14]

Evaluation and validation of data. Evaluation and validation of the analytical data is only possible if the species are determined quantitatively. The calculation of... [Pg.1667]

