
Databases for Validation

To evaluate the enrichment of a pharmacophore model, a proper set of active and inactive molecules has to be screened virtually. The generation of such validation databases is a difficult task. Ideally, a validation database should contain structurally diverse active compounds and inactive molecules that are structurally related to their [Pg.119]

1) Wolber, G. (2009) Pharmacophore space-based clustering method implemented in LigandScout 3.0, personal communication. [Pg.119]

Obtaining structural data of biologically tested inactive molecules is even more challenging. Only a limited number of inactive molecules have been published in the literature. Thus, some computational chemistry groups have used compounds that are structurally similar to actives but not experimentally tested for biological activity as putative inactive molecules, the so-called decoys. A prominent validation database is the Directory of Useful Decoys (DUD). [Pg.120]

To build this directory, 2950 active molecules for 40 target proteins were compiled from the literature. For each of the 2950 actives, 36 decoys were added to the database; for this purpose, a total of 95 316 decoys were obtained from a set of drug-like, commercially available compounds [62]. However, Irwin [63] reported several benchmark biases (e.g., analogue bias) discovered in the DUD database. Moreover, it cannot be denied that some of the commercial compounds used as decoys have biological activity against the corresponding target protein. [Pg.121]
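As an illustration of how such decoy sets are typically assembled, the sketch below shows a property-matching filter of the kind commonly used: candidate decoys are kept only if they resemble an active in simple bulk descriptors (molecular weight, logP, hydrogen-bond counts). The compound names, descriptor values, and thresholds are made up for illustration and are not DUD's actual selection rules.

```python
# Minimal sketch of property-matched decoy selection (illustrative values only;
# not the actual DUD procedure). A candidate is accepted as a putative decoy if
# its simple physicochemical properties resemble those of a given active.

from dataclasses import dataclass

@dataclass
class Compound:
    name: str
    mol_weight: float   # g/mol
    logp: float         # calculated octanol/water partition coefficient
    hbd: int            # hydrogen-bond donors
    hba: int            # hydrogen-bond acceptors

def property_matched(active: Compound, candidate: Compound,
                     d_mw: float = 25.0, d_logp: float = 0.5, d_hb: int = 1) -> bool:
    """True if the candidate's bulk properties fall within tolerances of the active's."""
    return (abs(active.mol_weight - candidate.mol_weight) <= d_mw
            and abs(active.logp - candidate.logp) <= d_logp
            and abs(active.hbd - candidate.hbd) <= d_hb
            and abs(active.hba - candidate.hba) <= d_hb)

active = Compound("active_1", 342.4, 2.8, 2, 5)
library = [
    Compound("cand_1", 351.0, 3.0, 2, 5),   # similar bulk properties -> kept as decoy
    Compound("cand_2", 198.2, 0.4, 4, 3),   # too dissimilar -> rejected
]
decoys = [c for c in library if property_matched(active, c)]
print([c.name for c in decoys])             # ['cand_1']
```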


Thus far, most in vitro irritation methods, including Skintex, have relied heavily on the vast Draize rabbit skin database for validation. As previously discussed, the discrepancies in the information generated by the Draize system raise questions about the applicability of this information to irritation reactions in man. [Pg.2651]

Swirling flows are inherently three-dimensional. The authors' current PIV system measures only two velocity components. A newly acquired stereoscopic PIV system will be used to characterize the 3D flow field. In addition, pollutant characteristics of high-pressure, partially premixed swirling flames will also be obtained. Together with the PIV velocity and acoustic measurements, a full characterization of both pollutant and noise emissions from such flames will be achieved. It is believed that this will help improve the fundamental understanding of these types of emissions from modern gas-turbine combustors and will provide a valuable database for validating LES models. [Pg.220]

IMES was developed to assist in the selection and evaluation of exposure assessment models and to provide model validation and uncertainty information on various models and their applications. IMES is composed of three elements: (1) Selection, a query system for selecting models in various environmental media; (2) Validation, a database containing validation and other information on applications of models; and (3) Uncertainty, a database demonstrating application of a model uncertainty protocol. [Pg.371]

Our specimen database also contains additional parameters that are used to control the data collection process and to provide archival information for each data file written by the collection process. The console display for editing the specimen database is of the "fill in the form" type, and the user revises the parameters for each specimen position (including the zeroth) as required. New parameter values are checked for validity at the time they are entered. All other parameters retain the values they possessed during the previous set of analyses. Thus, only minor changes are needed to program the system for a set of samples similar to the previous ones. All records in the database can be cleared if the analytical conditions are markedly different. [Pg.134]

The updating function for the run database is another editing program that is used to create an entry in the run database for each data collection task in a batch. The run database entry contains optional descriptive information as well as the parameters required to drive the task. The user edits the parameter values to suit his needs for the tasks to be performed. Parameters in the run database are validated before the run entry is made permanent. [Pg.145]
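A minimal sketch of the "validate at entry time, otherwise keep previous values" behaviour described in the two passages above; the parameter names and allowed ranges are hypothetical and are not those of the actual system.

```python
# Minimal sketch of parameter editing with validation at entry time.
# Parameter names and limits are hypothetical, not those of the actual system.

SPECIMEN_DEFAULTS = {"position": 0, "count_time_s": 100.0, "label": ""}
LIMITS = {"position": (0, 47), "count_time_s": (1.0, 10_000.0)}

def set_parameter(record: dict, name: str, value) -> None:
    """Update one parameter, rejecting out-of-range values before they are stored."""
    if name in LIMITS:
        lo, hi = LIMITS[name]
        if not (lo <= value <= hi):
            raise ValueError(f"{name}={value!r} outside allowed range [{lo}, {hi}]")
    record[name] = value

# Start from the values used in the previous set of analyses (defaults here),
# then change only what differs for the next batch.
specimen = dict(SPECIMEN_DEFAULTS)
set_parameter(specimen, "count_time_s", 200.0)   # accepted
# set_parameter(specimen, "position", 99)        # would raise ValueError
print(specimen)
```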

Generating valid in silico models requires high-quality databases for model training. True values of VD in humans require that the parameters be calculated from pharmacokinetic data measured after intravenous administration. From equation 7 above, calculation of VDss requires that the dose that enters the bloodstream is known, which can only be guaranteed by intravenous... [Pg.484]
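As a worked illustration of why intravenous data are required, the sketch below applies the standard non-compartmental relation VDss = CL × MRT = Dose_iv × AUMC / AUC² to a made-up concentration-time profile. This is a generic textbook calculation, not the "equation 7" referred to in the excerpt, and the data points are illustrative.

```python
# Minimal sketch: non-compartmental estimate of VDss from an intravenous bolus
# concentration-time profile. All data points are made up for illustration;
# extrapolation of AUC/AUMC to infinite time is omitted for brevity.

import numpy as np

t = np.array([0.25, 0.5, 1, 2, 4, 8, 12, 24])                # time, h
c = np.array([9.1, 8.0, 6.4, 4.2, 1.9, 0.45, 0.11, 0.006])   # concentration, mg/L
dose_iv = 100.0   # mg; known exactly only for intravenous administration

auc = np.trapz(c, t)        # area under the curve, mg*h/L (trapezoidal rule)
aumc = np.trapz(c * t, t)   # area under the first-moment curve, mg*h^2/L

cl = dose_iv / auc          # clearance, L/h
mrt = aumc / auc            # mean residence time, h
vdss = cl * mrt             # steady-state volume of distribution, L
print(f"CL = {cl:.2f} L/h, MRT = {mrt:.2f} h, VDss = {vdss:.1f} L")
```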

A comprehensive and critical review of the food flavonoid literature has led to the development of a food composition database for flavonols, flavones, procyanidins, catechins, and flavanones. This database can now be used and continuously updated to estimate the flavonoid intake of populations, to identify dietary sources of flavonoids, and to assess associations between flavonoid intake and disease. However, there is a need for better food composition data for flavones, procyanidins, and flavanones, as the current literature is sparse, particularly for citrus fruits, fruit juices, and herbs. In addition, anthocyanin food composition data are lacking, although validated methods of determination are becoming available. [Pg.246]

Bruno, J. & Puigdomenech, I. (1989). Validation of the SKBU1 uranium thermodynamic database for its use in geochemical calculations with EQ3/6. Materials Research Society Symposium Proceedings, 127, 887-896. [Pg.527]

The hardware and software used to implement a LIMS must be validated. Computers and networks need to be examined for the potential impact of component failure on LIMS data. Security concerns regarding control of access to LIMS information must be addressed. Software, operating systems, and database management systems used in the implementation of a LIMS must be validated to protect against data corruption and loss. Mechanisms for fault-tolerant operation and LIMS data backup and restoration should be documented and tested. One approach to validation of LIMS hardware and software is to choose vendors whose products are precertified; however, the ultimate responsibility for validation remains with the user. Validating the LIMS system's operation involves a substantial amount of work, and an adequate validation infrastructure is a prerequisite for the construction of a dependable and flexible LIMS. [Pg.518]

If a model is to be used as a query to search for active molecules in a database, a common validation method is to demonstrate its performance on a database for which the pharmacological activity of each compound is known (or at least flagged as active or inactive). Most often, such databases are constructed artificially for this purpose: after gathering a set of active compounds, one seeds them into a larger database of randomly selected (and supposedly inactive) molecules, the idea being to mimic HTS results. The model is then evaluated according to its ability to retrieve the actives from the database and to perform better than a random search (enrichment). [Pg.337]
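A minimal sketch of the enrichment calculation implied above: rank the seeded database by the model's score and compare the hit rate among the top-ranked fraction with the hit rate expected from random selection. The scores, labels, and 1% cutoff below are illustrative.

```python
# Minimal sketch of an enrichment-factor calculation for a model screened
# against a seeded database of actives and presumed inactives.
# Scores, labels, and the ranking fraction are illustrative.

import random

def enrichment_factor(scored, fraction=0.01):
    """scored: iterable of (score, is_active); higher score = better predicted fit."""
    ranked = sorted(scored, key=lambda x: x[0], reverse=True)
    n_total = len(ranked)
    n_actives = sum(is_active for _, is_active in ranked)
    n_top = max(1, round(fraction * n_total))
    actives_in_top = sum(is_active for _, is_active in ranked[:n_top])
    hit_rate_top = actives_in_top / n_top
    hit_rate_random = n_actives / n_total
    return hit_rate_top / hit_rate_random

# Toy database: 5 actives seeded among 995 presumed inactives, with the model
# (by construction) scoring every active above every inactive.
random.seed(0)
database = [(0.9 * random.random(), False) for _ in range(995)]
database += [(0.9 + 0.1 * random.random(), True) for _ in range(5)]
print(f"EF at 1% = {enrichment_factor(database, 0.01):.0f}")   # 100; a value near 1 means random
```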

This example uses a small portion of a traceability matrix for validation of a database application, here named "your CRS". ... [Pg.245]

Shelf-life estimation involves developing a thermodynamic model of the dry reagent chemistry. This model is used to make projections of the shelf life, and the estimates are continuously compared with real-time data to substantiate the validity of the model or to revise it. A typical study to create a database for shelf-life estimation may consist of subjecting a dry reagent chemistry, in its final packaging format, to continuous thermal stress for 2-3 years in the temperature range of 0-70 °C. An example of a schedule for such a study is shown in Fig. la. At each check point, dry reagent chemistries are removed from each of the stress conditions, allowed to come to thermal equilibrium, and analyzed, and the per-... [Pg.45]
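One common way to turn such a thermal-stress study into a shelf-life projection is an Arrhenius fit of degradation rate constants measured at the stress temperatures, extrapolated to the storage temperature. The sketch below uses made-up first-order rate constants and a 10% loss criterion; it is a generic illustration of that approach, not the specific thermodynamic model of the cited study.

```python
# Minimal sketch: Arrhenius extrapolation of shelf life from accelerated
# thermal-stress data. Rate constants, temperatures, and the 10% loss
# criterion are made up; this is a generic illustration, not the cited model.

import numpy as np

R = 8.314  # gas constant, J/(mol*K)

# First-order degradation rate constants observed at the stress temperatures
temps_c = np.array([40.0, 50.0, 60.0, 70.0])      # deg C
k_obs = np.array([0.010, 0.024, 0.055, 0.120])    # 1/month (illustrative)

# Fit ln k = ln A - Ea / (R * T)
inv_t = 1.0 / (temps_c + 273.15)
slope, intercept = np.polyfit(inv_t, np.log(k_obs), 1)
ea = -slope * R                                   # apparent activation energy, J/mol

# Extrapolate to the storage temperature and find the time to 10% activity loss
t_store_k = 25.0 + 273.15
k_store = np.exp(intercept + slope / t_store_k)
shelf_life_months = -np.log(0.90) / k_store       # first-order decay to 90% of initial
print(f"Ea = {ea / 1000:.0f} kJ/mol, projected shelf life = {shelf_life_months:.0f} months")
```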

It is part of the early decision-making process to assess which parts of the LIMS are to be subject to full validation and which are to be covered by good IT or engineering testing practice. The function of the LIMS within the company computing architecture may mean that it is used for both product-quality-related and non-product-quality-related activities. The validation determination for the LIMS ensures that the validation effort is focused on those system elements with direct GxP impact. The justification for validating only certain parts of the LIMS must be based on an assessment of the GxP criticality of the data in the LIMS database, the use of the data, and the sources of the data. In an ideal world, the LIMS would be dedicated to GxP activities; however, the economics of the system may mean that it is not practical to separate GxP and non-GxP activities into separate systems. [Pg.521]

