
Real-world validation

One must develop systematic approaches to safety-critical software validation while minimizing casualties, including the animals used in clinical testing. Real-world validation through deployment, as with command-and-control systems exercised in military war games, is often impossible. Yet credible approaches to safety assurance must still be developed. [Pg.152]

Keywords: Service-life prediction, Real-world validation, Degradation pathways, Degradation-rate model, Accelerated life testing, Cumulative damage model... [Pg.21]

Keywords: Service-life prediction, Real-world validation, Degradation pathways... [Pg.135]

Such models can be used to perform in silico experiments, for example by monitoring the response of a system or its components to a defined intervention. Model output - predictions of biological behaviour - is then validated against in vitro or in vivo data from the real world. [Pg.134]
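As a minimal illustration of this workflow (not taken from the source), the sketch below perturbs a toy one-compartment model with a defined intervention and compares its prediction against hypothetical "real-world" measurements; the model form, parameter values, and observed data are all placeholders.

```python
# Minimal sketch of an in silico experiment: apply a defined intervention to a
# toy model and validate the prediction against hypothetical observed data.
import numpy as np

def simulate(k_elim, dose, t):
    """Predicted concentration for a one-compartment model: C(t) = dose * exp(-k*t)."""
    return dose * np.exp(-k_elim * t)

t = np.linspace(0, 24, 25)                                # hours
baseline = simulate(k_elim=0.10, dose=100.0, t=t)
intervention = simulate(k_elim=0.25, dose=100.0, t=t)     # intervention: faster elimination

# Hypothetical in vivo observations (placeholders, not real data)
observed = intervention + np.random.default_rng(0).normal(0, 2.0, size=t.size)

rmse = np.sqrt(np.mean((intervention - observed) ** 2))
print(f"concentration at 24 h: baseline {baseline[-1]:.1f}, intervention {intervention[-1]:.1f}")
print(f"RMSE of model prediction vs. observed data: {rmse:.2f}")
```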

Test a substantial number of compounds. VS methods generally offer enrichment, but most ranked hit lists contain a significant proportion of false positives. Hit lists should be scaled to 1-5% of the compounds in the virtual library screened. In many real-world situations, the computational chemist is asked to choose lists of compounds representing 0.1% or less of the compounds screened (e.g., the best 100 of 100,000 compounds). Typically, VS methods have been validated considering 1%, 5%, or 10% of the total number of compounds in the VS collection. By following up on more compounds, one increases the probability of impact from VS. [Pg.117]
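A hedged sketch of how such enrichment might be quantified at different selection fractions follows; the library size, number of actives, and score distributions are invented purely for illustration.

```python
# Illustrative enrichment-factor calculation for a ranked virtual-screening list.
import numpy as np

rng = np.random.default_rng(1)
n_compounds = 100_000
n_actives = 500

# Hypothetical scores: actives drawn from a slightly better-scoring distribution.
scores = np.concatenate([rng.normal(1.0, 1.0, n_actives),                 # actives
                         rng.normal(0.0, 1.0, n_compounds - n_actives)])  # inactives
is_active = np.concatenate([np.ones(n_actives, bool),
                            np.zeros(n_compounds - n_actives, bool)])

order = np.argsort(-scores)                 # rank best score first
for frac in (0.001, 0.01, 0.05, 0.10):      # 0.1%, 1%, 5%, 10% of the library
    top_n = int(frac * n_compounds)
    hits = is_active[order][:top_n].sum()
    ef = (hits / top_n) / (n_actives / n_compounds)   # enrichment factor
    print(f"top {frac:>5.1%}: {hits:4d} actives, enrichment factor {ef:.1f}")
```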

The illustrative examples could be produced only by using both fire dynamics and human factor information, which contains many crude approximations to the real world and omits completely many important effects. However, it is clear that real progress is being made toward attaining a sufficiently accurate predictive understanding of fire and its consequences so that a performance code can eventually be attained. The computer fire codes need to be made more comprehensive. There needs to be a mechanism set up to evaluate the validity of computer fire codes for use with a legal performance code, just as is done with all other legal codes. [Pg.82]

Validation of the SIS functionality is performed as part of a site acceptance test (SAT). Validation involves a full functional test that demonstrates the SIS actually works in the real-world installation. It proves the SIS devices execute the logic according to the specification and ensures that the SIS and its devices interact as intended with other systems, such as the BPCS and operator interface. From a systematic error standpoint, the SAT also provides an opportunity for a first-pass validation of the procedures developed for the operating basis (see next subsection). [Pg.104]
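The table-driven check below is an illustrative sketch only, not an actual SAT procedure: a hypothetical 1oo2 trip function is exercised against a small specification table in the spirit of the functional test described above.

```python
# Sketch of a table-driven functional check of hypothetical SIS trip logic
# against its specification (all logic and cases are assumptions for illustration).
def sis_trip(pressure_high, level_low):
    """Hypothetical 1oo2 logic: trip if either initiator is active."""
    return pressure_high or level_low

# Specification cases: (pressure_high, level_low) -> expected trip output
spec = {
    (False, False): False,
    (True,  False): True,
    (False, True):  True,
    (True,  True):  True,
}

for inputs, expected in spec.items():
    result = sis_trip(*inputs)
    status = "PASS" if result == expected else "FAIL"
    print(f"inputs={inputs} expected={expected} got={result} -> {status}")
```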

For medical professionals, vinyl medical products are essential life-saving tools. Over the last several years, however, activist groups like Greenpeace and its affiliate Health Care Without Harm have attempted to discredit the use of vinyl by alleging health and environmental risks through its use and disposal. The fact is vinyl has been used safely in medical products for over 40 years, and hundreds of studies and billions of real-world patient exposures are a testament to its track record of safe use. The benefits and proven safety of vinyl medical products are described, and the validity of activist allegations is addressed. [Pg.61]

Research in analytical chemistry is clearly an area where automation has a significant role to play. It is important that research data is fully validated and as accurate as possible. While it is not always possible to automate entire processes, the use of automated carousels to feed samples into a reaction system is an obvious area to improve the quality and rate of generation of data. It will also allow the researchers to quickly validate their proposed methodology on real-world samples and optimize the performance characteristics. This naturally requires a very close relationship between the researchers and the ultimate end-users of the analytical product. Given a good return for the investment, I am sure that the initial investment to automate the research activity will be justified and forthcoming. [Pg.235]

Traditionally, the education that chemists and chemistry laboratory technicians receive in colleges and universities does not prepare them adequately for some important aspects of the real world of work in their chosen field. Today's industrial laboratory analyst is deeply involved with such job issues as quality control, quality assurance, ISO 9000, standard operating procedures, calibration, standard reference materials, statistical control, control charts, proficiency testing, validation, system suitability, chain of custody, good laboratory practices, protocol, and audits. Yet, most of these terms are foreign to the college graduate and the new employee. [Pg.3]

Ultimately, a model is a simplification of real-world behaviour. To be useful, it must produce predictions of sufficient accuracy in a reasonable time span: if it is quicker to do the experiments than to develop and run the model, then the model is worthless (except as an academic exercise). In choosing an approach for modelling a system, a compromise must be made between complexity and simplicity. A very complex model, which includes a detailed description of the many physico-chemical processes involved, may give an accurate prediction over a wide range of conditions. However, there will be many parameters to set from experimental data, and so development and validation will be time consuming. An overly complex model may also have a protracted run-time. Conversely, a very simple model may be quick to develop and run, but if its predictions are far from reality, it too is useless. [Pg.59]

Once the components of the model discussed above have been determined, they are added to the final model of a monolith (or even filter) reactor. The monolith reactor model has already been described in Section III. The next stage is to validate the model by comparing its predictions, based on laboratory data, with real-world data measured on an engine bench or chassis dynamometer. At this stage the reason(s) for any discrepancies between prediction and experiment need to be determined and, if required, further work done on the kinetics to improve the prediction. This process can take a number of iterations. Model validation is described in more detail in Section IV.D. Once all this has been done the model can be used predictively with confidence. [Pg.62]
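A minimal sketch of the comparison step, under assumed data and an assumed acceptance criterion (neither taken from the source), might look like this:

```python
# Compare model predictions with "real-world" measurements and decide whether
# another kinetics iteration is needed. All numbers below are placeholders.
import numpy as np

temperature_C  = np.array([150, 200, 250, 300, 350, 400])
measured_conv  = np.array([0.02, 0.15, 0.55, 0.85, 0.95, 0.98])  # engine bench (hypothetical)
predicted_conv = np.array([0.01, 0.10, 0.45, 0.80, 0.94, 0.98])  # monolith model (hypothetical)

rmse = np.sqrt(np.mean((predicted_conv - measured_conv) ** 2))
tolerance = 0.05   # acceptance criterion chosen for illustration only

if rmse > tolerance:
    print(f"RMSE {rmse:.3f} exceeds {tolerance}: refine the kinetics and re-run.")
else:
    print(f"RMSE {rmse:.3f} within {tolerance}: model may be used predictively.")
```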

I once personally (once? Ya, right...) pushed the envelope to see how much of this scientific data (most of which was available 15 years ago) could be validated in the real world. I suppressed my cortisol (chemically), increased anabolism (also chemically), and increased metabolism (of course chemically) for 30 days. I ingested protein (complete) at levels based upon basal plus 20%, and ate 2.5 g of carbs (mostly maltodextrin and veggies) per LB of bodyweight while allowing fats to fall where they may. 30 days later I was 33 LBS heavier with a slightly lower body fat level. Ya, pretty cheesy, huh... [Pg.5]

Finally, a cognitive performance assay used repeatedly during sleep deprivation should have high test-retest reliability; it should be demonstrated to be sensitive to a large proportion of the performance phenomena associated with sleep loss; and it should have the capacity to reflect aspects of real-world performance (i.e., ecological validity). [Pg.43]
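One common way to quantify test-retest reliability is the correlation between repeated administrations of the assay; the sketch below uses synthetic subjects purely for illustration.

```python
# Illustrative test-retest reliability estimate (Pearson r between two sessions).
import numpy as np

rng = np.random.default_rng(2)
true_ability = rng.normal(0, 1, 30)                 # 30 hypothetical subjects
test_1 = true_ability + rng.normal(0, 0.3, 30)      # first administration
test_2 = true_ability + rng.normal(0, 0.3, 30)      # retest

r = np.corrcoef(test_1, test_2)[0, 1]
print(f"test-retest reliability (Pearson r): {r:.2f}")
```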

Data relevancy and validity are two different concepts, a fact that is not always recognized by all project participants. Data can be perfectly valid and yet irrelevant for their intended use. Conversely, the quality of data may be flawed in some way, as is usually the case in the real world, and yet the data can still be used for project decisions. [Pg.1]

Evaluate and validate, under real-world and heavy-load conditions, emerging multi-fuel energy concepts for deployment of hydrogen-powered vehicles in suburban and rural regions. [Pg.110]

Collect and evaluate real-world data to address safety and environmental issues, develop statistically validated codes and standards, formulate policies and regulations, and understand reliability and large-scale deployment of hydrogen technology under diverse operating conditions. [Pg.110]

Models. Both empirical and simulation models are needed. (Models can help bridge the gap between experimental conditions and the real world and between actual observations and predictions. Obviously, models can be no better than the data used to construct them, and much of these data will come from the tests described above. The tremendous advantage of models comes as increased experience and better databases permit their refinement to the degree that they can be used in place of, or to guide, some of the more complex testing described above. Well-validated models can be a powerful research and regulatory tool.)... [Pg.388]

Because field quantization falls outside the scope of the present text, the discussion here has been limited to properties of classical fields that follow from Lorentz and general nonabelian gauge invariance of the Lagrangian densities. Treating the interacting fermion field as a classical field allows derivation of symmetry properties and of conservation laws, but is necessarily restricted to a theory of an isolated single particle. When this is extended by field quantization, so that the field amplitude ψ becomes a sum of fermion annihilation operators, the theory becomes applicable to the real world of many fermions and of physical antiparticles, while many qualitative implications of classical gauge field theory remain valid. [Pg.201]

Another important consideration in the selection of a test set is to ensure that the chemicals in the data set relate to the real problem in question. It should be emphasized that the QSAR models developed in our project are used primarily to predict the activity of environmental chemicals, mostly pesticides and industrial chemicals. A data set reported by Nishihara et al. (Nishihara et al., 2000) was also selected as a test set. This data set contains 517 chemicals tested with the yeast two-hybrid assay, of which over 86% are pesticides and industrial chemicals. Only 463 chemicals were used for this validation study after structure processing. Only 62 chemicals were categorized as active, on the basis of having an activity greater than 10% of that of 10⁻⁷ M E2, as defined in the original paper (Nishihara et al., 2000). The majority of the chemicals were inactive, which is similar to the real-world situation, where a large proportion of the chemicals in the environment are expected to be inactive. [Pg.309]
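The class balance reported above can be summarized directly; the counts come from the text, and the snippet itself is only illustrative arithmetic.

```python
# Class balance of the external validation set described in the text.
n_total = 463
n_active = 62
n_inactive = n_total - n_active
print(f"active:   {n_active} ({n_active / n_total:.1%})")
print(f"inactive: {n_inactive} ({n_inactive / n_total:.1%})")
```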

Recall the reciprocity between matter and curvature, implied by the theory of general relativity, to argue that the high-pressure condition at Z/N = 1 corresponds to extreme curvature of space-time caused by massive objects such as quasars, and the like. The argument implies that the Schrodinger solution is valid in empty, flat euclidean space-time, that Z/N = τ corresponds to the real world, Z/N = 1 occurs in massive galactic objects where elemental synthesis happens, and Z/N > 1 implies infinite curvature at a space-time singularity. [Pg.136]

Aside from the continuity assumption and the discrete reality discussed above, deterministic models have been used to describe only those processes whose operation is fully understood. This implies a perfect understanding of all direct variables in the process and also, since every process is part of a larger universe, a complete comprehension of how all the other variables of the universe interact with the operation of the particular subprocess under study. Even if one were to find a real-world deterministic process, the number of interrelated variables and the number of unknown parameters are likely to be so large that the complete mathematical analysis would probably be so intractable that one might prefer to use a simpler stochastic representation. A small, simple stochastic model can often be substituted for a large, complex deterministic model since the need for the detailed causal mechanism of the latter is supplanted by the probabilistic variation of the former. In other words, one may deliberately introduce simplifications or errors in the equations to yield an analytically tractable stochastic model from which valid statistical inferences can be made, in principle, on the operation of the complex deterministic process. [Pg.286]
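As an illustrative sketch of this substitution (not from the source), the code below stands in for an intractable deterministic process with many hidden interacting variables and replaces it with a simple trend-plus-noise stochastic surrogate.

```python
# A "complex" deterministic process with many unmeasured inputs is approximated
# by a simple stochastic model whose noise term stands in for the unmodelled detail.
import numpy as np

rng = np.random.default_rng(3)

def complex_deterministic(x, hidden):
    """Stand-in for an intractable mechanistic model with many hidden inputs."""
    return 2.0 * x + 0.5 * np.sin(hidden @ np.ones(hidden.shape[-1]))

x = np.linspace(0, 10, 200)
hidden = rng.normal(size=(200, 50))        # 50 unmeasured interacting variables
y_det = complex_deterministic(x, hidden)

# Simple stochastic surrogate: linear trend plus a noise term of fitted variance
residual_sd = np.std(y_det - 2.0 * x)
y_stoch = 2.0 * x + rng.normal(0, residual_sd, size=x.size)

print(f"deterministic output sd about trend: {residual_sd:.3f}")
print(f"stochastic surrogate sd about trend: {np.std(y_stoch - 2.0 * x):.3f}")
```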

