Big Chemical Encyclopedia


Future computer validation

A recent paper [1] addressed the future of computer validation. The authors anticipated how current industry events and trends would affect computer validation. [Pg.173]

1. R. Budihandojo, D. J. Bergeson, B. Bradley, P. D'Eramo, L. Huber, O. Lopez, et al., "The Future State of Computer Validation," Pharmaceutical Technology, July/September 2001. [Pg.173]

An important aspect of this new approach is the expectation that pharmaceutical and healthcare companies will implement any corrective actions identified as the result of a site inspection across the whole of their operations. Effective coordination of corrective actions is vital for large multinational organizations. An example form that might be used to collate computer validation inspection history is presented in Table 16.1. The FDA and MHRA already have access to inspection databases and have the ability to readily trend data and track repeated offences on particular topics across multiple sites in a firm's organization. Indeed, regulatory authorities may in the future share inspection findings with MRA partner regulatory authorities. [Pg.386]

Application of the test substance to the test system is without doubt the most critical step of the residue field trial. Under-application may be corrected, if possible and if approved by the Study Director, by making a follow-up application if the error becomes known shortly after the application has been made. Over-application errors can usually only be corrected by starting the trial again. The Study Director must be contacted as soon as an error of this nature is detected. Immediate communication allows for the most feasible options to be considered in resolving the error. If application errors are not detected at the time of the application, the samples from such a trial can easily become the source of undesirable variability when the final analysis results are known. Because the application is critical, the PI must calculate and verify the data that will constitute the application information for the trial. If the test substance weight, the spray volume, the delivery rate, the size of the plot, and the travel speed for the application are carefully determined and then validated prior to the application, problems will seldom arise. With the advent of new tools such as computers and hand-held calculators, the errors traditionally associated with applications to small plot trials should be minimized in the future. The following paragraphs outline some of the important considerations for each of the phases of the application. [Pg.155]
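The calculate-and-verify step described above can be sketched as a short program. This is a hypothetical worked example, not the trial's actual protocol: all function names and numbers (plot size, delivery rate, travel speed, target rate) are illustrative, chosen only to show how the application parameters combine and how an over- or under-application error would surface before spraying.

```python
def spray_time_s(plot_len_m, speed_m_s, passes=1):
    """Time the nozzle spends over the plot."""
    return passes * plot_len_m / speed_m_s

def applied_volume_l(delivery_l_min, time_s):
    """Spray volume delivered during that time."""
    return delivery_l_min * time_s / 60.0

def applied_rate_kg_ha(substance_kg_per_l, volume_l, plot_area_m2):
    """Actual rate of test substance delivered, scaled to kg per hectare."""
    return substance_kg_per_l * volume_l * (10_000.0 / plot_area_m2)

# Illustrative plot: 2 m swath x 10 m long = 20 m2, one pass at 1 m/s.
time_s = spray_time_s(plot_len_m=10.0, speed_m_s=1.0)            # 10 s
vol_l = applied_volume_l(delivery_l_min=1.2, time_s=time_s)      # 0.2 L
rate = applied_rate_kg_ha(substance_kg_per_l=0.005,
                          volume_l=vol_l, plot_area_m2=20.0)

target = 0.5  # kg a.i./ha, the (made-up) rate on the protocol
pct_err = 100.0 * (rate - target) / target
print(f"applied {rate:.3f} kg/ha, error {pct_err:+.1f}%")
```

Checking these numbers before the application, and again immediately after, is what lets an under-application be caught while a follow-up application is still an option.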

Software and computer systems that are subject to validation must be designed using strict procedures with sufficient documentation. During the process of system design, strict controls must be in place to allow future validation success. The system designer must ensure that documentation of the system meets minimum requirements necessary to satisfy the needs of the validation team. [Pg.1055]

The toxicological "facts" I have used go beyond presently validated knowledge and thus indicate directions that future work might take to produce data that can actually be used in a computer fire model. [Pg.82]

The literature of the past three decades has witnessed a tremendous explosion in the use of computed descriptors in QSAR. It is noteworthy that this has exacerbated another problem: rank deficiency, which occurs when the number of independent variables exceeds the number of observations. Stepwise regression and similar approaches, popularly used when data are rank deficient, often yield overly optimistic and statistically incorrect predictive models. Such models fail to predict the properties of future, untested cases similar to those used to develop the model. It is essential that subset selection, if performed, be done within the model validation step rather than outside it, thus providing an honest measure of the predictive ability of the model, i.e., the true q2 [39,40,68,69]. Unfortunately, many published QSAR studies involve subset selection followed by model validation, thus yielding a naive q2, which inflates the predictive ability of the model. The following steps outline the proper sequence of events for descriptor thinning and LOO cross-validation, e.g.,... [Pg.492]
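The naive-vs-honest q2 distinction above can be demonstrated numerically. The sketch below is a minimal illustration with entirely synthetic data (descriptors and response are pure noise, so the true predictive ability is nil): selecting the top-correlated descriptors on the full data set before leave-one-out cross-validation produces an inflated q2, while reselecting descriptors inside each LOO fold gives an honest (near-zero or negative) q2. All names and numbers are assumptions for illustration, not from the cited studies.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 20, 200, 5
X = rng.normal(size=(n, p))
y = rng.normal(size=n)          # pure noise: no real structure-activity relationship

def top_k(X, y, k):
    """Indices of the k descriptors most correlated (in magnitude) with y."""
    r = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    return np.argsort(r)[-k:]

def loo_q2(X, y, select_inside):
    """LOO cross-validated q2; descriptor subset chosen inside or outside the loop."""
    press = 0.0
    cols_all = top_k(X, y, k)                    # naive: selection on the full data
    for i in range(len(y)):
        tr = np.delete(np.arange(len(y)), i)
        cols = top_k(X[tr], y[tr], k) if select_inside else cols_all
        A = np.column_stack([np.ones(len(tr)), X[tr][:, cols]])
        beta, *_ = np.linalg.lstsq(A, y[tr], rcond=None)
        pred = np.concatenate(([1.0], X[i, cols])) @ beta
        press += (y[i] - pred) ** 2
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

naive_q2 = loo_q2(X, y, select_inside=False)    # optimistic
honest_q2 = loo_q2(X, y, select_inside=True)    # realistic
print(f"naive q2 = {naive_q2:.2f}, honest q2 = {honest_q2:.2f}")
```

Since y here is noise, any apparently good naive q2 is pure selection bias, which is exactly why subset selection must sit inside the validation loop.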

Essentially, the eCTD is a transport format for facilitating electronic submissions. The eCTD serves as an interface for industry-to-agency transfer of regulatory information while also facilitating the creation, review, life-cycle management, and archiving of the electronic submission. The eCTD specification lists the criteria that make an electronic submission technically valid. The eCTD represents a major advance in the submission of information to support an NDA. In the future, companies may be able to send their submissions to several regulatory authorities simultaneously with a single stroke of a computer key. [Pg.480]

The future development of the chemical mass balance receptor model should include: (1) more chemical components measured in different size ranges at both source and receptor; (2) study of other mathematical methods of solving the chemical mass balance equations; (3) validated and documented computer routines for calculations and error estimates; and (4) extension of the chemical mass balance to an "aerosol properties balance" to apportion other aerosol indices such as light extinction. [Pg.94]
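Points (2) and (3) above concern solving the mass balance equations c = F s (receptor concentrations = source-profile matrix times source contributions) with error estimates. The sketch below is a minimal, hedged illustration using ordinary least squares; the profile matrix, contributions, and noise level are all made-up numbers, and real CMB codes use effective-variance weighting rather than this unweighted fit.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical source profiles: fraction of species i per unit mass of source j
# (6 species, 3 sources; values are illustrative only).
F = np.array([[0.30, 0.01, 0.05],
              [0.10, 0.20, 0.02],
              [0.02, 0.15, 0.25],
              [0.05, 0.05, 0.30],
              [0.20, 0.02, 0.10],
              [0.08, 0.30, 0.04]])
s_true = np.array([12.0, 6.0, 3.0])                # source contributions (ug/m3)
c = F @ s_true + rng.normal(scale=0.01, size=6)    # measured receptor concentrations

# Least-squares solution of the over-determined mass balance c = F s
s_hat, rss, *_ = np.linalg.lstsq(F, c, rcond=None)

# Error estimates from the parameter covariance matrix sigma^2 (F^T F)^-1
dof = len(c) - len(s_hat)
sigma2 = rss.sum() / dof
stderr = np.sqrt(np.diag(sigma2 * np.linalg.inv(F.T @ F)))
print("contributions:", s_hat, "std errors:", stderr)
```

With more species than sources the system is over-determined, which is why point (1), measuring more components, directly improves both the fit and its error estimates.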

It should be clear from the preceding examples that theoretical studies of this type serve not simply to validate computational predictions by detecting potential sources of error, but also to identify the origins of particular spectroscopic characteristics, establish trends, and uncover correlations between structural or electronic features and spectroscopic observables. It remains to be seen in future applications how far this approach can take us in establishing reliable connections between structural parameters and spectroscopic properties for larger and more complex oligonuclear transition metal systems. [Pg.344]

In conclusion, it appears that the majority of the most modern force fields do well in predicting structural and dynamical properties within wells on their respective PESs. However, their performance for non-equilibrium properties, such as timescales for conformational interconversion, protein folding, etc., has not yet been fully validated. With the increasing speed of both computational hardware and dynamics algorithms, it should be possible to address this question in the near future. [Pg.99]

Another issue that needs to be addressed is the accurate calculation of the transients of stack operation under variable loading due to changes in power demand and/or under start-up and shut-down conditions. Tracking fast transients, especially during the start-up process, requires at least second-order accurate temporal resolution, which imposes additional computational cost on stack simulations. In the near future the best alternative may be to use reduced-order physics-based models, such as those presented in Section 5.2, with appropriate empirical input and experimental validation, to get the most benefit out of computational studies. [Pg.167]
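What "second-order accurate temporal resolution" buys can be seen on a toy model problem (this is a hedged sketch on dy/dt = -y, not a stack simulation): halving the time step cuts the error of a first-order scheme (backward Euler) by about 2x, but that of a second-order scheme (Crank-Nicolson) by about 4x.

```python
import math

def integrate(dt, scheme):
    """Integrate dy/dt = -y, y(0) = 1, to t = 1 with fixed step dt."""
    y = 1.0
    for _ in range(round(1.0 / dt)):
        if scheme == "be":   # backward Euler (first order): y+ = y / (1 + dt)
            y = y / (1.0 + dt)
        else:                # Crank-Nicolson (second order)
            y = y * (1.0 - dt / 2.0) / (1.0 + dt / 2.0)
    return y

exact = math.exp(-1.0)
orders = {}
for scheme in ("be", "cn"):
    e1 = abs(integrate(0.01, scheme) - exact)
    e2 = abs(integrate(0.005, scheme) - exact)
    orders[scheme] = math.log(e1 / e2, 2)   # observed order of accuracy
print(orders)   # be ~1, cn ~2
```

For fast start-up transients the quadratic error decay of the second-order scheme is what makes a usable time step affordable; a first-order scheme would need far smaller steps for the same accuracy.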

A second obvious line of research for the future must be that related to the development and improvement of computer-based simulation of long-term environmental behaviour of radionuclides. Most currently available models are still comparatively simple compared with the physical, chemical and biological complexity of environments they purport to represent but, as noted in Section 13.5, our ability to construct ever more complex conceptual models for predicting the future behaviour of radionuclides is improving. However, the more complex the model, the more demands it places on the basic thermodynamic data and knowledge of likely speciation. The challenge for the future will therefore be to produce high-quality data for model construction and to devise realistic ways to validate those models once produced. [Pg.382]

As discussed earlier, the future of journals is clearly electronic. However, there are many reasons why journal articles are problematic in comparison to other documents such as web pages and databases. First, most chemistry journals (with the notable exception of Chemistry Central Journal, www.journal.chemistrycentral.com) are not open access, and thus the content of articles is restricted by the publishers. Although most universities and large organizations hold institutional subscriptions to the popular journals, access usually requires IP-address validation or private login credentials. Thus, automated access to this information by a computer is difficult. Further, it is unclear whether the terms under which journal articles are made available permit automated processing of the content... [Pg.179]

Despite its great potential, in the near future CFD will not completely replace experimental work or the standard approaches currently used by the chemical engineering community. Indeed, it is not even certain that CFD will succeed, or lead to improved results in comparison with standard approaches. For single-phase turbulent flows, and especially for multiphase flows, it is imperative that the results of CFD analysis be compared with experimental data in order to assess the validity of the physical models and the computational algorithms. In this connection, only computational results that possess invariance with respect to spatial and temporal discretization should be confronted with experimental data. A CFD model usually gives very detailed information on the temporal and spatial variation of many key quantities (i.e., velocity components, phase volume fractions, temperatures, species concentrations, turbulence parameters), which leads to in-... [Pg.233]
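The discretization-invariance requirement above is commonly checked with a Richardson-style grid-convergence study: solve on three successively refined grids and extract the observed order of accuracy. The sketch below is a minimal stand-in, not a CFD solver; it uses a central-difference derivative of sin(x) as the "solution" on three grid levels with refinement ratio 2, and the names and numbers are illustrative.

```python
import math

def central_diff(f, x, h):
    """Second-order central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

x, h = 1.0, 0.1
f_coarse = central_diff(math.sin, x, h)
f_mid    = central_diff(math.sin, x, h / 2.0)
f_fine   = central_diff(math.sin, x, h / 4.0)

# Observed order of accuracy from three grid levels (refinement ratio 2):
# p = log2( (f_coarse - f_mid) / (f_mid - f_fine) )
p = math.log(abs(f_coarse - f_mid) / abs(f_mid - f_fine), 2)
print(round(p, 3))   # ~2 for a second-order scheme
```

Only when the observed order matches the scheme's formal order, and the solution stops changing under further refinement, is a comparison against experimental data meaningful; otherwise discretization error is conflated with model error.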





