Full statistical method

Very early force fields were used in an attempt to calculate structures, enthalpies of formation, and vibrational spectra, but it was soon found that accuracy suffered severely in either the structure-energy calculations or the vibrational spectra. Force constants were, on the whole, not transferable from one field to another. The result was that early force fields evolved so as to calculate either structure and energy or spectra, but not both. [Pg.161]


The value of the torsional energy increment has been variously estimated, but TORS = 0.42 kcal mol⁻¹ was settled on for the bond contribution method in MM3. In the full statistical method (see below), low-frequency torsional motion should be calculated along with all the others, so the empirical TORS increment should be zero. In fact, TORS is not zero (Allinger, 1996). It appears that the TORS increment is a repository for an energy error or errors in the method that are as yet unknown. [Pg.154]

Representativeness can be examined from two aspects: statistical and deterministic. Any statistical test of representativeness is lacking, because many histories are needed for statistical significance. In the absence of this, PSAs use statistical methods to synthesize data to represent the equipment, operation, and maintenance. How well this represents the plant being modeled is not known. Deterministic representativeness can be answered by full-scale tests on like equipment. Such is the responsibility of the NSSS vendor, but for economic reasons, recourse to simplified and scaled models is often necessary. System success criteria for a PSA may be taken from the FSAR, which may have a conservative bias for licensing. Realism is more expensive than conservatism. [Pg.379]

We will describe an accurate statistical method that includes a full assessment of error in the overall calibration process, that is, (1) the confidence interval around the graph, (2) an error band around unknown responses, and finally (3) the estimated amount intervals. To use the method properly, the data are first adjusted by general data transformations to achieve constant variance and linearity. The method uses a six-step process to calculate amounts or concentration values of unknown samples, and their estimated intervals, from chromatographic response values using calibration graphs constructed by regression. [Pg.135]
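The regression core of such a procedure can be sketched briefly. The following Python fragment is a minimal illustration under stated assumptions, not the six-step method itself: the calibration data are invented, a straight-line model with constant variance (i.e., after any transformation) is assumed, and the inverse-prediction interval uses the common approximate formula rather than the exact Fieller construction.

```python
# Minimal calibration sketch: straight-line fit, confidence band around the
# graph, and an approximate interval for an amount estimated from an
# unknown response. Data values are hypothetical.
import numpy as np
from scipy import stats

conc = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])        # standards (amount units)
resp = np.array([10.2, 19.8, 41.1, 79.5, 161.0, 318.9])  # responses

n = len(conc)
slope, intercept = np.polyfit(conc, resp, 1)
fitted = intercept + slope * conc
s_y = np.sqrt(np.sum((resp - fitted) ** 2) / (n - 2))  # standard error of estimate
t_crit = stats.t.ppf(0.975, n - 2)                     # two-sided 95%
x_mean = conc.mean()
sxx = np.sum((conc - x_mean) ** 2)

def confidence_half_width(x0):
    """(1) Half-width of the confidence band around the calibration graph."""
    return t_crit * s_y * np.sqrt(1.0 / n + (x0 - x_mean) ** 2 / sxx)

def inverse_prediction(y0, m=1):
    """(2)-(3) Amount estimate for an unknown response y0 (mean of m
    replicates) with its approximate interval."""
    x_hat = (y0 - intercept) / slope
    half = (t_crit * s_y / abs(slope)) * np.sqrt(
        1.0 / m + 1.0 / n + (x_hat - x_mean) ** 2 / sxx)
    return x_hat, x_hat - half, x_hat + half

print(confidence_half_width(8.0))
print(inverse_prediction(100.0))
```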

We cannot possibly provide a detailed account of the full range of statistical methods and strategies that constitute the armamentarium of modern science. Yet, there is value in considering some basic statistical principles. [Pg.648]

Optimization techniques may be classified as parametric statistical methods and nonparametric search methods. Parametric statistical methods, usually employed for optimization, are full factorial designs, half factorial designs, simplex designs, and Lagrangian multiple regression analysis [21]. Parametric methods are best suited for formula optimization in the early stages of product development. Constraint analysis, described previously, is used to simplify the testing protocol and the analysis of experimental results. [Pg.33]
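As an illustration of the first of these designs, the short sketch below enumerates the runs of a two-level full factorial design; the factor names and levels are invented for the example.

```python
# Enumerate a 2**3 full factorial design; factors and levels are illustrative.
from itertools import product

factors = {
    "binder_pct":     [2.0, 4.0],
    "lubricant_pct":  [0.5, 1.0],
    "compression_kN": [10, 20],
}

design = list(product(*factors.values()))  # 2**3 = 8 runs
for run, levels in enumerate(design, start=1):
    print(run, dict(zip(factors, levels)))
```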

Full statistical evaluation of the calibration graph provides useful data about the method's performance characteristics over the applied calibration range, such as the standard error of the procedure, sx, or the standard error of estimate, sy. [Pg.308]
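In the usual straight-line calibration notation these quantities take the following form (a standard formulation, assumed here; the cited source may use slightly different symbols), where $y_i$ are the measured responses, $\hat{y}_i$ the fitted responses, $n$ the number of calibration points, and $b$ the slope of the calibration line:

$$
s_y = \sqrt{\frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{n-2}},
\qquad
s_x = \frac{s_y}{b}
$$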

Statistical methods are based on the single concept of variability. It is through this fundamental concept that a basis is determined for design of experiments and analysis of data. Full utilization of this concept makes it possible to derive maximum information from a given set of data and to minimize the amount of data necessary to derive specific information. [Pg.741]

MoQSAR represents a new way of deriving QSARs. QSAR is treated as a multiobjective optimisation problem that comprises a number of competing objectives, such as model accuracy, complexity and chemical interpretability. The result is a family of QSAR models where each model represents a different compromise in the objectives. Typically, MoQSAR is able to find models that are at least as good as those found using standard statistical methods. The method will also find models where accuracy is traded against other objectives such as chemical interpretability. When presented with the full range of models, the medicinal chemist is able to select one that represents the best compromise over all objectives. [Pg.150]
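The selection step rests on Pareto dominance: a model is kept only if no other model is at least as good on every objective and strictly better on at least one. A minimal sketch with invented objective values (prediction error and complexity to be minimized, interpretability negated so that all three are minimized):

```python
# Pareto filtering of candidate QSAR models; objective tuples are invented.
# Each tuple: (prediction error, complexity, -interpretability), all minimized.
def dominates(a, b):
    """True if a is at least as good as b everywhere and strictly better once."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

models = {
    "M1": (0.30, 5, -0.9),
    "M2": (0.25, 12, -0.4),
    "M3": (0.32, 6, -0.8),   # dominated by M1
    "M4": (0.40, 4, -0.5),
}

pareto = [name for name, obj in models.items()
          if not any(dominates(other, obj)
                     for other_name, other in models.items() if other_name != name)]
print(pareto)  # ['M1', 'M2', 'M4'], each a different compromise
```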

Fortunately, statistical methods exist that may be used to help derive the coefficients, thus minimizing the work. The full data matrix is employed to find the set of coefficient values, a_i, using the requirement that the variance, s (Eq. [15]), is a minimum. [Pg.228]
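Eq. [15] is not reproduced in this excerpt; the sketch below assumes the ordinary least-squares objective, minimizing the residual variance over the full data matrix, with synthetic data standing in for the real matrix.

```python
# Least-squares determination of the coefficients a_i from a full data matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))                 # synthetic matrix: 20 points, 3 descriptors
y = X @ np.array([1.5, -0.7, 0.3]) + rng.normal(scale=0.05, size=20)

a, residual_ss, rank, _ = np.linalg.lstsq(X, y, rcond=None)
s2 = residual_ss[0] / (len(y) - X.shape[1])  # minimized residual variance
print(a, s2)
```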

To simplify hardness determination, Kirsch and Drennen [67] presented a new approach to tablet hardness determination that relies on simpler, more understandable statistical methods and provides the essence of the full spectral multivariate methods, but does not depend upon individual wavelengths of observation. Specifically, the previous hypotheses regarding the spectroscopic phenomena that permit NIR-based hardness determinations were examined in greater detail. Additionally, a robust method employing simple statistics based upon the calculation of a spectral best fit was developed to permit NIR tablet hardness prediction across a range of drug concentrations. [Pg.90]
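One plausible reading of the "spectral best fit" idea, offered here only as an illustrative sketch since the excerpt does not give Kirsch and Drennen's exact algorithm, is to regress the unknown tablet's spectrum against a reference spectrum and map the fitted slope to hardness through a simple one-dimensional calibration:

```python
# Hedged sketch: spectral best fit as a regression of an unknown spectrum
# against a reference spectrum. All spectra and the slope-to-hardness line
# are synthetic; the published procedure may differ in detail.
import numpy as np

wavelengths = np.linspace(1100, 2500, 700)           # nm, typical NIR range
reference = 0.5 + 0.3 * np.sin(wavelengths / 200.0)  # synthetic reference spectrum
unknown = 1.2 * reference + 0.05                     # harder tablet: baseline shift/scale

slope, offset = np.polyfit(reference, unknown, 1)    # the "best fit"
hardness = 3.0 + 8.0 * (slope - 1.0)                 # hypothetical calibration, kp
print(f"best-fit slope {slope:.3f} -> predicted hardness {hardness:.1f} kp")
```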

The excessive number of animals per test group is also pointed out and tackled with recommendations on the reduction of animal numbers. In particular, a better use of statistics in the design of experiments is recommended in order to reach an optimal compromise between animal number and variability of the results. The use of homogeneous populations is advocated as a means to minimize interindividual variability: if physiological variation between individual animals can be controlled, and statistical methods are used to exploit this control to the full, the number of animals necessary for assay purposes can be dramatically reduced. This results, for instance, in the use of only one breed of rats for one set of tests, rather than a mixture of different breeds intended to mimic the phenotypic variability of humans. The latter approach would mainly result in a study of the differences between breeds rather than of the actual effects of the compound. [Pg.15]
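The arithmetic behind that reduction is worth making explicit. Under the standard normal-approximation sample-size formula for comparing two group means (a textbook formula, not taken from the cited source), the required group size scales with the variance, so halving the standard deviation quarters the number of animals:

```python
# Per-group sample size for detecting a mean difference delta between two
# groups (two-sided alpha, given power); textbook normal approximation.
from math import ceil
from statistics import NormalDist

def n_per_group(sigma, delta, alpha=0.05, power=0.80):
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

print(n_per_group(sigma=10, delta=5))  # heterogeneous population: 63 per group
print(n_per_group(sigma=5, delta=5))   # homogeneous breed: 16 per group
```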

A flow diagram on how to analyze and evaluate long-term stability data for appropriate quantitative test attributes from a study with a multifactor full or reduced design is provided in Appendix A. The statistical method used for data analysis should consider the stability study design to provide a valid statistical inference for the estimated retest period or shelf life. [Pg.69]
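A common form of such an analysis, sketched below for a single batch and a lower specification limit, estimates the shelf life as the time at which the one-sided 95% lower confidence bound for the mean regression line crosses the limit. The data, the limit, and the single-batch simplification are assumptions; a full analysis also addresses poolability across design factors and batches.

```python
# Shelf-life sketch: fit assay vs. time, find where the one-sided 95% lower
# confidence bound for the mean crosses the specification limit.
import numpy as np
from scipy import stats

months = np.array([0, 3, 6, 9, 12, 18])
assay = np.array([100.1, 99.6, 99.2, 98.7, 98.1, 97.2])  # % label claim
spec_limit = 95.0

n = len(months)
b, a = np.polyfit(months, assay, 1)                      # slope, intercept
resid = assay - (a + b * months)
s = np.sqrt(np.sum(resid ** 2) / (n - 2))
t_crit = stats.t.ppf(0.95, n - 2)                        # one-sided 95%
x_mean, sxx = months.mean(), np.sum((months - months.mean()) ** 2)

def lower_bound(t):
    return a + b * t - t_crit * s * np.sqrt(1.0 / n + (t - x_mean) ** 2 / sxx)

grid = np.linspace(0, 60, 601)                           # scan in 0.1-month steps
shelf_life = grid[np.argmax(lower_bound(grid) < spec_limit)]
print(f"estimated shelf life ~ {shelf_life:.1f} months")
```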

Validation without an independent test set. Each application of the adaptive wavelet algorithm has been applied to a training set and validated using an independent test set. If there are too few observations to allow for independent training and test sets, then cross-validation could be used to assess the prediction performance of the statistical method. Should this be the situation, it is necessary to mention that implementing a full cross-validation routine for the AWA would be an extremely demanding computational exercise. That is, it would be too time consuming to leave out one observation, build the AWA model, predict the deleted observation, and then repeat this leave-one-out procedure for each observation in turn. In the absence of an independent test set, a more realistic approach would be to perform cross-validation using the wavelet produced at termination of the AWA, but it is important to mention that this would not be a full validation. [Pg.200]
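For a model cheap enough to refit, the leave-one-out loop described above looks as follows; an ordinary least-squares model stands in here for the far more expensive AWA, and the data are synthetic.

```python
# Leave-one-out cross-validation: refit without point i, predict point i.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(15, 4))
y = X @ np.array([2.0, -1.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=15)

press = 0.0
for i in range(len(y)):
    keep = np.arange(len(y)) != i
    coef, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    press += (y[i] - X[i] @ coef) ** 2   # squared error on the held-out point

print("PRESS:", press)  # predicted residual sum of squares
```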

The observation of a system over a specific time period at the atomic level produces a considerable flood of data. It requires the calculation and recording of the full MD trajectory, including all solvent molecules. Yet even more time consuming are analyses of the emerging data; hence, efficient statistical methods are of utmost importance for extracting information. Several different water models have been developed for the explicit consideration of solvation effects in MD simulations [62]. [Pg.271]
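As a small example of the kind of statistical reduction such analyses rely on, the sketch below computes the normalized autocorrelation function of a scalar observable sampled along a trajectory; the "trajectory" here is a synthetic correlated series, not real MD output.

```python
# Normalized autocorrelation of a scalar observable along a trajectory.
import numpy as np

rng = np.random.default_rng(2)
x = np.zeros(5000)
for t in range(1, len(x)):                 # synthetic series with slow memory
    x[t] = 0.98 * x[t - 1] + rng.normal()

x -= x.mean()
acf = np.correlate(x, x, mode="full")[len(x) - 1:]
acf /= acf[0]
print(acf[:5])  # decays roughly as 0.98**lag
```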

There is a considerable impetus to accurately predict protein structures from sequence information because of the protein sequence/structure deficit that is a consequence of the genome and full-length cDNA sequencing projects. The molecular mechanical (MM) approach to modeling of protein structures has been discussed in Section 9.2, and protein secondary structure prediction from sequence by statistical methods has been treated in Section 9.5. The prediction of protein structure using bioinformatic resources will be described in this subsection. The approaches to protein structure prediction from amino acid sequences (Tsigelny, 2002; Webster, 2000) include ... [Pg.616]

The goal of the development of methods for a full statistical mapping of complex damage causes is to reduce expensive reliability prototype tests and technical analysis of field damage cases. Key aspects of... [Pg.797]

The description of complex, multiple damage causes with the industrial standard method for reliability analysis does not achieve satisfactory results. The development of a reliability analysis approach for full statistical mapping of complex damage causes, with the goal of reducing expenses, is based on a general process, which contains a structured sequence of methods and algorithms to analyse product reliability. [Pg.799]
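The excerpt does not spell out those algorithms themselves; as a small illustration of one statistical building block commonly used in reliability work of this kind, the sketch below fits a Weibull distribution to a set of invented times-to-failure.

```python
# Fit a Weibull distribution to observed failure times (illustrative data).
import numpy as np
from scipy import stats

failures_km = np.array([12e3, 18e3, 25e3, 31e3, 40e3, 55e3, 72e3])
shape, loc, scale = stats.weibull_min.fit(failures_km, floc=0)
print(f"Weibull shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} km")
# beta < 1 suggests early failures; beta > 1 suggests wear-out behaviour.
```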

A complete treatment of a small sample of molecular liquid from first principles is still beyond the reach of computational chemistry. In order to better understand what happens in a liquid or a solution at the molecular level, one is forced to adopt a simplified approach. One may perform a full statistical treatment of the sample, and then the representation of the molecule and the intermolecular forces has to be rather schematic. One may describe this approach as a true liquid of model molecules. Conversely, one may wish to analyze in greater detail the structural modifications of a molecule - the solute or a particular molecule of a pure liquid - when it is placed in a liquid environment. One is then led to adopt a quantum chemical treatment of the molecule and to replace its actual surroundings by a simple medium, usually a continuum, having the averaged properties of the liquid. One may speak of a model liquid of true molecules. The Self-Consistent Reaction Field method belongs to this approach. [Pg.79]


