Big Chemical Encyclopedia


Statistical assumptions

Also using chemical space as a framework, Agrafiotis [118] presented a very fast method for diversity analysis on the basis of simple assumptions, statistical sampling of outcomes, and principles of probability theory. This method presumes that the optimal coverage of a chemical space is that of uniform coverage. The central limit theorem of probability theory... [Pg.748]

McRae's approach and other similar approaches have been criticized by Brady and Carr, and more recently by Klamt, on the grounds that the orientational and electronic parts of the polarizability are not independent. Details are not given here but the reader is referred to their work, and to that of Tomasi et al. Brady and Carr, however, point out that in spite of the erroneous assumptions, statistically superior correlations are generally obtained with McRae's equation, compared to the corrected formulae. ... [Pg.355]

Since the accuracy of experimental data is frequently not high, and since experimental data are hardly ever plentiful, it is important to reduce the available data with care using a suitable statistical method and using a model for the excess Gibbs energy which contains only a minimum of binary parameters. Rarely are experimental data of sufficient quality and quantity to justify more than three binary parameters and, all too often, the data justify no more than two such parameters. When data sources (5) or (6) or (7) are used alone, it is not possible to use a three- (or more)-parameter model without making additional arbitrary assumptions. For typical engineering calculations, therefore, it is desirable to use a two-parameter model such as UNIQUAC. [Pg.43]

A quantitative theory of rate processes has been developed on the assumption that the activated state has a characteristic enthalpy, entropy and free energy; the concentration of activated molecules may thus be calculated using statistical mechanical methods. Whilst the theory gives a very plausible treatment of very many rate processes, it suffers from the difficulty of calculating the thermodynamic properties of the transition state. [Pg.402]
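The standard expression of this theory, the Eyring equation, is added here for reference (it is not part of the excerpt above):

$$ k = \frac{k_B T}{h}\, e^{-\Delta G^{\ddagger}/RT} = \frac{k_B T}{h}\, e^{\Delta S^{\ddagger}/R}\, e^{-\Delta H^{\ddagger}/RT} $$

where the enthalpy, entropy and free energy of activation are those of the activated state relative to the reactants.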

We start with the Helmholtz integral and we use the Kirchhoff treatment, as Beckmann and Spizzichino did [10]. Likewise, we shall make almost the same assumptions about the statistics of the radii h(φ, z) in order to find a way to deal with the integrals involved in the calculation. [Pg.663]

Thus the kinetic and statistical mechanical derivations may be brought into identity by means of a specific series of assumptions, including the assumption that the internal partition functions are the same for the two states (see Ref. 12). As discussed in Section XVI-4A, this last is almost certainly not the case because as a minimum effect some loss of rotational degrees of freedom should occur on adsorption. [Pg.609]

It seems appropriate to assume the applicability of equation (A2.1.63) to sufficiently dilute solutions of nonvolatile solutes and, indeed, to electrolyte species. This assumption can be validated by other experimental methods (e.g. by electrochemical measurements) and by statistical mechanical theory. [Pg.360]

In the statistical description of unimolecular kinetics, known as Rice-Ramsperger-Kassel-Marcus (RRKM) theory [4,7,8], it is assumed that complete IVR occurs on a timescale much shorter than that for the unimolecular reaction [9]. Furthermore, to identify states of the system as those for the reactant, a dividing surface [10], called a transition state, is placed at the potential energy barrier region of the potential energy surface. The assumption implicit in RRKM theory is described in the next section. [Pg.1008]

RRKM theory assumes a microcanonical ensemble of A vibrational/rotational states within the energy interval E to E + dE, so that each of these states is populated statistically with an equal probability [4]. This assumption of a microcanonical distribution means that the unimolecular rate constant for A only depends on energy, and not on the manner in which A is energized. If N(0) is the number of A molecules excited at t = ... [Pg.1008]
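The microcanonical rate constant implied by this statistical assumption is usually written in the standard RRKM form (added here for reference; it is not part of the excerpt above):

$$ k(E) = \frac{N^{\ddagger}(E - E_0)}{h\,\rho(E)} $$

where N‡(E - E0) is the sum of states of the transition state above the barrier E0, ρ(E) is the density of reactant states at energy E, and h is Planck's constant.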

The hypersurface formed from variations in the system's coordinates and momenta at H(p, q) = E is the microcanonical system's phase space, which, for a Hamiltonian with 3n coordinates, has a dimension of 6n - 1. The assumption that the system's states are populated statistically means that the population density over the whole surface of the phase space is uniform. Thus, the ratio of molecules at the dividing surface to the total molecules [dA(q1, p1)/A]... [Pg.1011]

When q = 1 the extensivity of the entropy can be used to derive the Boltzmann entropy equation S = k ln W in the microcanonical ensemble. When q ≠ 1, it is the odd property that the generalization of the entropy S_q is not extensive that leads to the peculiar form of the probability distribution. The non-extensivity of S_q has led to speculation that Tsallis statistics may be applicable to gravitational systems where interaction length scales comparable to the system size violate the assumptions underlying Gibbs-Boltzmann statistics. [4]... [Pg.199]
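For reference, the generalized (Tsallis) entropy referred to here is commonly written in the following standard form, which is not quoted from this source; the Gibbs-Boltzmann expression is recovered in the limit q → 1:

$$ S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i $$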

Do we expect this model to be accurate for a dynamics dictated by Tsallis statistics? A jump diffusion process that randomly samples the equilibrium canonical Tsallis distribution has been shown to lead to anomalous diffusion and Levy flights in the 5/3 < q < 3 regime. [3] Due to the delocalized nature of the equilibrium distributions, we might find that the microstates of our master equation are not well defined. Even at low temperatures, it may be difficult to identify distinct microstates of the system. The same delocalization can lead to large transition probabilities for states that are not adjacent in configuration space. This would be a violation of the assumptions of the transition state theory - that once the system crosses the transition state from the reactant microstate it will be deactivated and equilibrated in the product state. Concerted transitions between spatially far-separated states may be common. This would lead to a highly connected master equation where each state is connected to a significant fraction of all other microstates of the system. [9, 10]... [Pg.211]

Multiple linear regression is strictly a parametric supervised learning technique. A parametric technique is one which assumes that the variables conform to some distribution (often the Gaussian distribution); the properties of the distribution are assumed in the underlying statistical method. A non-parametric technique does not rely upon the assumption of any particular distribution. A supervised learning method is one which uses information about the dependent variable to derive the model. An unsupervised learning method does not. Thus cluster analysis, principal components analysis and factor analysis are all examples of unsupervised learning techniques. [Pg.719]
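As a minimal sketch of the parametric, supervised character of multiple linear regression, the following Python fragment fits y = Xb + e by ordinary least squares. The data arrays are hypothetical, and the Gaussian assumption enters only through how one would interpret the residuals and confidence intervals, not through the fit itself.

```python
import numpy as np

# Hypothetical training data: 5 observations, 2 descriptors (independent variables)
X = np.array([[1.0, 2.0],
              [2.0, 1.5],
              [3.0, 3.5],
              [4.0, 2.5],
              [5.0, 4.0]])
y = np.array([3.1, 4.0, 7.2, 7.9, 10.1])   # dependent variable: "supervised" means y is used in the fit

# Add an intercept column and solve the least-squares problem X_aug @ b = y
X_aug = np.column_stack([np.ones(len(X)), X])
b, residuals, rank, sv = np.linalg.lstsq(X_aug, y, rcond=None)

print("intercept and coefficients:", b)
print("residuals:", y - X_aug @ b)
```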

Examining transition state theory, one notes that the assumptions of Maxwell-Boltzmann statistics are not completely correct because some of the molecules reaching the activation energy will react, lose excess vibrational energy, and not be able to go back to reactants. Also, some molecules that have reacted may go back to reactants again. [Pg.166]

The rotational isomeric state (RIS) model assumes that conformational angles can take only certain values. It can be used to generate trial conformations, for which energies can be computed using molecular mechanics. This assumption is physically reasonable while allowing statistical averages to be computed easily. This model is used to derive simple analytic equations that predict polymer properties based on a few values, such as the preferred angle... [Pg.308]
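The following Python sketch illustrates the idea under simplifying assumptions: three illustrative rotational isomeric states (trans and two gauche) with assumed relative energies, sampled independently for each backbone bond. A full RIS treatment couples neighbouring bonds through statistical-weight matrices; that dependence is deliberately omitted here.

```python
import math
import random

# Hypothetical rotational isomeric states: torsion angle (degrees) and relative energy (kJ/mol)
STATES = {"t": (180.0, 0.0), "g+": (60.0, 2.1), "g-": (-60.0, 2.1)}
R = 8.314e-3  # gas constant, kJ/(mol K)

def state_weights(T=300.0):
    """Boltzmann statistical weights of the isomeric states at temperature T."""
    w = {name: math.exp(-energy / (R * T)) for name, (_, energy) in STATES.items()}
    total = sum(w.values())
    return {name: wi / total for name, wi in w.items()}

def trial_conformation(n_bonds, T=300.0):
    """Generate one trial conformation as a list of torsion angles, one per backbone bond."""
    weights = state_weights(T)
    names = list(STATES)
    chosen = random.choices(names, weights=[weights[n] for n in names], k=n_bonds)
    return [STATES[name][0] for name in chosen]

print(trial_conformation(10))  # e.g. [180.0, 60.0, 180.0, ...]
```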

Boltzmann distribution: statistical distribution of how many systems will be in various energy states when the system is at a given temperature
Born-Oppenheimer approximation: assumption that the motion of electrons is independent of the motion of nuclei
boson: a fundamental particle with an integer spin [Pg.361]
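For reference, the Boltzmann distribution entry corresponds to the standard expression (added here; it is not part of the glossary text):

$$ p_i = \frac{e^{-E_i/k_B T}}{\sum_j e^{-E_j/k_B T}} $$

where p_i is the fraction of systems found in the state of energy E_i at temperature T.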

In attempting to reach decisions, it is useful to make assumptions or guesses about the populations involved. Such assumptions, which may or may not be true, are called statistical hypotheses and in general are statements about the probability distributions of the populations. A common procedure is to set up a null hypothesis, denoted by H0, which states that there is no significant difference between two sets of data or that a variable exerts no significant effect. Any hypothesis which differs from a null hypothesis is called an alternative hypothesis, denoted by H1. [Pg.200]

The fact that each sample variance is related to its own population variance means that the sample variances being used for the calculation need not come from the same population. This is a significant departure from the assumptions inherent in the z, t, and χ² statistics. [Pg.204]
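A minimal sketch of the comparison described above, with the test statistic formed as the ratio of two sample variances that need not share a population. The data are hypothetical and scipy is assumed to be available; the null hypothesis is that the two population variances are equal.

```python
import numpy as np
from scipy import stats

# Hypothetical replicate results from two methods
a = np.array([10.2, 10.5, 9.9, 10.4, 10.1])
b = np.array([10.0, 10.9, 9.5, 10.7, 10.3, 9.8])

s2_a = np.var(a, ddof=1)   # sample variance of method A
s2_b = np.var(b, ddof=1)   # sample variance of method B

# F statistic with the larger variance in the numerator
if s2_a >= s2_b:
    F, dfn, dfd = s2_a / s2_b, len(a) - 1, len(b) - 1
else:
    F, dfn, dfd = s2_b / s2_a, len(b) - 1, len(a) - 1

# Two-tailed p-value for the null hypothesis of equal population variances
p = 2 * stats.f.sf(F, dfn, dfd)
print(f"F = {F:.2f}, p = {p:.3f}")
```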

Attempts have been made to devise mathematical functions to represent the distributions that are found experimentally. The mathematical treatment is necessarily based on the assumption that the number of particles in the sample is large enough for statistical considerations to be applicable. With the 500-member sample of the previous section one could not expect any more than approximate agreement between mathematical prediction and experiment. [Pg.27]
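One function often used for this purpose is the log-normal distribution, given here only as an illustration (it is not named in the excerpt):

$$ f(d) = \frac{1}{d\,\ln\sigma_g\,\sqrt{2\pi}} \exp\!\left[-\frac{(\ln d - \ln d_g)^2}{2\,(\ln\sigma_g)^2}\right] $$

where d is the particle diameter, d_g the geometric mean diameter, and σ_g the geometric standard deviation.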

The "feedback loop in the analytical approach is maintained by a quality assurance program (Figure 15.1), whose objective is to control systematic and random sources of error.The underlying assumption of a quality assurance program is that results obtained when an analytical system is in statistical control are free of bias and are characterized by well-defined confidence intervals. When used properly, a quality assurance program identifies the practices necessary to bring a system into statistical control, allows us to determine if the system remains in statistical control, and suggests a course of corrective action when the system has fallen out of statistical control. [Pg.705]

The principal tool for performance-based quality assessment is the control chart. In a control chart the results from the analysis of quality assessment samples are plotted in the order in which they are collected, providing a continuous record of the statistical state of the analytical system. Quality assessment data collected over time can be summarized by a mean value and a standard deviation. The fundamental assumption behind the use of a control chart is that quality assessment data will show only random variations around the mean value when the analytical system is in statistical control. When an analytical system moves out of statistical control, the quality assessment data is influenced by additional sources of error, increasing the standard deviation or changing the mean value. [Pg.714]
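A minimal sketch of such a property control chart: hypothetical quality assessment results are plotted in collection order against limits constructed from the mean and standard deviation (here ±2s warning and ±3s control limits, a common but not universal choice).

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical quality assessment results (e.g. recovery of a check standard, %), in collection order
results = np.array([99.8, 100.3, 99.5, 100.1, 100.6, 99.9, 100.2, 99.4, 100.0, 100.8])

mean = results.mean()
s = results.std(ddof=1)

plt.plot(range(1, len(results) + 1), results, "o-", label="quality assessment data")
plt.axhline(mean, color="black", label="mean")
for k, style in [(2, "--"), (3, ":")]:        # warning (2s) and control (3s) limits
    plt.axhline(mean + k * s, color="red", linestyle=style)
    plt.axhline(mean - k * s, color="red", linestyle=style)
plt.xlabel("sample number (collection order)")
plt.ylabel("result")
plt.legend()
plt.show()
```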

Statistical considerations make it possible to test the assumption of independent additions. Let us approach this topic by considering an easier problem: coin tossing. Under conditions where two events are purely random, as in tossing a fair coin, the probability of a specific sequence of outcomes is given by the product of the probabilities of the individual events. The probability of tossing a head followed by a head, indicated HH, is given by... [Pg.454]
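Completing the fair-coin example from the product rule stated above:

$$ P(\mathrm{HH}) = P(\mathrm{H}) \times P(\mathrm{H}) = \tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{4} $$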

Use of random flight statistics to derive rg for the coil assumes the individual segments exclude no volume from one another. While physically unrealistic, this assumption makes the derivation mathematically manageable. Neglecting this volume exclusion means that coil dimensions are underestimated by the random flight model, but this effect can be offset by applying the result to a solvent in which polymer-polymer contacts are somewhat favored over polymer-solvent contacts. [Pg.560]
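For reference, the standard random-flight (freely jointed chain) results implied here are, for a chain of n segments of length l in the long-chain limit (these expressions are not quoted from the excerpt):

$$ \langle r^2 \rangle_0 = n l^2, \qquad r_g^2 = \frac{\langle r^2 \rangle_0}{6} = \frac{n l^2}{6} $$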

Distribution of Carbon. Estimation of the amount of biomass carbon on the earth's surface is a problem in global statistical analysis. Although reasonable projections have been made using the best available data, maps, surveys, and a host of assumptions, the validity of the results is impossible to support with hard data because of the nature of the problem. Nevertheless, such analyses must be performed to assess the feasibility of biomass energy systems and the gross types of biomass available for energy applications. [Pg.9]

Decision Process. In many cases, the decision regarding the need for exposure reduction measures is obvious and no formal statistical procedure is necessary. However, as exposure criteria are lowered, and control becomes more difficult, close calls become more common, and a logical decision-making process is needed. A typical process is shown in Figure 2. Even when decision making is easy it is useful to remember the process and the assumptions involved. Based on an evaluation, decisions are made regarding control. The evaluation and decision steps cannot be separated because the conduct of the evaluation, the strategy, measurement method, and data collection are all a part of the decision process. [Pg.108]

Data Analysis. First, the raw data must be converted to concentrations over an appropriate time span. When sample periods do not correspond to the averaging time of the exposure limit, some assumptions must be made about unsampled periods. It may be necessary to test the impact of various assumptions on the final decision. Next, test statistics (confidence limit, etc.) (Fig. 3) are calculated and compared to test criteria to make an inference about a hypothesis. [Pg.109]
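A minimal sketch of the kind of test statistic mentioned here: a one-sided upper confidence limit on the mean exposure is compared with an exposure limit. The measurements, the limit, and the assumption of approximate normality are all illustrative; real sampling strategies differ in detail.

```python
import numpy as np
from scipy import stats

# Hypothetical full-shift exposure measurements (mg/m3) and an assumed exposure limit
exposures = np.array([0.42, 0.55, 0.38, 0.61, 0.47])
exposure_limit = 0.60

n = len(exposures)
mean = exposures.mean()
s = exposures.std(ddof=1)

# One-sided 95% upper confidence limit on the mean (t distribution, n - 1 degrees of freedom)
ucl95 = mean + stats.t.ppf(0.95, df=n - 1) * s / np.sqrt(n)

print(f"mean = {mean:.2f}, 95% UCL = {ucl95:.2f} mg/m3")
print("inference:", "cannot rule out overexposure" if ucl95 > exposure_limit else "compliance indicated")
```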

Interpretation. Whereas statistical tests establish whether results are or are not different from (over) an exposure criterion, the generality of this outcome must be judged. What did the samples represent? May the outcome, which is inferred to cover both sampled and unsampled periods, be legitimately extrapolated into the future? In other words, is the usual assumption of a stationary mean valid? All of these questions are answered by judgment and experience applied to the observations made at the time of sampling, and the answers are used to interpret the quantitative results. [Pg.109]

D. R. Cox, Planning of Experiments, John Wiley & Sons, Inc., New York, 1958. This book provides a simple survey of the principles of experimental design and of some of the most useful experimental schemes. It tries "as far as possible, to avoid statistical and mathematical technicalities and to concentrate on a treatment that will be intuitively acceptable to the experimental worker, for whom the book is primarily intended." As a result, the book emphasizes basic concepts rather than calculations or technical details. Chapters are devoted to such topics as "Some key assumptions," "Randomization," and "Choice of units, treatments, and observations."... [Pg.524]

Statistical Criteria. Sensitivity analysis does not consider the probability of various levels of uncertainty or the risk involved (28). In order to treat probability, statistical measures are employed to characterize the probability distributions. Because most distributions in profitability analysis are not accurately known, the common assumption is that normal distributions are adequate. The distribution of a quantity then can be characterized by two parameters, the expected value and the variance. These usually have to be estimated from meager data. [Pg.451]
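A minimal sketch of the two-parameter characterization described above: the expected value and variance are estimated from a small (hypothetical) set of estimates and used to define an assumed normal distribution, which can then be sampled in a simple Monte Carlo profitability calculation.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical, meager estimates of an uncertain annual cash flow (arbitrary units)
estimates = np.array([12.0, 15.0, 10.5, 13.5])

expected_value = estimates.mean()    # expected value
variance = estimates.var(ddof=1)     # variance

# Assume a normal distribution with these two parameters and sample it
samples = rng.normal(expected_value, np.sqrt(variance), size=10_000)
print(f"expected value = {expected_value:.2f}, variance = {variance:.2f}")
print(f"estimated probability the cash flow falls below 10: {(samples < 10).mean():.3f}")
```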

