Big Chemical Encyclopedia


Statistical theories assumptions

Further characterization of P(t) and Pg(t) requires details of the dynamics, here formulated in terms of ergodic-theory-based assumptions. However, since ergodic theory [11] considers dynamics on a bounded manifold, it is not directly applicable to the unbounded phase space associated with molecular decay. To resolve this problem we first introduce a related auxiliary bounded system upon which conditions of chaos are imposed, and then determine their effect on the molecular decay; details of this construction are provided elsewhere [46]. What we show is that adopting this condition leads to a new model for decay, the delayed lifetime gap model (DLGM) for P(t) and Pg(t). The simple statistical theory assumption that Pg(t) and P(t) are exponential with rate ks(E) is shown to arise only as a limiting case. [Pg.395]

Figure 2.15. The limit of detection LOD: the minimum signal/noise ratio necessary according to two models (ordinate) is plotted against log10(n) under the assumption of evenly spaced calibration points. The three sets of curves are for p = 0.1 (A), 0.05 (B), and 0.02 (C). The correct statistical theory is given by the fine points, while the model presented here is depicted with coarser dots. The widely used S/N = 3...6 models would be represented by horizontals at y = 3...6.
Finally, accurate theoretical kinetic and dynamical models are needed for calculating SN2 rate constants and product energy distributions. The comparisons described here, between experimental measurements and statistical theory predictions for Cl- + CH3Br, show that statistical theories may be incomplete theoretical models for SN2 nucleophilic substitution. Accurate kinetic and dynamical models for SN2 nucleophilic substitution might be formulated by introducing dynamical attributes into the statistical models or by developing models based only on dynamical assumptions. [Pg.154]

To explain the difference between the experimental results and theory, Doherty et al. (4) have given an empirical and a theoretical hypothesis. The theoretical hypothesis concerns the meaning to be attached to the concept of the "equivalent random link" in the statistical theory of the randomly-jointed chain. According to Doherty et al., the assumption that the optical properties of the chain are describable by a randomly jointed model, using the same value of n as for the description of stress, has no strictly logical foundation. [Pg.470]

The frequency with which the transition state is transformed into products, ν‡, can be thought of as a typical unimolecular rate constant; no barrier is associated with this step. Various points of view have been used to calculate this frequency, and all rely on the assumption that the internal motions of the transition state are thermally equilibrated. Thus, the motion along the reaction coordinate is treated as thermal translational motion between the product fragments (or as a vibrational motion along an unstable potential). Statistical theories (such as those used to derive the Maxwell-Boltzmann distribution of velocities) lead to the expression ... [Pg.140]
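For reference, the canonical transition-state-theory rate expression that this type of statistical argument produces has the familiar form below; this standard textbook form may differ in detail from the expression elided in the excerpt above.

```latex
k(T) = \frac{k_{\mathrm{B}}T}{h}\,\frac{Q^{\ddagger}}{Q_{\mathrm{R}}}\,
       e^{-E_{0}/k_{\mathrm{B}}T}
```

Here Q‡ is the partition function of the transition state (reaction coordinate omitted), Q_R that of the reactants, and E0 the barrier height.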

The QET is not the only theory in the field; indeed, several apparently competing statistical theories describing the rate constant of a unimolecular reaction have been formulated [10,14]. Unfortunately, none of these theories has been able to describe all reactions of a given ion quantitatively. Nonetheless, QET is well established, and even its simplified form allows sufficient insight into the behavior of isolated ions. Thus, we start the chapter from the basic assumptions of QET. Following this trail will lead us from the neutral molecule to ions, and over transition states and reaction rates to fragmentation products, and thus through the basic concepts and definitions of gas-phase ion chemistry. [Pg.14]

When the statistically sophisticated psychologists realized what I was doing, they had a field day pointing out my failings: unjustified assumptions, violations of statistical theory, and other mathematical crimes. They talked about ordinal scales versus ratio scales and scolded me for not using analysis of variance instead of chi-square and Student's t-tests of significance. [Pg.70]

Simultaneous IPN. According to the statistical theory of rubber elasticity, the elasticity modulus (Eg), a measure of the material rigidity, is proportional to the concentration of elastically active segments (Vg) in the network [3,4]. For negligible perturbation of the strand length at rest due to crosslinking (a reasonable assumption for the case of a simultaneous IPN), the modulus is given by ... [Pg.62]
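The proportionality described above can be sketched numerically. The following is a minimal illustration of the affine rubber-elasticity estimate E ≈ 3νkBT, which is the standard statistical-theory result for the modulus of an ideal network; the strand concentration used here is a hypothetical illustrative value, and this need not be the exact expression elided in the excerpt.

```python
# Affine network model of rubber elasticity: the modulus is proportional
# to the concentration of elastically active strands, E ~ 3 * nu * kB * T.
# The strand concentration below is a hypothetical illustrative value.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def rubber_modulus(nu_strands_per_m3: float, temperature_k: float) -> float:
    """Young's modulus (Pa) from the affine network model, E = 3*nu*kB*T."""
    return 3.0 * nu_strands_per_m3 * K_B * temperature_k

# A typical elastomer-like strand density, ~1e26 strands/m^3 at 300 K,
# gives a modulus on the order of 1 MPa.
E = rubber_modulus(1e26, 300.0)
```

Doubling the crosslink (strand) concentration doubles the predicted modulus, which is the sense in which Eg measures network rigidity.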

Statistical theories treat the decomposition of the reaction complex of ion-molecule interactions in a manner analogous to that employed for unimolecular decomposition reactions [466]. One approach is that taken by the quasiequilibrium theory (QET) [467]. Its basic assumptions are (1) the rate of dissociation of the ion is slow relative to the rate of redistribution of energy among the internal degrees of freedom, both electronic and vibrational, of the ion, and (2) each dissociation process may be described as a motion along a reaction coordinate separable from all other internal... [Pg.199]

We note from Table 1.19 that the sums of squares between rows and between columns do not add up to the defined total sum of squares. The difference is called the sum of squares for error, since it arises from the experimental error present in each observation. Statistical theory shows that this error term is an unbiased estimate of the population variance, regardless of whether the hypotheses are true or not. Therefore, we construct an F-ratio using the between-rows mean square divided by the mean square for error. Similarly, to test the column effects, the F-ratio is the between-columns mean square divided by the mean square for error. We will reject the hypothesis of no difference in means when these F-ratios become too much greater than 1. The ratios would be 1 if all the means were identical and the assumptions of normality and random sampling held. Now let us try the following example that illustrates two-way analysis of variance. [Pg.75]
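The decomposition described above can be sketched in a few lines. This is a minimal two-way ANOVA without replication (one observation per cell); the data values are hypothetical, not taken from Table 1.19.

```python
# Two-way ANOVA without replication: partition the total sum of squares
# into row, column, and error components, then form the two F-ratios.
# The 3x3 data table below is hypothetical.

def two_way_anova(table):
    r, c = len(table), len(table[0])
    grand = sum(sum(row) for row in table) / (r * c)
    row_means = [sum(row) / c for row in table]
    col_means = [sum(table[i][j] for i in range(r)) / r for j in range(c)]
    ss_total = sum((table[i][j] - grand) ** 2
                   for i in range(r) for j in range(c))
    ss_rows = c * sum((m - grand) ** 2 for m in row_means)
    ss_cols = r * sum((m - grand) ** 2 for m in col_means)
    ss_error = ss_total - ss_rows - ss_cols  # the "sum of squares for error"
    ms_rows = ss_rows / (r - 1)
    ms_cols = ss_cols / (c - 1)
    ms_error = ss_error / ((r - 1) * (c - 1))
    return ms_rows / ms_error, ms_cols / ms_error

f_rows, f_cols = two_way_anova([[10.0, 11.0, 12.0],
                                [11.0, 12.0, 14.0],
                                [13.0, 14.0, 15.0]])
```

Both F-ratios are then compared against the F-distribution with the corresponding degrees of freedom; values far above 1 lead to rejection of the no-difference hypothesis, exactly as the text describes.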

The essential assumption of all statistical theories is that during the long lifetime the total available energy is uniformly distributed among... [Pg.250]

RRKM theory, an approach to the calculation of the rate constant of indirect reactions that, essentially, is equivalent to transition-state theory. The reaction coordinate is identified as being the coordinate associated with the decay of an activated complex. It is a statistical theory based on the assumption that every state, within a narrow energy range of the activated complex, is populated with the same probability prior to the unimolecular reaction. The microcanonical rate constant k(E) is given by an expression that contains the ratio of the sum of states for the activated complex (with the reaction coordinate omitted) and the total density of states of the reactant. The canonical k(T) unimolecular rate constant is given by an expression that is similar to the transition-state theory expression of bimolecular reactions. [Pg.169]
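The expression described above, k(E) as a ratio of the transition-state sum of states to the reactant density of states, can be sketched in the classical harmonic limit. This is an illustrative sketch only: the classical state-counting formulas are a standard approximation, and the frequencies and barrier used in any call would be hypothetical inputs.

```python
# Classical-harmonic sketch of the RRKM microcanonical rate constant,
#   k(E) = N‡(E - E0) / (h * rho(E)),
# where N‡ is the sum of states of the activated complex (reaction
# coordinate omitted) and rho is the reactant density of states.
import math

H = 6.62607015e-34  # Planck constant, J*s

def classical_sum_of_states(energy, freqs_hz):
    """Classical harmonic N(E) = E^s / (s! * prod(h*nu_i)), s = len(freqs)."""
    s = len(freqs_hz)
    denom = math.factorial(s) * math.prod(H * f for f in freqs_hz)
    return energy ** s / denom

def classical_density_of_states(energy, freqs_hz):
    """Classical harmonic rho(E) = E^(s-1) / ((s-1)! * prod(h*nu_i))."""
    s = len(freqs_hz)
    denom = math.factorial(s - 1) * math.prod(H * f for f in freqs_hz)
    return energy ** (s - 1) / denom

def rrkm_rate(energy, barrier, reactant_freqs, ts_freqs):
    """k(E) in 1/s; ts_freqs omit the reaction-coordinate mode."""
    if energy <= barrier:
        return 0.0
    n_ts = classical_sum_of_states(energy - barrier, ts_freqs)
    rho = classical_density_of_states(energy, reactant_freqs)
    return n_ts / (H * rho)
```

A useful sanity check: for a single reactant oscillator of frequency ν and no remaining transition-state modes, the formula collapses to k(E) = ν above threshold, the expected one-mode limit.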

The basic assumption in statistical theories is that the initially prepared state, in an indirect (true or apparent) unimolecular reaction A(E) → products, has relaxed prior to reaction (via IVR) such that any distribution of the energy E over the internal degrees of freedom occurs with the same probability. This is illustrated in Fig. 7.3.1, where we have shown a constant-energy surface in the phase space of a molecule. Note that the assumption is equivalent to the basic equal a priori probabilities postulate of statistical mechanics for a microcanonical ensemble, where every state within a narrow energy range is populated with the same probability. This uniform population of states describes the system regardless of where it is on the potential energy surface associated with the reaction. [Pg.184]

If the system under study consists of only weakly interacting or noninteracting molecules (gas phase) the thermodynamical (or other observable) quantities can be obtained from single molecule calculations by relying upon the corresponding statistical theory and the assumptions inherent for it. In this case the number of atoms N in the molecule can be thought to be the number of atoms in the entire system. The procedure is as follows. First, a search for local minima of the potential energy... [Pg.5]

The elaborate statistical theory of phase transformations of chemical reaction (1) makes it possible to explain and substantiate the formation of fulleride hydride phases, and then of fullerite, with increasing temperature. The calculation of the free energies of the phases has been performed using rough, simplified assumptions. The dependence of the free energies of the phases on their composition, temperature, the order parameter in the fullerene subsystem, and the energetic constants has been found. The evaluation of the energetic constants has been carried out using experimental data for the concentration and temperature ranges in which each phase is realized. [Pg.18]

All the theories require some assumptions to be made about the nature of some critical reaction intermediate which may be an activated complex located at a barrier in the potential surface or at a barrier formed in the long-range attractive potential by the orbital angular momentum of the reactants or product. The determination of the correct location for the critical transition state is a major problem in applying statistical theories to chemical reactions. The underlying assumption of statistical theories is that once the transition state is passed in the direction of reaction products, it is not recrossed. The nature of the transition state determines... [Pg.379]

The statistical theories provide a relatively simple model of chemical reactions, as they bypass the complicated problem of detailed single-particle and quantum mechanical dynamics by introducing probabilistic assumptions. Their applicability is, however, also tied to the collisional mechanism of the process in question. The statistical phase space theories, associated mostly with the work of Light (in Ref. 6) and Nikitin (see Ref. 17), contain the assumption of long-lived complex formation and are thus best suited for the description of complex-mode processes. On the other hand, the direct character of the process is an implicit dynamical assumption of transition-state theory. [Pg.266]

The last issue is the possibility of nonstatistical dynamics. TST and RRKM are statistical theories; if their underlying statistical assumption holds, both can predict reaction rates, leading to what is called statistical dynamics. Of relevance to this chapter is that, for statistical dynamics to occur, any intermediates along the reaction pathway must live long enough for energy to be statistically distributed over all of the vibrational modes. If this redistribution cannot occur, then TST and RRKM will fail to predict reaction rates. A principal characteristic of nonstatistical dynamics is a reaction rate much faster than that predicted by statistical theories. [Pg.507]

Condensed Phase Isotope Effects. The chromatographic results discussed later in the paper will be interpreted with the use of the statistical theory of isotope effects in condensed systems attributed to Bigeleisen (6). With the application of a cell model to the condensed phase and the assumption of harmonic frequencies for all 3N modes the theory leads to ... [Pg.100]

An equation for the modulus of ideal rubber was derived from statistical theory that can be credited to several scientists, including Flory, and Guth and James (Sperling, 1986). A key assumption in the derivation of the equation is that the networks are Gaussian. [Pg.347]

In cases like D2CO or NO2, comparison with experimental data on a state-specific level is ruled out entirely, and one has to retreat to more averaged quantities like the average dissociation rate, ⟨k⟩, and the distribution of rates, Q(k). If the dynamics is ergodic — the basic assumption of all statistical theories — one can derive a simple expression for Q(k), which had been established in nuclear physics in order to describe the neutron emission rates of heavy nuclei [280]. These concepts have since developed into the field of random matrix theory (RMT) and statistical spectroscopy [281-283] and have also found applications in the dissociation of energized molecules [121,284-286]. [Pg.184]
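The RMT rate distribution alluded to here is, under the usual assumptions, the Porter-Thomas form, in which the scaled rate x = k/⟨k⟩ is chi-squared-distributed with one degree of freedom, i.e. x = z² for a standard normal z. A minimal sampling sketch (the mean rate and sample size are arbitrary illustrative choices):

```python
# Sample dissociation rates from the Porter-Thomas distribution:
# k = <k> * z^2 with z ~ N(0, 1), so k/<k> is chi-squared with one
# degree of freedom. A fixed seed makes the sketch reproducible.
import random

def sample_porter_thomas(mean_rate, n, seed=0):
    """Draw n rates k = mean_rate * z^2, z standard normal."""
    rng = random.Random(seed)
    return [mean_rate * rng.gauss(0.0, 1.0) ** 2 for _ in range(n)]

rates = sample_porter_thomas(1.0, 200_000)
sample_mean = sum(rates) / len(rates)  # should approach <k> = 1.0
```

The distribution is strongly peaked at small k, so a molecule's state-specific rates fluctuate widely about ⟨k⟩, which is why only averaged quantities are experimentally meaningful in this regime.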

This model contradicts the statistical theory of rubber elasticity, and the artificial assumption of a = 3kT/b has been used. [Pg.550]

The main focus in the application of this beautiful statistical theory is usually placed on the identification of the transition state, the number of which is usually supposed to be one for a single reaction channel, and the evaluation of the relevant partition functions and related quantities. On the other hand, people are often interested in the theoretical foundation of this statistical assumption... [Pg.34]

Statistical theory must also be regarded not only with respect but also with healthy skepticism. It should be remembered that the development of statistics, as they have come to be applied to clinical trials, arose from a variety of nonmammalian biological sources. Experimental agriculture stimulated the early giants (Drs. Fisher and Yates) to explore probability density functions. While epidemiological studies have confirmed much that is similar in human populations, it is unknown whether these probability density functions apply uniformly to all disease states. Any statistical test that we employ makes assumptions that are usually not stated. [Pg.105]

Statistical theory teaches that under the assumption that the population means of the two groups are the same (i.e. if H0 is true), the distribution of the variable T depends only on the sample size, but not on the value of the common mean or on the measurements' population variance, and thus can be tabulated independently of the particulars of any given experiment. This is the so-called Student's t-distribution. Using tables of the t-distribution, we can calculate the probability that a variable T calculated as above assumes a value greater than or equal to 4.7, the value obtained in our example, given that H0 is true. This probability is <0.0001. Thus, if H0 is true, the result obtained in our experiment is extremely unlikely, although not impossible. We are forced to choose between two possible explanations for this. One is that a very unlikely event occurred. The second is that the result of our experiment is not a fluke; rather, the difference MB − MA is a positive number, sufficiently large to make this outcome a likely event. We elect the latter explanation and reject H0 in favor of the alternative hypothesis H1. [Pg.328]
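The statistic T described above can be computed directly. This is a minimal pooled-variance (equal-variance) two-sample Student's t statistic; the two data sets are hypothetical and unrelated to the experiment in the excerpt.

```python
# Two-sample Student's t statistic with pooled variance:
#   T = (mean(b) - mean(a)) / (s_p * sqrt(1/n_a + 1/n_b))
# where s_p^2 is the pooled sample variance. Data below are hypothetical.
import statistics

def pooled_t_statistic(a, b):
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)    # pooled variance
    return (statistics.mean(b) - statistics.mean(a)) / (sp2 * (1/na + 1/nb)) ** 0.5

t = pooled_t_statistic([4.9, 5.1, 5.0, 4.8], [5.4, 5.6, 5.5, 5.7])
```

The resulting value of T is then referred to the t-distribution with n_a + n_b − 2 degrees of freedom; as in the text, a large T makes H0 an implausible explanation of the data.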



© 2024 chempedia.info