
Distribution function theorem

In general, the distribution function changes in time because of the underlying motion of the hard spheres. Consider first the nonphysical case where there are no collisions. Phase-space conservation, or Liouville's Theorem [bal75], assures us that... [Pg.476]
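
In the collisionless case referred to, the conservation statement can be written as follows (a standard form in generic notation; the excerpt's own continuation is not shown, and with no external forces the force term drops out).

```latex
% Collisionless (streaming) limit: the distribution function is
% conserved along phase-space trajectories
\[
  \frac{\partial f}{\partial t}
  + \mathbf{v}\cdot\nabla_{\mathbf{r}} f
  + \frac{\mathbf{F}}{m}\cdot\nabla_{\mathbf{v}} f = 0 .
\]
```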

Boltzmann's H-Theorem. One of the most striking features of transport theory is seen from the result that, although collisions are completely reversible phenomena (since they are based upon the reversible laws of mechanics), the solutions of the Boltzmann equation depict irreversible phenomena. This effect is most clearly seen from a consideration of Boltzmann's H-function, which will be discussed here for a gas in a uniform state (no dependence of the distribution function on position and no external forces) for simplicity. [Pg.17]
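
For reference, the standard textbook form of the quantity being discussed, for a spatially uniform gas with no external forces, is sketched below (generic notation, not necessarily that of the excerpted source).

```latex
% Boltzmann's H-function for a uniform gas and its monotone decay
\[
  H(t) = \int f(\mathbf{v},t)\,\ln f(\mathbf{v},t)\;d^{3}v ,
  \qquad
  \frac{dH}{dt} \le 0 ,
\]
% with equality only when f is the Maxwell-Boltzmann distribution;
% the monotone decrease of H is the irreversibility referred to above.
```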

The results just obtained are special cases of a theorem that shows how a large class of time averages can be calculated in terms of the distribution function. Before demonstrating this theorem, it will be convenient for us to first discuss some useful properties of distribution functions. The most important of these are... [Pg.107]

Random Variables. An interesting and useful interpretation of the theorem of averages is to regard it as a means for calculating the distribution functions of certain time functions Y(t) that are related to a time function X(t) whose distribution function is known. More precisely, if Y(t) is of the form Y(t) = φ[X(t)], then the theorem of averages enables us to calculate the distribution function of Y(t)... [Pg.114]

The notion of the distribution function of a random variable is also useful in connection with problems where it is not possible or convenient to subject the underlying function X(t) to direct measurements, but where certain derived time functions of the form Y(t) = φ[X(t)] are available for observation. The theorem of averages then tells us what averages of X(t) it is possible to calculate when all that is known is the distribution function of Y(t). The answer is quite simple: if f denotes (almost) any real-valued function of a real variable, then all X averages of the form... [Pg.118]
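
A sketch of the relation the excerpt is building toward, written in generic notation (the symbols F_X, f, g, and φ are illustrative, not the source's own): averages of any function of Y(t) = φ[X(t)] reduce to integrals against the known distribution function of X.

```latex
% Theorem of averages in generic notation: time averages of functions of
% X(t), and of Y(t) = phi[X(t)], computed from the distribution F_X of X
\[
  \overline{f(X)} = \int_{-\infty}^{\infty} f(x)\,dF_X(x),
  \qquad
  \overline{g(Y)} = \overline{g\bigl(\varphi(X)\bigr)}
  = \int_{-\infty}^{\infty} g\bigl(\varphi(x)\bigr)\,dF_X(x).
\]
```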

One way of avoiding the difficulty of having to measure, or in some way specify, a complete distribution function is to restrict attention to those theorems or relationships that do not depend on the detailed shape of the distribution function, but rather depend only on certain more easily measured parameters of the distribution function. A convenient and widely used set of such parameters is the moments of the distribution function. In the case of first order distribution... [Pg.119]
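
In terms of a first-order probability density p(x), the parameters referred to are the moments (standard definitions; the source's own equation numbers are not reproduced).

```latex
% nth moment and nth central moment of a first-order distribution
\[
  m_n = \int_{-\infty}^{\infty} x^{n}\,p(x)\,dx ,
  \qquad
  \mu_n = \int_{-\infty}^{\infty} (x - m_1)^{n}\,p(x)\,dx .
\]
```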

All averages of the form (3-96) can be calculated in terms of a canonical set of averages called joint distribution functions by means of an extension of the theorem of averages proved in Section 3.3. To this end, we shall define the nth-order distribution function of X for time spacings τ1 < τ2 < ... < τn by the equation,... [Pg.132]

The multidimensional theorem of averages can be used to calculate the higher-order joint distribution functions of derived sets of time functions, each of which is of the form... [Pg.141]

Once again, it should be emphasized that the functional form of a set of random variables is important only insofar as it enables us to calculate their joint distribution function in terms of other known distribution functions. Once the joint distribution function of a group of random variables is known, no further reference to their functional form is necessary in order to use the theorem of averages for the calculation of any time average of interest in connection with the given random variables. [Pg.144]

Our next result concerns the central limit theorem, which places in evidence the remarkable behavior of the distribution function of the sum when n is a large number. We shall now state and sketch the proof of a version of the central limit theorem that is pertinent to sums of identically distributed [p_i(x) = p_1(x), i = 1, 2, ...], statistically independent random variables. To simplify the statement of the theorem, we shall introduce the normalized sum s defined by... [Pg.157]

The central limit theorem thus states the remarkable fact that the distribution function of the normalized sum of identically distributed, statistically independent random variables approaches the Gaussian distribution function as the number of summands approaches infinity... [Pg.157]
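
A minimal numerical illustration of this statement (a sketch, not taken from the excerpted source; the uniform summands and sample sizes are arbitrary choices): draw n independent, identically distributed variables, form the normalized sum, and compare its empirical distribution function with the Gaussian one.

```python
import numpy as np
from math import erf, sqrt

def normalized_sums(n, trials=100_000, rng=None):
    """Samples of the normalized sum of n iid Uniform(0, 1) random variables."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.random((trials, n))            # iid Uniform(0, 1) summands
    mu, sigma = 0.5, sqrt(1.0 / 12.0)      # mean and standard deviation of Uniform(0, 1)
    return (x.sum(axis=1) - n * mu) / (sigma * sqrt(n))

def standard_normal_cdf(z):
    """Distribution function of the standard Gaussian."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

if __name__ == "__main__":
    # The empirical distribution function of the normalized sum approaches
    # the Gaussian distribution function as the number of summands n grows.
    for n in (2, 10, 100):
        s = normalized_sums(n)
        for z in (-1.0, 0.0, 1.0):
            print(f"n={n:4d}  F_n({z:+.1f})={np.mean(s <= z):.4f}"
                  f"  Phi({z:+.1f})={standard_normal_cdf(z):.4f}")
```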

Notice that those distribution functions that satisfy Eq. (4-179) still constitute a convex set, so that optimization of the E,R curve is still straightforward by numerical methods. It is to be observed that the choice of an F(x) satisfying a constraint such as Eq. (4-179) defines an ensemble of codes; the individual codes in the ensemble will not necessarily satisfy the constraint. This is unimportant practically: since each digit of each code word is chosen independently over the ensemble, it is most unlikely that the average power of a code will differ drastically from the average power of the ensemble. It is possible to combine the central limit theorem and the techniques used in the last two paragraphs of Section 4.7 to show that a code exists for which each code word satisfies... [Pg.242]

The important information about the properties of smectic layers can be obtained from the relative intensities of the (00n) Bragg peaks. The electron density profile along the layer normal is described by a spatial distribution function p(z). The function p(z) may be represented as a convolution of the molecular form factor F(z) and the molecular centre of mass distribution f(z) across the layers [43]. The function F(z) may be calculated on the basis of a certain model for layer organization [37, 48]. The distribution function f(z) is usually expanded into a Fourier series f(z) = Σn τn cos(nq0z), where the coefficients τn = <cos(nq0z)> are the de Gennes-McMillan translational order parameters of the smectic A phase. According to the convolution theorem, the intensities of the (00n) reflections from the smectic layers are simply proportional to the square of the translational order parameters τn... [Pg.209]
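
Collecting the relations described in this excerpt in one place (a sketch; prefactors and normalization depend on convention, with q0 = 2π/d and d the layer spacing).

```latex
% Density profile as a convolution, its Fourier expansion, and the
% resulting (00n) Bragg intensities (q_0 = 2\pi/d)
\[
  \rho(z) = \int F(z - z')\,f(z')\,dz' , \qquad
  f(z) \propto 1 + 2\sum_{n\ge 1}\tau_n \cos(n q_0 z) , \qquad
  \tau_n = \langle\cos(n q_0 z)\rangle ,
\]
\[
  I_{00n} \;\propto\; \lvert F(n q_0)\rvert^{2}\,\tau_n^{2} .
\]
```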

One may also show that MPC dynamics satisfies an H theorem and that any initial velocity distribution will relax to the Maxwell-Boltzmann distribution [11]. Figure 2 shows simulation results for the velocity distribution function that confirm this result. In the simulation, the particles were initially uniformly distributed in the volume and had the same speed v = 1 but different random directions. After a relatively short transient the distribution function adopts the Maxwell-Boltzmann form shown in the figure. [Pg.95]
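
A minimal two-dimensional sketch of this kind of relaxation test (not the simulation of Ref. [11]; cell size, rotation angle, and time step are arbitrary assumptions, and the random grid shift used in full MPC implementations is omitted): particles start with unit speed and random directions, and repeated multiparticle-collision steps, which rotate velocities relative to the cell mean, drive each velocity component toward a Gaussian, i.e. the speeds toward the Maxwell-Boltzmann form.

```python
import numpy as np

def mpc_relaxation(n_particles=10_000, n_cells=20, alpha=np.pi / 2,
                   steps=200, box=1.0, seed=0):
    """Relax an equal-speed velocity distribution with multiparticle-collision steps."""
    rng = np.random.default_rng(seed)
    pos = rng.random((n_particles, 2)) * box
    theta = rng.random(n_particles) * 2 * np.pi
    vel = np.column_stack((np.cos(theta), np.sin(theta)))   # all speeds = 1 initially

    dt = 0.1 * box / n_cells
    for _ in range(steps):
        pos = (pos + vel * dt) % box                         # free streaming, periodic box
        # Assign particles to collision cells (random grid shift omitted in this sketch)
        cell = np.floor(pos / box * n_cells).astype(int) % n_cells
        idx = cell[:, 0] * n_cells + cell[:, 1]
        # Rotate velocities relative to the cell mean by +/- alpha, cell by cell;
        # this conserves momentum and kinetic energy within each cell.
        for c in np.unique(idx):
            members = np.where(idx == c)[0]
            vmean = vel[members].mean(axis=0)
            sign = 1.0 if rng.random() < 0.5 else -1.0
            ca, sa = np.cos(sign * alpha), np.sin(sign * alpha)
            rel = vel[members] - vmean
            vel[members] = vmean + np.column_stack(
                (ca * rel[:, 0] - sa * rel[:, 1],
                 sa * rel[:, 0] + ca * rel[:, 1]))
    return vel

if __name__ == "__main__":
    v = mpc_relaxation()
    # For a 2D Maxwell-Boltzmann distribution each velocity component is Gaussian;
    # with <v^2> = 1 conserved, var(v_x) should approach 1/2 and the excess
    # kurtosis of v_x should approach 0 (it starts near -1.5 for unit-speed particles).
    print("var(v_x) =", v[:, 0].var(), " var(v_y) =", v[:, 1].var())
    print("excess kurtosis of v_x =",
          (v[:, 0] ** 4).mean() / v[:, 0].var() ** 2 - 3.0)
```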

The general problem is then to estimate θ and u knowing the values of the measurements, y, and the probability distribution function of ε (the measurement error). If P(ε) is the error distribution, then y will be distributed according to P(y − x(θ, u)). Thus, according to Bayes' theorem (Albuquerque and Biegler, 1996), the posterior... [Pg.197]
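
In generic notation, the Bayesian construction being described reads roughly as follows (a sketch assuming an additive-error model y = x(θ, u) + ε; the source's own equation is not reproduced).

```latex
% Posterior density of the parameters \theta and unmeasured variables u
% given the measurements y, for an additive error with density P_epsilon
\[
  P(\theta, u \mid y) =
  \frac{P_{\varepsilon}\bigl(y - x(\theta, u)\bigr)\,P(\theta, u)}
       {\displaystyle\int P_{\varepsilon}\bigl(y - x(\theta', u')\bigr)\,
        P(\theta', u')\,d\theta'\,du'} .
\]
```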

P. W. Ayers, S. Golden, and M. Levy, Generalizations of the Hohenberg-Kohn theorem I. Legendre transform constructions of variational principles for density matrices and electron distribution functions, J. Chem. Phys. 124, 054101 (2006). [Pg.480]

The Boltzmann H-theorem generalizes the condition that, with the state of a system represented by its distribution function f, a quantity H, defined as the statistical average of ln f, approaches a minimum when equilibrium is reached. This conforms to the Boltzmann hypothesis of distribution in the above in that S = −kH accounts for equilibrium as a consequence of collisions which change the distribution toward that of equilibrium conditions. [Pg.581]

The interesting thing to note is that G(ω) is none other than the power spectrum of the time-correlation function (see Eq. (144)). Bochner's theorem gives us reason to regard the power spectrum as a probability distribution function. The same conclusion applies to the memory functions corresponding to C(t). [Pg.55]

From Bochner's theorem it is seen that power spectra are everywhere positive and bounded, and furthermore, time-correlation functions have power spectra that can be regarded as probability distribution functions. [Pg.57]
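
A small numerical check of both statements (a sketch assuming a model exponential correlation function, not anything from the excerpted sources): the power spectrum obtained from a stationary time-correlation function by Fourier transformation is non-negative and, once normalized, can be read as a probability distribution over frequency.

```python
import numpy as np

def power_spectrum(corr, dt):
    """Power spectrum G(omega) of a real, even time-correlation function C(t).

    Wiener-Khinchin: G is the Fourier transform of C; for a valid correlation
    function, Bochner's theorem guarantees G(omega) >= 0 everywhere.
    """
    # Build the two-sided, even extension of C(t) before transforming
    two_sided = np.concatenate((corr, corr[-2:0:-1]))
    g = np.fft.rfft(two_sided).real * dt
    omega = 2 * np.pi * np.fft.rfftfreq(two_sided.size, d=dt)
    return omega, g

if __name__ == "__main__":
    dt, tau = 0.01, 1.0
    t = np.arange(0, 20, dt)
    corr = np.exp(-t / tau)                    # model correlation function C(t)
    omega, g = power_spectrum(corr, dt)
    print("min G(omega):", g.min())            # should be >= 0 (up to truncation error)
    weights = g / np.trapz(g, omega)           # normalized: a probability density in omega
    print("normalization:", np.trapz(weights, omega))
```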





