Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Most probable distribution concept

Theoretical efforts a step beyond simply fitting standard statistical curves to fragment size distribution data have involved applications of geometric statistical concepts, i.e., the random partitioning of lines, areas, or volumes into the most probable distribution of sizes. The one-dimensional problem is reasonably straightforward and has been discussed by numerous authors... [Pg.295]
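The one-dimensional case mentioned above can be sketched numerically: dropping random cut points on a unit line and collecting the fragment lengths reproduces the most probable (near-exponential) size distribution. This is a minimal illustrative simulation, not code from the source; the seed and break count are arbitrary choices.

```python
import random

def fragment_line(n_breaks, seed=0):
    """Randomly partition the unit interval with n_breaks cut points
    and return the fragment lengths (the 1-D fragmentation problem)."""
    rng = random.Random(seed)
    cuts = sorted(rng.random() for _ in range(n_breaks))
    points = [0.0] + cuts + [1.0]
    return [b - a for a, b in zip(points, points[1:])]

frags = fragment_line(9999)          # 9999 breaks -> 10000 fragments
mean_len = sum(frags) / len(frags)   # exactly 1/(n_breaks + 1)
# Fraction of fragments longer than the mean; for the most probable
# (near-exponential) size distribution this approaches 1/e ~ 0.368
long_frac = sum(1 for f in frags if f > mean_len) / len(frags)
```

The fraction of fragments exceeding the mean length tending to 1/e is the signature of the exponential form that geometric statistics predicts for random line partitioning.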

The linear response function [3], R(r, r′) = [δρ(r)/δv(r′)]_N, is used to study the effect of varying v(r) at constant N. If the system is acted upon by a weak electric field, the polarizability (α) may be used as a measure of the corresponding response. A minimum polarizability principle [17] may be stated as: the natural direction of evolution of any system is towards a state of minimum polarizability. Another important principle is that of maximum entropy [18], which states that the most probable distribution is associated with the maximum value of the Shannon entropy of information theory. Attempts have been made to provide formal proofs of these principles [19-21]. The application of these concepts and related principles, vis-à-vis their validity, has been studied in the contexts of molecular vibrations and internal rotations [22], chemical reactions [23], hydrogen-bonded complexes [24], electronic excitations [25], ion-atom collisions [26], atom-field interactions [27], chaotic ionization [28], conservation of orbital symmetry [29], atomic shell structure [30], solvent effects [31], confined systems [32], electric field effects [33], and toxicity [34]. In the present chapter, we will restrict ourselves mostly to the work done by us. For an elegant review which showcases the contributions from active researchers in the field, see [4]. Atomic units are used throughout this chapter unless otherwise specified. [Pg.270]
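The maximum entropy principle quoted above can be checked directly in a toy setting: among all distributions over a few energy levels with a fixed mean energy, the one maximizing the Shannon entropy is the exponential (Boltzmann) one. The three levels and the mean energy U = 0.7 below are assumed illustrative values, not from the source.

```python
import math

# Three equally spaced levels (energies 0, 1, 2) and a fixed mean
# energy U; scan all distributions satisfying the two constraints
# (normalization and mean energy) and locate the entropy maximum.
E = [0.0, 1.0, 2.0]
U = 0.7

best_S, best_p = -1.0, None
steps = 200000
for k in range(1, steps):
    p2 = (U / 2) * k / steps        # p2 sweeps its feasible range (0, U/2)
    p1 = U - 2 * p2                 # fixed by the mean-energy constraint
    p0 = 1.0 - p1 - p2              # fixed by normalization
    if p0 <= 0 or p1 <= 0:
        continue
    S = -(p0 * math.log(p0) + p1 * math.log(p1) + p2 * math.log(p2))
    if S > best_S:
        best_S, best_p = S, (p0, p1, p2)

p0, p1, p2 = best_p
# At the maximum the occupations form a geometric (Boltzmann)
# progression: p1/p0 == p2/p1, i.e. p1**2 == p0*p2.
```

The geometric-progression check is exactly the statement that the most probable distribution is exponential in the energy, i.e. Boltzmann-like.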

To justify the concept of the existence of a most probable distribution of microstates overwhelming all other possible distributions [i.e., all other distributions consistent with the constraints posed by Eqs. (2.7)], a treatment originally due to Darwin and Fowler may be employed [18]. It starts by defining the zero of the energy scale and the energy unit such that the discrete energy spectrum is represented by a set of nonnegative... [Pg.44]
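The "overwhelming" character of the most probable distribution can be seen by brute force in a small model: enumerate every occupation set of three levels consistent with fixed particle number and total energy, compute each multiplicity W = N!/(n0! n1! n2!), and observe that configurations near the most probable one carry nearly all the statistical weight. The particle number and energy (60 each) are assumed demonstration values.

```python
from math import factorial

def multiplicities(N, E_total):
    """All occupation sets (n0, n1, n2) of three levels with energies
    0, 1, 2 at fixed particle number N and total energy E_total,
    each paired with its multiplicity W = N! / (n0! n1! n2!)."""
    out = []
    for n2 in range(E_total // 2 + 1):
        n1 = E_total - 2 * n2          # fixed by the energy constraint
        n0 = N - n1 - n2               # fixed by the particle constraint
        if n1 >= 0 and n0 >= 0:
            W = factorial(N) // (factorial(n0) * factorial(n1) * factorial(n2))
            out.append(((n0, n1, n2), W))
    return out

dists = multiplicities(60, 60)
W_total = sum(W for _, W in dists)
most_probable, W_max = max(dists, key=lambda item: item[1])
# Weight carried by configurations close to the most probable one
near = sum(W for (n0, n1, n2), W in dists if abs(n2 - most_probable[2]) <= 6)
share = near / W_total
```

Even at N = 60 the narrow band around the most probable occupation set holds essentially all configurations; the dominance sharpens rapidly as N grows toward macroscopic values, which is the content of the Darwin–Fowler argument.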

In Chapter 2, we developed statistical thermodynamics as the central theory that enables us in principle to calculate thermophysical properties of macroscopic confined fluids. A key feature of statistical thermodynamics is an enormous reduction of information that takes place as one goes from the microscopic world of electrons, photons, atoms, or molecules to the macroscopic world at which one performs measurements of thermophysical properties of interest. This information reduction is effected by statistical concepts such as the most probable distribution of quantum states (see Section 2.2.1). [Pg.95]

The third problem is like the confusion caused in MT by maintaining the concept of the Ether. Most practitioners of QM think about microscopic systems in terms of the principles of QM: probability distributions, the superposition principle, uncertainty relations, the complementarity principle, the correspondence principle, wave function collapse. These principles are an approximate summary of what QM really is, and following them without checking whether the Schrödinger equation actually confirms them does lead to error. [Pg.26]

In practice, decision makers typically are risk averse, and the expected value approach does not take into account the variability of the solutions obtained under the probability distributions or scenarios considered for the uncertain parameters. Rosenhead et al. (1972) introduced the aspect of robustness as a criterion for strategic planning to address this issue. Building on the notion of robustness, Mulvey et al. (1995) developed the concept of robust optimization, distinguishing between two different types of robust models. A model is solution robust if the solution obtained remains close to optimality for any realization of the uncertain parameters. The model itself is robust if it remains (almost) feasible for any realization of the uncertain parameters (model robust). Here, only solution robustness is of interest, as the most important elements of uncertainty in production network design, namely demand volumes, costs, prices, and exchange rates, should not lead to infeasibility problems under the different scenarios considered. [Pg.117]

The concepts of equilibrium as the most probable state of a very large system, the size of fluctuations about that most probable state, and entropy (randomness) as a driving force in chemical reactions, are very useful and not that difficult. We develop the Boltzmann distribution and use this concept in a variety of applications. [Pg.228]
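The Boltzmann distribution developed in the passage above has a compact computational form: populations proportional to exp(-E_i/kT), normalized by the partition function. A minimal sketch (the three-level system with level spacing equal to kT is an assumed example):

```python
import math

def boltzmann(energies, kT):
    """Boltzmann populations p_i = exp(-E_i/kT)/Z for states with the
    given energies (energies and kT in the same units)."""
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)                  # partition function
    return [w / Z for w in weights]

# Hypothetical three-level system, level spacing equal to kT
p = boltzmann([0.0, 1.0, 2.0], kT=1.0)
```

Populations decrease monotonically with energy, and the ratio of adjacent populations is the familiar Boltzmann factor exp(-ΔE/kT).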

At last, we can resolve the paradox between de Broglie waves and classical orbits, which started our discussion of indeterminacy. The indeterminacy principle places a fundamental limit on the precision with which the position and momentum of a particle can be known simultaneously. It has profound significance for how we think about the motion of particles. According to classical physics, the position and momentum are fully known simultaneously; indeed, we must know both to describe the classical trajectory of a particle. The indeterminacy principle forces us to abandon the classical concepts of trajectory and orbit. The most detailed information we can possibly know is the statistical spread in position and momentum allowed by the indeterminacy principle. In quantum mechanics, we think not about particle trajectories, but rather about the probability distribution for finding the particle at a specific location. [Pg.140]
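The limiting statistical spread described above can be verified numerically for the state that saturates it: a Gaussian wavepacket gives Δx·Δp = ħ/2 exactly. The sketch below evaluates ⟨x²⟩ and ⟨p²⟩ = ħ²∫|ψ′|²dx on a grid; the units (ħ = 1), width, and grid parameters are assumed choices for illustration.

```python
import math

# Minimum-indeterminacy check for a Gaussian wavepacket on a grid
# (hbar = 1, width sigma = 1; grid spacing and range are assumptions).
hbar, sigma, dx = 1.0, 1.0, 0.01
xs = [i * dx for i in range(-1000, 1001)]
psi = [math.exp(-x * x / (4 * sigma ** 2)) for x in xs]
norm = math.sqrt(sum(f * f for f in psi) * dx)
psi = [f / norm for f in psi]                       # normalize on the grid

x2 = sum(x * x * f * f for x, f in zip(xs, psi)) * dx      # <x^2> (<x> = 0)
dpsi = [(psi[i + 1] - psi[i - 1]) / (2 * dx) for i in range(1, len(psi) - 1)]
p2 = hbar ** 2 * sum(d * d for d in dpsi) * dx             # <p^2> (<p> = 0)
product = math.sqrt(x2 * p2)                               # delta_x * delta_p
```

Up to discretization error the product equals ħ/2 = 0.5, the minimum the indeterminacy principle allows; any non-Gaussian packet gives a strictly larger value.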

Strictly speaking, independence of the probability distributions derived from two states implies an absence of interaction between the systems or subsystems described by these two states. In practice, however, the great success of models which combine the concept of independence with a non-negligible interaction shows that, except for a relatively small correlation error, on average two systems may show independent behaviour even if they interact rather strongly. The most obvious example of this is the Hartree-Fock method. [Pg.190]

Some probability distribution functions occur frequently in nature, and have simple mathematical expressions. Two of the most useful ones are the binomial and multinomial distribution functions. These will be the basis for our development of the concept of entropy in Chapter 2. The binomial distribution describes processes in which each independent elementary event has two mutually exclusive outcomes such as heads/tails, yes/no, up/down, or occupied/vacant. Independent trials with two such possible outcomes are called Bernoulli trials. Let's label the two possible outcomes θ and τ. Let the probability of θ be p. Then the probability of τ is 1 - p. We choose composite events that are pairs of Bernoulli trials. The probability of θ followed by τ is p_θτ = p(1 - p). The probabilities of the four possible composite events are... [Pg.15]
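The four composite events from a pair of Bernoulli trials can be enumerated directly. A minimal sketch, with p = 0.3 as an assumed illustrative value:

```python
# Composite events from two Bernoulli trials with outcomes labeled
# theta/tau; p = 0.3 is an assumed illustrative value.
p = 0.3
q = 1 - p
composites = {
    "θθ": p * p,
    "θτ": p * q,
    "τθ": q * p,
    "ττ": q * q,
}
total = sum(composites.values())   # the four cases exhaust all outcomes
```

Because the trials are independent, each composite probability is a simple product, and the four probabilities sum to (p + q)² = 1.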

Due to the random and intricate nature of the percolation structures, streamtubes of the liquid flow divide and rejoin repeatedly at each intersection point. This results in a liquid flow distribution which is controlled by a stochastic process. It may thus be analyzed in terms of maximum entropy. This concept is indeed a very elegant route for the estimation of the most probable configuration of a stochastic process. It assumes that this most probable configuration is obtained when the entropy of the process is a maximum. For the case we are interested in, the entropy is essentially configurational. It corresponds to the number of different flow configurations that may be adopted to achieve a liquid flow distribution. [Pg.559]
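The configurational entropy invoked above counts the number W of distinct flow configurations compatible with a given liquid distribution; for large numbers, ln W reduces to N times the Shannon entropy of the occupancy fractions (Stirling's approximation). The streamtube occupancies below are assumed demonstration numbers, not data from the source.

```python
import math

def ln_W(counts):
    """ln of the number of distinct configurations (the multinomial
    coefficient N!/(n1!...nk!)) with the given occupancies."""
    N = sum(counts)
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)

# Hypothetical streamtube occupancies over three outlets (assumed)
counts = [500, 300, 200]
N = sum(counts)
shannon = -sum((n / N) * math.log(n / N) for n in counts)
ratio = ln_W(counts) / (N * shannon)   # Stirling: ln W ~ N * Shannon entropy
```

Maximizing this configurational entropy over feasible occupancies is what singles out the most probable flow configuration in the maximum entropy treatment.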

For both independence and finite variance of the involved random variables, the central limit theorem holds: the probability distribution of their sum gradually converges to the Gaussian shape. If the conditions of independence and finite variance of the random variables are not satisfied, other limit theorems must be considered. The study of limit theorems uses the concept of the basin of attraction of a probability distribution. All the probability density functions define a functional space. The Gaussian probability function is a fixed point attractor of stochastic processes in that functional space. The set of probability density functions that fulfill the requirements of the central limit theorem with independence and finite variance of random variables constitutes the basin of attraction of the Gaussian distribution. The Gaussian attractor is the most important attractor in the functional space, but other attractors also exist. [Pg.15]
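The convergence to the Gaussian attractor is easy to observe empirically: standardized sums of independent uniform variables already look Gaussian at modest n. The sample sizes and seed below are arbitrary demonstration choices.

```python
import random

rng = random.Random(42)
n, trials = 50, 20000
# Standardized sums of n independent uniform(0, 1) variables
mu, sd = n * 0.5, (n / 12) ** 0.5      # exact mean and std of the sum
z = [(sum(rng.random() for _ in range(n)) - mu) / sd for _ in range(trials)]
# For a Gaussian, about 68.3% of the mass lies within one standard deviation
frac = sum(1 for v in z if abs(v) <= 1) / trials
```

The uniform distribution has finite variance, so it sits inside the Gaussian basin of attraction; a heavy-tailed choice (e.g. Cauchy-distributed summands) would converge to a different, non-Gaussian attractor instead.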


See other pages where Most probable distribution concept is mentioned: [Pg.83]    [Pg.580]    [Pg.16]    [Pg.314]    [Pg.14]    [Pg.83]    [Pg.12]    [Pg.144]    [Pg.27]    [Pg.114]    [Pg.296]    [Pg.36]    [Pg.143]    [Pg.487]    [Pg.101]    [Pg.340]    [Pg.189]    [Pg.610]    [Pg.140]    [Pg.164]    [Pg.210]    [Pg.15]    [Pg.11]    [Pg.163]    [Pg.8]    [Pg.301]    [Pg.762]    [Pg.15]    [Pg.3377]    [Pg.118]    [Pg.323]    [Pg.14]    [Pg.247]    [Pg.250]    [Pg.578]   
See also in source #XX -- [Pg.40]







Most probable

Most probable distribution

Probability concept

Probability distributions

© 2024 chempedia.info