Big Chemical Encyclopedia


Conditional probabilities

The conditional probability of event B given A is denoted by P(B|A) and defined as [Pg.548]

For example, consider the random experiment of drawing two cards in succession from a deck of 52 cards. Suppose the cards are drawn without replacement (i.e., the first card drawn is not replaced before the second is drawn). Let A denote the event that the first card is an ace and B the event that the second card is an ace. The sample space S can be described as a set of 52 × 51 pairs of cards. Assuming that each of these (52)(51) pairs has the same theoretical relative frequency, assign probability 1/(52)(51) to each pair. The number of pairs featuring an ace as the first and second card is (4)(3). Therefore, [Pg.548]

The number of pairs featuring an ace as the first card and one of the other 51 cards as the second is (4)(51). Therefore, [Pg.548]
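These counts can be checked directly. The short script below (an illustrative addition, not part of the original text) computes P(A), P(A ∩ B), and the conditional probability P(B|A) from the pair counts given above:

```python
from fractions import Fraction

# Sample space: ordered pairs of distinct cards, (52)(51) equally likely outcomes.
pairs_total = 52 * 51

# A and B together: ace first AND ace second -> (4)(3) pairs.
p_a_and_b = Fraction(4 * 3, pairs_total)

# A: ace first, any of the remaining 51 cards second -> (4)(51) pairs.
p_a = Fraction(4 * 51, pairs_total)

# Conditional probability P(B|A) = P(A and B) / P(A).
p_b_given_a = p_a_and_b / p_a

print(p_a)          # 1/13
print(p_b_given_a)  # 1/17  (i.e., 3/51)
```

As expected, P(B|A) = 3/51: with one ace removed, 3 aces remain among 51 cards.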

Conditional probability also can be used to formulate a definition for the independence of two events A and B. Event B is defined to be independent of event A if and only if [Pg.549]

Similarly, event A is defined to be independent of event B if and only if [Pg.549]
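A quick numeric illustration of this definition (a hypothetical single-card example, not taken from the source): for independent events, conditioning changes nothing, so P(B|A) = P(B) and P(A|B) = P(A).

```python
from fractions import Fraction

# Hypothetical example: draw one card; A = "card is a heart", B = "card is an ace".
p_a = Fraction(13, 52)        # P(A)
p_b = Fraction(4, 52)         # P(B)
p_a_and_b = Fraction(1, 52)   # P(A and B): the ace of hearts

p_b_given_a = p_a_and_b / p_a  # P(B|A)
p_a_given_b = p_a_and_b / p_b  # P(A|B)

# B is independent of A iff P(B|A) = P(B), and symmetrically for A.
print(p_b_given_a == p_b)  # True
print(p_a_given_b == p_a)  # True
```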

Applying the definition of conditional probability, Eq. (19.5.1), yields [Pg.548]


The probability of an event occurring given that some event has already happened is called the conditional probability  [Pg.545]

This is simple to illustrate with an example. From the sun and rain example: what is the probability that it will rain on the second day, given that it rained on the first day? This follows from the above equation, using the joint and marginal distributions. [Pg.545]
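The source does not give the numbers for the sun and rain example, so the joint distribution below is invented purely for illustration; the calculation pattern, conditional = joint / marginal, is the point.

```python
# Hypothetical joint distribution for weather on two consecutive days
# (the numbers are illustrative; the source does not give them).
joint = {
    ("rain", "rain"): 0.3,
    ("rain", "sun"):  0.2,
    ("sun", "rain"):  0.1,
    ("sun", "sun"):   0.4,
}

# Marginal probability that it rains on day 1.
p_rain_day1 = sum(p for (d1, _), p in joint.items() if d1 == "rain")

# P(rain on day 2 | rain on day 1) = P(rain, rain) / P(rain on day 1).
p_rain2_given_rain1 = joint[("rain", "rain")] / p_rain_day1

print(p_rain_day1)          # 0.5
print(p_rain2_given_rain1)  # 0.6
```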


The main point of our elaboration is that the Gibbs measure (4) of the potential lattice of interest can be considered as a nontrivial prior in the Bayes formula for the conditional probability, applied to the problem of image restoration ... [Pg.114]

In the Maximum Entropy Method (MEM), which proceeds by maximizing the conditional probability P(f|p) (6) to yield the most probable solution, the probability P(p) introducing the a priori knowledge is, in many image-restoration applications, derived from so-called ergodic situations [1]. That means that the a priori probabilities of all microscopic configurations p are the same. This leads to the well-known form of the functional S(p) [9] ... [Pg.115]

Here g(r) = G(r) + 1 is called a radial distribution function, since n g(r) is the conditional probability that a particle will be found at r if there is another at the origin. For strongly interacting systems, one can also introduce the potential of the mean force w(r) through the relation g(r) = exp(-βw(r)). Both g(r) and w(r) are also functions of temperature T and density n... [Pg.422]
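Inverting the relation g(r) = exp(-βw(r)) gives w(r) = -kT ln g(r). A minimal sketch of that inversion, using made-up g(r) values rather than data from the text:

```python
import math

# Sketch: recover the potential of mean force w(r) from a radial
# distribution function via g(r) = exp(-beta * w(r)), i.e. w(r) = -kT ln g(r).
# The g(r) values below are invented illustrative numbers.
kT = 1.0  # work in units of kT

g_of_r = {1.0: 2.5, 1.5: 0.8, 2.0: 1.1, 3.0: 1.0}

w_of_r = {r: -kT * math.log(g) for r, g in g_of_r.items()}

for r, w in sorted(w_of_r.items()):
    # w(r) < 0 where g(r) > 1 (effective attraction); w -> 0 as g -> 1.
    print(f"r = {r:.1f}  w(r) = {w:+.3f} kT")
```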

The effective field is determined by assuming that the conditional probabilities are the same, i.e. [Pg.518]

By expressing the mean-field interaction of an electron at r with the N-1 other electrons in terms of a probability density ρ(r′) that is independent of the fact that another electron resides at r, the mean-field models ignore spatial correlations among the electrons. In reality, as shown in figure B3.1.5, the conditional probability density for finding one of the N-1 electrons at r′, given that one electron is at r, depends on r. The absence of a spatial correlation is a direct consequence of the spin-orbital product nature of the mean-field wavefunctions... [Pg.2163]

To exemplify both aspects of the formalism and for illustration purposes, we divide the present manuscript into two major parts. We start with calculations of trajectories using an approximate solution of the atomically detailed equations (approach B). We then proceed to derive the equations for the conditional probability from which a rate constant can be extracted. We end with a simple numerical example of trajectory optimization. More complex problems are (and will be) discussed elsewhere [7]. [Pg.264]

We recently received a preprint from Dellago et al. [9] that proposed an algorithm for path sampling, which is based on the Langevin equation (and is therefore in the spirit of approach (A) [8]). They further derive formulas to compute rate constants that are based on correlation functions. Their method of computing rate constants is an alternative approach to the formula for the state conditional probability derived in the present manuscript. [Pg.265]

A related algorithm can also be written for the Brownian trajectory [10]. However, the essential difference between an algorithm for a Brownian trajectory and equation (4) is that the Brownian algorithm is not deterministic. Because of the random force, we cannot be satisfied with a single trajectory, even with pre-specified coordinates (and velocities, if relevant). It is necessary to generate an ensemble of trajectories (sampled with different values of the random force) to obtain a complete picture. Instead of working with an ensemble of trajectories, we prefer to work with the conditional probability; i.e., we ask what is the probability that a trajectory being at... [Pg.266]

The definition of the above conditional probability for the case of Brownian trajectories can be found in textbooks [12]. However, the definition of the conditional probability for Newton's equations of motion is subtler than that. [Pg.268]

Consider a numerical solution of Newton's differential equation with a finite time step Δt. In principle, since Newton's equations of motion are deterministic, the conditional probability should be a delta function... [Pg.268]

The initial energy E(X0, V0) is a function of the coordinates and the velocities. In principle, the use of momenta (instead of velocities) is more precise; however, we are using only Cartesian coordinates, making the two interchangeable. We need to sample many paths to compute ensemble averages. Perhaps the most direct observable that can be computed (and measured experimentally) is the state conditional probability P(A0|B, t), defined below ... [Pg.275]

This is the conditional probability that the system which was in state A at time zero will be in state B at time t. Note that we use the normalized conditional probability since the trajectory must end either at A or at B. [Pg.275]

The above phenomenological equations are assumed to hold in our system as well (after appropriate averaging). Below we derive formulas for P(A0|B, t) that start from a microscopic model, which makes it possible to compare the same quantity with the above phenomenological equation. We also note that the formulas below are, in principle, exact. Therefore tests of the existence of a rate constant and of the validity of the above model can be made. We rewrite the state conditional probability with the help of a step function HB(X): HB(X) is zero when X is in A and one when X is in B. [Pg.277]

The state conditional probability is therefore written in terms of the computable, average step function ... [Pg.277]

A single calculation of the discrete path integral with a fixed length of time t can be employed to compute the state conditional probability at many other times. It is possible to use segments of the path of time length Δt, 2Δt, ..., NΔt sampled in trajectories of total length NΔt and to compute the corresponding state conditional probabilities. The result of the calculations will make it possible to explore the exponential relaxation of P(A0|B, t) for times between 0 and t. [Pg.278]
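The idea of estimating a state conditional probability as the ensemble average of the step function can be sketched with a toy stochastic two-state model. This is not the paper's path-integral method; the hopping probabilities below are invented for illustration, and the average of H_B over trajectories started in A plays the role of P(A0|B, t).

```python
import random

# Toy sketch: estimate P(A0|B, t) as the ensemble average of the step
# function H_B over trajectories started in state A.  The per-step
# hopping probabilities are assumed values, chosen only for illustration.
random.seed(0)

P_AB = 0.02  # per-step hopping probability A -> B (assumed)
P_BA = 0.01  # per-step hopping probability B -> A (assumed)

def trajectory(n_steps):
    """Run one stochastic trajectory started in A; return the H_B time series."""
    state = "A"
    h_b = []
    for _ in range(n_steps):
        if state == "A" and random.random() < P_AB:
            state = "B"
        elif state == "B" and random.random() < P_BA:
            state = "A"
        h_b.append(1 if state == "B" else 0)  # H_B(X): 0 in A, 1 in B
    return h_b

n_traj, n_steps = 2000, 200
sums = [0] * n_steps
for _ in range(n_traj):
    for t, h in enumerate(trajectory(n_steps)):
        sums[t] += h

# P(A0|B, t) at a few times: average of H_B over the ensemble of trajectories.
for t in (10, 100, 199):
    print(t, sums[t] / n_traj)
```

Note that one set of trajectories of length NΔt yields the probability at every intermediate time, mirroring the reuse of path segments described above.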

If the above assumption is reasonable, then the modeling of most probable trajectories and of ensembles of trajectories is possible. We further discussed the calculations of the state conditional probability and the connection of the conditional probability to rate constants and phenomenological models. [Pg.279]

It is known that π-allylpalladium acetate is converted into allyl acetate by reductive elimination when it is treated with CO [242,243]. For this reason, the carbonylation of allylic acetates themselves is difficult. The allylic acetate 386 is carbonylated in the presence of NaBr (20-50 mol%) under severe conditions, probably via allylic bromides [244]. However, the carbonylation of 5-phenyl-2,4-pentadienyl acetate (387) was carried out in the presence of Et3N without using NaBr at 100 °C to yield methyl 6-phenyl-3,5-hexadienoate (388) [245]. The dicarbonylation of 1,4-diacetoxy-2-butene to form the 3-hexenedioate also proceeds by using tetrabutylphosphonium chloride as a ligand in 49% yield [246]. [Pg.341]

If the events are not independent, provision must be made for this, so we define a quantity called the conditional probability. For the probability of a head given the prior event of a head, this is written P_h/h, where the first quantity in the subscript is the event under consideration and that following the slash mark is the prior condition. Thus P_t/h is the probability of a tail following... [Pg.454]

If the coin is biased, conditional probabilities must be introduced: P_hhh = P_h/hh P_h/h P_h [Pg.455]
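The chain rule for a biased coin with memory can be evaluated directly. The numerical values below are invented for illustration (the source gives only the symbolic product):

```python
# Sketch of the chain rule P_hhh = P_h/hh * P_h/h * P_h for a biased coin.
# All three probabilities are hypothetical values, chosen for illustration.
P_h = 0.5     # P(head on first toss)
P_h_h = 0.6   # P(head | previous toss was a head)
P_h_hh = 0.7  # P(head | two previous tosses were heads)

# Probability of three heads in a row.
P_hhh = P_h_hh * P_h_h * P_h
print(round(P_hhh, 2))  # 0.21
```

For a fair, memoryless coin all three factors would be 0.5 and the product would reduce to (0.5)^3.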

A similar logic can be applied to copolymers. The story is a bit more complicated to tell, so we only outline the method. If penultimate effects operate, then the probabilities P11, P12, and so on, defined by Eqs. (7.32)-(7.35), should be replaced by conditional probabilities. As a matter of fact, the kind of conditional probabilities needed must be based on the two preceding events. Thus reactions (7.E) and (7.F) are two of the appropriate reactions, and the corresponding probabilities are P11/11 and P11/12. Rather than work out all of the possibilities in detail, we summarize the penultimate model as follows ... [Pg.455]

The probability P11 as given by Eq. (7.32) is replaced by the conditional probability P11/11, which is defined as... [Pg.456]

There are eight of these conditional probabilities, each associated with the reactions described in item (1). [Pg.456]

A mechanism in which the stereochemistry of the growing chain does exert an influence on the addition might exist, but at least two repeat units in the chain are required to define any such stereochemistry. Therefore this possibility is equivalent to the penultimate mechanism in copolymers. In this case the addition would be described in terms of conditional probabilities, just as Eq. (7.49) does for copolymers. Thus the probability of an isotactic triad controlled by the stereochemistry of the growing chain would be represented by the reaction... [Pg.479]

Heat resistance and gas corrosion resistance depend on the chemical and phase compositions and the structure of an alloy. The local corrosion destruction (LCD) of heat-resisting alloys (HRS), especially in a cast condition, is probably determined by sweating of alloying elements. [Pg.437]

Likelihood A measure of the expected frequency with which an event occurs. This may be expressed as a frequency (e.g., events per year), a probability of occurrence during a time interval (e.g., annual probability), or a conditional probability (e.g., probability of occurrence, given that a precursor event has occurred). [Pg.163]

Pratt [43] made the innovative suggestion that transition pathways could be determined by maximizing the cumulative transition probability connecting the known reactant and product states. That is, the most probable transition pathways would be expected to be those with the largest conditional probability. [Pg.213]

Figure 4 Sample spatial restraint in Modeller. A restraint on a given Cα-Cα distance, d, is expressed as a conditional probability density function that depends on two other equivalent distances (d′ = 17.0 and d″ = 23.5): p(d|d′, d″). The restraint (continuous line) is obtained by least-squares fitting a sum of two Gaussian functions to the histogram, which in turn is derived from many triple alignments of protein structures. In practice, more complicated restraints are used that depend on additional information such as similarity between the proteins, solvent accessibility, and distance from a gap in the alignment.
The first rule states that the probability of A plus the probability of not-A is equal to 1. The second rule states that the probability for the occurrence of two events is related to the probability of one of the events occurring multiplied by the conditional probability of the other event given the occurrence of the first event. We can drop the notation of conditioning on I as long as it is understood implicitly that all probabilities are conditional on the information we possess about the system. Dropping the I, we have the usual expression of Bayes' rule. [Pg.315]
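Both rules can be exercised in a small worked example of Bayes' rule. The diagnostic-test numbers below are hypothetical, chosen only to make the arithmetic concrete:

```python
from fractions import Fraction

# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B), with P(B) expanded over A and not-A.
# Hypothetical diagnostic-test numbers, invented for illustration.
p_a = Fraction(1, 100)              # prior P(A)
p_b_given_a = Fraction(95, 100)     # P(B|A)
p_b_given_not_a = Fraction(5, 100)  # P(B|not-A)

# First rule: P(A) + P(not-A) = 1.
p_not_a = 1 - p_a

# Second rule (total probability): P(B) = P(B|A)P(A) + P(B|not-A)P(not-A).
p_b = p_b_given_a * p_a + p_b_given_not_a * p_not_a

p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # 19/118
```

Even with a 95% true-positive rate, the small prior keeps the posterior P(A|B) near 16%, which is the classic base-rate effect the rule captures.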

GE Arnold, AK Dunker, SJ Johns, RJ Douthart. Use of conditional probabilities for determining relationships between amino acid sequence and protein secondary structure. Proteins 12:382-399, 1992. [Pg.348]

Pof = conditional probability that, given there is a fault, the failure will actually occur. [Pg.67]

Consider an experiment in which, in N trials, events A and B occur together N(A∩B) times, and event B occurs N(B) times. The conditional probability of A given B is equation 2.4-1 or 2.4-2. Rearranging results in equation 2.4-3. [Pg.41]
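The frequency-count definition can be demonstrated by simulation. The die example below is invented here (not from the source): A = "roll is even", B = "roll > 3", for which the exact conditional probability is 2/3.

```python
import random

# Frequency-count illustration of P(A|B) ~ N(A and B) / N(B), using a
# simulated fair die: A = "roll is even", B = "roll > 3".
random.seed(1)
N = 100_000
n_b = n_ab = 0
for _ in range(N):
    roll = random.randint(1, 6)
    b = roll > 3           # event B occurred
    a = roll % 2 == 0      # event A occurred
    n_b += b
    n_ab += a and b

print(n_ab / n_b)  # close to the exact value 2/3
```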

If the failure distribution of a component is exponential, the conditional probability of observing exactly M failures in test time t, given a true (but unknown) failure rate λ and a Poisson distribution, is equation 2.6-9. The continuous form of Bayes's equation is equation... [Pg.52]
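The Poisson likelihood referred to here is P(M | λ, t) = (λt)^M e^(-λt) / M!. A minimal sketch, with the failure rate and test time chosen as illustrative assumptions:

```python
import math

# Poisson likelihood: conditional probability of observing exactly m
# failures in test time t, given failure rate lam:
#   P(m | lam, t) = (lam*t)**m * exp(-lam*t) / m!
def poisson_likelihood(m, lam, t):
    mu = lam * t  # expected number of failures
    return mu**m * math.exp(-mu) / math.factorial(m)

# Illustrative (assumed) numbers: lam = 1e-3 failures/hour over t = 1000 hours,
# so the expected number of failures is exactly 1.
for m in range(4):
    print(m, round(poisson_likelihood(m, 1e-3, 1000.0), 4))
```

With λt = 1, observing 0 and 1 failures are equally likely (each e^(-1) ≈ 0.368), and the probabilities over all m sum to 1.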

Waller (NUREG/CR-4314) provides a concise review of USC with 143 references and relevant papers. He quotes Evans (1975) that no new theory is needed to analyze system dependencies; the problem persists because the conditional probabilities are not known. Except for the bounding method used in WASH-1400, the other two methods presented below can be shown to be derivable from the theory of Marshall and Olkin (1967). Waller reviews methods other than the three presented here; they are not presented because they offer no physical insight into the problem. [Pg.125]

