Big Chemical Encyclopedia


Bayesian probability approach

Bayesian probability theory and methods based on fuzzy-set theory. The principles of both theories are explained in Chapter 16 and Chapter 19, respectively. Both approaches have advantages and disadvantages for use in expert systems, and it must be emphasized that none of the methods developed up to now is satisfactory [7,11]. [Pg.640]

Bayesian probability theory [157] can also be applied to the problem of NMR parameter estimation; this approach incorporates prior knowledge of the NMR parameters and is particularly useful at short acquisition times [158] and when the FID contains few data points [159]. Bayesian analysis gives more precise estimates of the NMR parameters than do methods based on the discrete Fourier transform (DFT) [160]. The amplitudes can be estimated independently of the phase, frequency, and decay constants of the resonances [161]. For the usual method of quadrature detection, it is appropriate to apply this technique to the two quadrature signals in the time domain [162-164]. [Pg.114]
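As an illustrative sketch only (not drawn from the cited work), the core of Bayesian amplitude estimation for a single resonance can be reduced to a few lines: with a flat prior and Gaussian noise, the posterior mean of the amplitude is the least-squares projection of the FID onto the model function. The frequency, decay constant, and all numerical values below are hypothetical:

```python
import numpy as np

def estimate_amplitude(t, d, omega, decay):
    """Posterior-mean amplitude for a single decaying cosine.

    With a flat prior and Gaussian noise, the posterior mean of A in
    d(t) = A*cos(omega*t)*exp(-t/decay) + noise is the projection of
    the data onto the (assumed known) model function.
    """
    g = np.cos(omega * t) * np.exp(-t / decay)
    return float(np.dot(d, g) / np.dot(g, g))

# Synthetic FID: short acquisition window, few data points (all values hypothetical)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.05, 64)                 # 50 ms acquisition
true_A, omega, decay = 2.0, 2 * np.pi * 100.0, 0.02
fid = true_A * np.cos(omega * t) * np.exp(-t / decay)
fid += 0.05 * rng.standard_normal(t.size)      # measurement noise

A_hat = estimate_amplitude(t, fid, omega, decay)
```

Note that the amplitude estimate requires no Fourier transform of the data, in line with the text's point that amplitudes can be estimated independently of the other resonance parameters.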

In this paper we presented an approach in which fuzzy logic and BBN concepts are combined to estimate human error probability. This combination leads to a fuzzy Bayesian network approach based on the concept of fuzzy numbers and on extension principles applied to the calculation of discrete fuzzy probabilities. [Pg.256]

ABSTRACT: Residual reliability indices of particular and structural members of existing structures subjected to extreme service and climate actions are considered. Time-dependent structural safety margins of particular members and their modifications as stochastic finite sequences are discussed. The primary and revised instantaneous and long-term survival probabilities of members exposed to one and two extreme action effects are analyzed. The revised survival probability prediction of members during their residual service life is based on the concepts of truncated resistance distributions and Bayesian statistical approaches. The calculation of revised reliability indices of members is demonstrated by a numerical example. [Pg.1370]

Note at this juncture how Bayesian probability and logical probability approach each other under the boundary conditions of near falsification or near verification. [Pg.48]

In the history of mathematics, uncertainty was approached in the 17th century by Pascal and Fermat, who introduced the notion of probability. However, probabilities do not allow one to process subjective beliefs or imprecise or vague knowledge, such as in computer modeling of three-dimensional structure. Subjectivity and imprecision were only considered from 1965, when Zadeh, known for his work in systems theory, introduced the notion of the fuzzy set. The concept of fuzziness introduces partial membership to classes, admitting intermediary situations between no and full membership. Zadeh's theory of possibility, introduced in 1977, constitutes a framework allowing for the representation of such uncertain concepts of non-probabilistic nature (9). The concept of the fuzzy set allows one to consider imprecision and uncertainty in a single formalism and to quantitatively measure the preference of one hypothesis versus another. Note, however, that Bayesian probabilities could have been used instead. [Pg.398]
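A minimal sketch of Zadeh's idea of partial membership, using a hypothetical triangular membership function (the "warm temperature" set and all values below are purely illustrative):

```python
def triangular_membership(x, a, b, c):
    """Degree of membership in a triangular fuzzy set with support [a, c]
    and full membership only at the peak b."""
    if x <= a or x >= c:
        return 0.0            # no membership outside the support
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# "Warm" temperatures as a fuzzy set peaking at 25 degrees C
mu_partial = triangular_membership(20.0, 15.0, 25.0, 35.0)  # intermediate membership
mu_full = triangular_membership(25.0, 15.0, 25.0, 35.0)     # full membership
mu_none = triangular_membership(40.0, 15.0, 25.0, 35.0)     # no membership
```

A crisp set would only allow 0 or 1; the fuzzy set admits every value in between, which is exactly the "intermediary situations" the excerpt describes.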

Temperature and residence time were varied in an initial half-factorial DoE to estimate the fitted reaction parameters. After these four experiments, subsequent experimental conditions were determined by the Bayesian approach, terminating once the Bayesian probability of one of the rate models passed 95%. The results from the fifth and sixth experiments are shown in Figure 4.8, which illustrates that after six experiments only reaction model I matched the experimental results, in agreement with the known rate law for the Diels-Alder reaction. Once the best rate law has been found, further experiments are performed to reduce the size of the joint confidence region of the fitted parameters using the D-optimal approach [57]. [Pg.95]
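The sequential logic described above can be sketched as follows. The two candidate models, the measurement error, and all numbers are hypothetical, not the Diels-Alder data of the cited study; the point is only the mechanics of updating discrete model probabilities and stopping at the 95% threshold:

```python
import numpy as np

def update_model_probs(priors, predictions, observed, sigma):
    """One Bayesian update of discrete model probabilities from a new experiment,
    assuming Gaussian measurement error of standard deviation sigma."""
    liks = np.array([np.exp(-0.5 * ((p - observed) / sigma) ** 2) for p in predictions])
    post = priors * liks
    return post / post.sum()

# Hypothetical stream of experiments: (model-1 prediction, model-2 prediction), observation
experiments = [((0.40, 0.48), 0.41),
               ((0.62, 0.72), 0.63),
               ((0.80, 0.92), 0.81)]

probs = np.array([0.5, 0.5])   # equal priors over the two rate models
history = []
for preds, obs in experiments:
    probs = update_model_probs(probs, preds, obs, sigma=0.05)
    history.append(probs.copy())
    if probs.max() > 0.95:     # terminate once one model passes 95%
        break
```

With these invented numbers, model 1 tracks the observations and its posterior probability climbs past 95% after the third update, at which point experimentation for model discrimination stops.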

Bayesian probability theory The approach to probability theory that views a probability as a measure of our uncertainty of knowledge rather than as a relative frequency of occurrence. [Pg.127]

A major problem in Bayesian probability had been assigning the prior probability. Pioneering work by E. T. Jaynes has shown that there is a unique consistent way to assign prior probabilities, an approach that has become known as the principle of maximum entropy. This principle states that the prior probabilities p1, p2, ..., pn that we associate with the mutually exclusive hypotheses H1, H2, ..., Hn should be those which maximize... [Pg.131]
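A small illustration of the principle: among candidate prior assignments over three mutually exclusive hypotheses, and with no constraint beyond normalization, the uniform assignment maximizes the entropy -Σ pᵢ ln pᵢ. The candidate values below are arbitrary:

```python
import numpy as np

def entropy(p):
    """Shannon entropy -sum p_i ln p_i of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 ln 0 is taken as 0
    return float(-(p * np.log(p)).sum())

# Candidate priors over three mutually exclusive hypotheses H1, H2, H3
candidates = [
    [0.70, 0.20, 0.10],
    [0.50, 0.30, 0.20],
    [1 / 3, 1 / 3, 1 / 3],
]
maxent_prior = max(candidates, key=entropy)   # the uniform assignment wins
```

When additional testable information is available (for instance, a known expectation value), the same principle yields a non-uniform maximum-entropy prior, found by constrained maximization.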

S Greenland. Probability logic and probabilistic induction. Epidemiology 9:322-332, 1998. GM Petersen, G Parmigiani, D Thomas. Missense mutations in disease genes: a Bayesian approach to evaluate causality. Am J Hum Genet 62:1516-1524, 1998. [Pg.345]

Classicists believe that probability has a precise value; the uncertainty is in finding that value. Bayesians believe that probability is not precise but is distributed over a range of values arising from heterogeneities in the database, past histories, construction tolerances, etc. The difference is subtle, but it distinguishes the two approaches. [Pg.50]

The essence of the LST for one-dimensional lattices resides in the fact that an operator T: N → N+1 could be constructed (equation 5.71), mapping N-block probability functions to (N+1)-block probabilities in a manner which satisfies the Kolmogorov consistency conditions (equation 5.68). A sequence of repeated applications of this operator allows us to define a set of Bayesian extended probability functions P_M, M > N, and thus a shift-invariant measure on the set of all one-dimensional configurations, F. Unfortunately, a simple generalization of this procedure to lattices with more than one dimension does not, in general, produce a set of consistent block probability functions. Extensions must instead be made by using some other, approximate, method. We briefly sketch a heuristic outline of one approach below (details are worked out in [guto87b]). [Pg.258]

We take a Bayesian approach to research process modeling, which encourages explicit statements about the prior degree of uncertainty, expressed as a probability distribution over possible outcomes. Simulation that builds in such uncertainty will be of a what-if nature, helping managers to explore different scenarios, to understand problem structure, and to see where the future is likely to be most sensitive to current choices, or indeed where outcomes are relatively indifferent to such choices. This determines where better information could best help improve decisions and how much to invest in internal research (research about process performance, and in particular, prediction reliability) that yields such information. [Pg.267]

Friedman [12] introduced a Bayesian approach; the Bayes equation is given in Chapter 16. In the present context, a Bayesian approach can be described as finding a classification rule that minimizes the risk of misclassification, given the prior probabilities of belonging to a given class. These prior probabilities are estimated from the fraction of each class in the pooled sample ... [Pg.221]
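A hedged sketch of such a rule for one-dimensional data: class priors are taken as the class fractions of the pooled sample, class-conditional densities are assumed Gaussian, and the class with the largest posterior score is chosen, which minimizes the misclassification risk under 0/1 loss. The data values are invented for illustration:

```python
import numpy as np

def fit_classes(X, y):
    """Per-class mean and variance, with priors taken as class fractions
    of the pooled sample."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(), Xc.var() + 1e-9, len(Xc) / len(X))
    return params

def classify(x, params):
    """Assign the class with the largest log(prior * Gaussian likelihood),
    i.e. the minimum-risk Bayes rule under 0/1 loss."""
    def score(c):
        mu, var, prior = params[c]
        return np.log(prior) - 0.5 * np.log(2 * np.pi * var) - (x - mu) ** 2 / (2 * var)
    return max(params, key=score)

# Pooled training sample (hypothetical measurements, two classes)
X = np.array([1.0, 1.2, 0.9, 1.1, 3.0, 3.2, 2.9])
y = np.array([0, 0, 0, 0, 1, 1, 1])
params = fit_classes(X, y)
label = classify(1.05, params)
```

Here the prior for class 0 is 4/7 and for class 1 is 3/7, directly from the class fractions in the pooled sample, as the excerpt describes.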

In brief, the Bayesian approach uses PDFs of pattern classes to establish class membership. As shown in Fig. 22, feature extraction corresponds to calculating the a posteriori conditional probability or joint probability using the Bayes formula, which expresses the probability that a particular pattern label can be associated with a particular pattern. [Pg.56]

The knowledge required to implement Bayes formula is daunting in that a priori as well as class conditional probabilities must be known. Some reduction in requirements can be accomplished by using joint probability distributions in place of the a priori and class conditional probabilities. Even with this simplification, few interpretation problems are so well posed that the information needed is available. It is possible to employ the Bayesian approach by estimating the unknown probabilities and probability density functions from exemplar patterns that are believed to be representative of the problem under investigation. This approach, however, implies supervised learning where the correct class label for each exemplar is known. The ability to perform data interpretation is determined by the quality of the estimates of the underlying probability distributions. [Pg.57]
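One common way to estimate the unknown class-conditional densities from exemplar patterns, as described above, is kernel density estimation. The following is an illustrative sketch with invented exemplars and an arbitrary bandwidth, not a method prescribed by the source; it assumes supervised learning, i.e. each exemplar carries its correct class label:

```python
import numpy as np

def kde(x, samples, h=0.3):
    """Gaussian-kernel estimate of a density at x, built from exemplar patterns.
    The bandwidth h is chosen ad hoc here; its quality limits interpretation."""
    u = (x - np.asarray(samples)) / h
    return float(np.exp(-0.5 * u ** 2).sum() / (len(samples) * h * np.sqrt(2 * np.pi)))

# Labeled exemplars believed representative of the two pattern classes
class_a = [1.0, 1.1, 0.9, 1.2]
class_b = [3.0, 2.8, 3.1]

# A priori probabilities estimated from the exemplar counts
prior_a = len(class_a) / (len(class_a) + len(class_b))
prior_b = 1.0 - prior_a

x = 1.05
score_a = kde(x, class_a) * prior_a   # unnormalized posterior for class A
score_b = kde(x, class_b) * prior_b   # unnormalized posterior for class B
```

As the excerpt warns, the resulting interpretation is only as good as these estimated priors and densities; with few or unrepresentative exemplars, the classifier degrades accordingly.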

An alternative method, which uses the concept of maximum entropy (MaxEnt), appeared to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one that has the highest prior (intrinsic) probability. Considering that all the points of the map are equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution, with the entropy defined as...

These considerations raise a question: how can we determine the optimal value of n and the coefficients for i < n in (2.54) and (2.56)? Clearly, if the expansion is truncated too early, some terms that contribute importantly to P0(ΔU) will be lost. On the other hand, terms above some threshold carry no information and, instead, only add statistical noise to the probability distribution. One solution to this problem is to use physical intuition [40]. Perhaps a better approach is one based on the maximum likelihood (ML) method, in which we determine the maximum number of terms supported by the provided information. For the expansion in (2.54), calculating the number of Gaussian functions and their mean values and variances using ML is a standard problem solved in many textbooks on Bayesian inference [43]. For the expansion in (2.56), the ML solution for n and the coefficients also exists. Just like in the case of the multistate Gaussian model, this approach appears to improve the free energy estimates considerably when P0(ΔU) is a broad function. [Pg.65]
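For a Gaussian-mixture expansion of this kind, the standard ML fit is the expectation-maximization (EM) algorithm. A minimal one-dimensional sketch follows; the bimodal synthetic data and all settings are illustrative, not taken from the cited work:

```python
import numpy as np

def fit_gmm_1d(x, n_components=2, n_iter=200):
    """Maximum-likelihood fit of a one-dimensional Gaussian mixture by EM."""
    # Simple deterministic initialization from data percentiles
    mu = np.percentile(x, np.linspace(25, 75, n_components))
    var = np.full(n_components, x.var())
    w = np.full(n_components, 1.0 / n_components)
    for _ in range(n_iter):
        # E step: responsibility of each component for each sample
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M step: re-estimate weights, means, and variances
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Synthetic broad, bimodal "energy difference" samples (hypothetical units)
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(1.5, 0.8, 500)])
w, mu, var = fit_gmm_1d(x)
```

Choosing the number of components n can then be handled by refitting for several n and comparing penalized likelihoods, in the spirit of keeping only the terms supported by the data.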

Model uncertainty can be represented by formulating two or more different models to represent alternative hypotheses or viewpoints and then combining the model outputs by assigning weights representing their relative probability or credibility, using either Bayesian or non-Bayesian approaches. [Pg.25]
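A minimal sketch of such weight-based combination: the outputs of alternative models are averaged with credibility weights, and the weighted spread of the outputs gives a simple measure of the between-model uncertainty. All numbers are hypothetical:

```python
import numpy as np

# Outputs of three alternative models for the same quantity (hypothetical values)
predictions = np.array([10.0, 12.0, 15.0])

# Weights expressing the models' relative probability or credibility; must sum to 1
weights = np.array([0.5, 0.3, 0.2])

# Weighted combination of the model outputs
combined = float(np.dot(weights, predictions))

# Weighted spread of the outputs around the combined value:
# a simple summary of between-model (structural) uncertainty
spread = float(np.dot(weights, (predictions - combined) ** 2))
```

In a fully Bayesian treatment the weights would be posterior model probabilities and the models' own predictive variances would be added to the between-model spread; the sketch above shows only the combination step.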

Probability can be defined as a limiting case of a frequency ratio, and from this view the various rules of probability can be derived. An alternative approach is an axiomatic one, which states that there is a quantity called probability associated with events and that it possesses assigned properties. The former is largely the frequentist point of view; the axiomatic approach is shared by Bayesians and non-Bayesians alike. [Pg.74]

