Deterministic rule approaches

A more principled approach is to define a general rule format, a rule processor or engine, and a clearly defined set of rules. This at least has the advantage of engine/rule separation, meaning that the way the rules are processed and the rules themselves are kept separate, which helps with system maintenance, optimisation, modularity and expansion to other languages. One common type of rule is the context-sensitive rewrite rule, which takes the form A → B / C _ D, i.e. A is rewritten as B when preceded by left context C and followed by right context D. [Pg.84]
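As a rough sketch of this engine/rule separation (not code from the source; the rule entries and pronunciations are invented), the rules below are plain data in the A → B / C _ D form and a single generic engine applies whichever rule set it is given:

```python
from typing import List, Optional, Tuple

# (target, replacement, left_context, right_context); None means "any context"
Rule = Tuple[str, str, Optional[str], Optional[str]]

EXAMPLE_RULES: List[Rule] = [
    # hypothetical entries for illustration only
    ("read", "r eh d", None, "yesterday"),   # past-tense reading before "yesterday"
    ("lead", "l eh d", "the", None),         # metal reading after "the"
]

def apply_rules(tokens: List[str], rules: List[Rule]) -> List[str]:
    """Generic engine: scan left to right and fire the first matching rule per token."""
    out = list(tokens)
    for i, tok in enumerate(out):
        left = out[i - 1] if i > 0 else None
        right = out[i + 1] if i + 1 < len(out) else None
        for target, replacement, lctx, rctx in rules:
            if tok == target and (lctx is None or lctx == left) \
                             and (rctx is None or rctx == right):
                out[i] = replacement
                break
    return out

print(apply_rules("i read yesterday".split(), EXAMPLE_RULES))
# -> ['i', 'r eh d', 'yesterday']
```

Because the engine knows nothing about any particular language, adding a new language or maintaining the system reduces to editing the rule data.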

If we consider some sentences with an ambiguous token, we can see quite clearly how collocation information can help (from Yarowsky [507]). [Pg.85]

One simple rule-based system works as follows. Firstly, we require a lexicon which gives the orthography for each word. With this, we can easily check whether two words in the lexicon have the same orthography, and from this compile a list of tokens which have two or more word forms. Next we write by hand a set of rules which fire (i.e. activate) on the presence of a trigger token, for example  [Pg.85]

At run-time, we then move through the sentence left-to-right examining each token in turn. When we find one that is ambiguous (that is, it appears in our list), we look for trigger tokens at other positions in the sentence which will form a collocation with the current token. [Pg.85]

In longer sentences we may, however, find several collocation matches, and so exiting the list on finding the first may result in an incorrect decision. One extension then is to search the entire list and choose the word that gives the most matches. An alternative is to deliberately order the list in some way, with the idea that the choices at the top of the list are more definite indicators than those elsewhere. For example, if we find bass followed immediately by guitar, we can be fairly confident that the musical reading is intended. [Pg.85]
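Putting the pieces of this simple system together, here is a toy Python sketch (my own illustration; the trigger lists are invented, not taken from Yarowsky): it gathers all collocation matches in the sentence, counts them, and falls back on rule order to break ties, combining the two extensions just described.

```python
from collections import Counter
from typing import List

# Ordered trigger rules per ambiguous token (hypothetical entries); earlier rules
# are treated as stronger indicators, following the ordered-list idea above.
TRIGGERS = {
    "bass": [("guitar", "bass_music"), ("player", "bass_music"),
             ("fish", "bass_fish"), ("river", "bass_fish")],
}

def disambiguate(tokens: List[str], index: int) -> str:
    """Choose a word form for tokens[index] from collocations elsewhere in the sentence."""
    token = tokens[index]
    rules = TRIGGERS.get(token)
    if rules is None:
        return token                              # not ambiguous: nothing to do
    context = set(tokens[:index] + tokens[index + 1:])
    matches = [(trig, sense) for trig, sense in rules if trig in context]
    if not matches:
        return token                              # no collocation evidence found
    votes = Counter(sense for _, sense in matches)
    top = votes.most_common(1)[0][1]
    tied = {s for s, c in votes.items() if c == top}
    for _, sense in matches:                      # rule order breaks ties
        if sense in tied:
            return sense

sentence = "he plays bass guitar in a band".split()
print(disambiguate(sentence, sentence.index("bass")))   # -> bass_music
```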


We can now take one of two approaches: (1) construct a probabilistic CA along the lines of the Metropolis Monte Carlo algorithm outlined above (see section 7.1.3.1), or (2) define a deterministic but reversible rule consistent with the microcanonical prescription. As we shall immediately see, however, neither approach yields the expected results. [Pg.359]
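To make approach (1) concrete, the following is a minimal sketch, assuming a one-dimensional Ising-like ring of spins (a toy model of my own, not the system discussed in the source), of a probabilistic, Metropolis-style local update rule:

```python
import math
import random

def metropolis_sweep(spins, J=1.0, T=1.0):
    """One sweep of Metropolis updates over a ring of +/-1 spins."""
    n = len(spins)
    for i in range(n):
        left, right = spins[(i - 1) % n], spins[(i + 1) % n]
        dE = 2.0 * J * spins[i] * (left + right)   # energy change if spin i flips
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[i] = -spins[i]                    # accept the flip probabilistically
    return spins

random.seed(0)
state = [random.choice((-1, 1)) for _ in range(20)]
for _ in range(100):
    metropolis_sweep(state, T=0.5)
print(state)
```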

In the first chapter several traditional types of physical models were discussed. These models rely on the physical concepts of energies and forces to guide the actions of molecules or other species, and are customarily expressed mathematically in terms of coupled sets of ordinary or partial differential equations. Most traditional models are deterministic in nature; that is, the results of simulations based on these models are completely determined by the force fields employed and the initial conditions of the simulations. In this chapter a very different approach is introduced, one in which the behaviors of the species under investigation are governed not by forces and energies, but by rules. The rules, as we shall see, can be either deterministic or probabilistic, the latter leading to important new insights and possibilities. This new approach relies on the use of cellular automata. [Pg.9]

Traditionally, risk characterization is based on a deterministic approach, meaning that the risk is based on a point estimate, usually the worst-case value for each input variable (worst-case NOAELs, assessment factors, and exposure levels). This worst-case approach is intended to ensure that even the most sensitive part of the population is protected under all conditions, and therefore generally overestimates the health risk. In the case of food allergens, the maximum consumption of a food may be multiplied by the maximum concentration of the allergen in this food. This results in the maximum estimate of the intake of the allergen. If this intake is higher than the lowest threshold observed, a possible reaction to the allergen cannot be ruled out. [Pg.390]
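A tiny worked example of this worst-case calculation (all figures invented for illustration) might look like this:

```python
# Worst-case (deterministic) allergen intake estimate; every number here is
# a hypothetical placeholder, not data from the source.

max_consumption_g = 100.0          # hypothetical maximum portion of the food, in grams
max_allergen_mg_per_g = 0.05       # hypothetical maximum allergen concentration
lowest_threshold_mg = 3.0          # hypothetical lowest observed eliciting dose

max_intake_mg = max_consumption_g * max_allergen_mg_per_g   # worst-case intake
print(f"worst-case intake: {max_intake_mg:.1f} mg")

if max_intake_mg >= lowest_threshold_mg:
    print("a reaction cannot be ruled out")   # worst case reaches the lowest threshold
else:
    print("worst case stays below the lowest observed threshold")
```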

In conventional chemical kinetics, time changes of concentrations are described deterministically by differential equations. Strictly, this approach applies to infinite populations only. It is justified, nevertheless, for most chemical systems of finite population size, since uncertainties are limited according to some 1/√N law, where N is the number of molecules involved. In a typical experiment in chemical kinetics N is in the range of 10 or larger, and hence fluctuations are hardly detectable. Moreover, ordinary chemical reactions involve but a few molecular species, each of which is present in a very large number of copies. The converse situation is the rule in molecular evolution: the numbers of different polynucleotide sequences that may be interconverted through replication and mutation exceed by far the number of molecules present in any experiment, or even the total number of molecules available on earth or in the entire universe. Hence the applicability of conventional kinetics to problems of evolution is a subtle question that has to be considered carefully wherever a deterministic approach is used. We postpone this discussion and study those aspects for which the description by differential equations can be well justified. [Pg.154]
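The 1/√N scaling is easy to see numerically; the following short sketch (my own illustration, not from the source) prints the relative fluctuation scale for a few population sizes:

```python
import math

# Relative fluctuations of a population of N molecules scale roughly as 1/sqrt(N),
# which is why deterministic rate equations work so well for macroscopic systems.
for N in (1e2, 1e6, 1e12, 1e20):
    print(f"N = {N:.0e}: relative fluctuation ~ 1/sqrt(N) = {1 / math.sqrt(N):.1e}")
```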

For a conventional facility, the regulations consist of the laws applicable to facilities classified for environmental protection and, where applicable, of the SEVESO II European directive, depending on the quantities of dangerous substances present in the facility. Of course, the design rules of the art have to be applied. The deterministic approach is applied less rigorously in conventional industry, but it is more or less based on the DID (defence-in-depth) principle and on studies of postulated major accidents. [Pg.161]

After link templates have been defined, operational forward, backward, and correspondence analysis rules can be derived following the TGG approach (see d) of Fig. 6.5). If the link templates are restricted to using only context and non-context nodes connected by edges, this derivation can be performed automatically. If further graph transformation language constructs (such as paths, set-valued nodes, etc.) are used, not all operational rules are deterministic. As non-determinism is not supported by our rule execution approach, in this case the TGG rules have to be postprocessed manually. This is done by the tool builder, as only little domain knowledge is required. Another manual task is necessary if attribute assignments have been defined that cannot be inverted to match the derived rule's direction. [Pg.617]

Blocks 7 and 8 are used for forming the function F(t, x), which determines the influence of the external conditions (deterministic or accidental). The deterministic part, as a rule, is formed on the basis of an analytical approach; the accidental part is formed on an experimental basis. [Pg.49]

In the second phase, a less deterministic approach to knowledge implementation was chosen. A set of preliminary rules in the form of "If ... then ..." statements was developed, and the set was grouped into categories such as landfills, above-ground contamination, and removal options and issues. The knowledge engineers rewrote the rules using PRL syntax, and these new rules formed the... [Pg.171]

Prominence prediction by deterministic means is actually one of the most successful uses of non-statistical methods in speech synthesis. This can be attributed to a number of factors, for example the fact that the rules often don't interact, or the fact that many of the rules are based on semantic features (such that even if we did use a data-driven technique we would still have to come up with the semantic taxonomy by hand). Sproat notes [410] that statistical approaches have had only limited success, as the issue (especially in compound noun phrases) is really one of breadth rather than modelling: regardless of how the prominence algorithm actually works, what it requires is a broad and exhaustive list of examples of compound nouns. Few complex generalisations (the kind that machine learning algorithms are good at finding) are present, and once presented with an example, the rules are not difficult to write by hand. [Pg.139]
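As a caricature of this point (toy entries of my own, not taken from Sproat), compound-noun prominence can be treated largely as table lookup over a hand-built, exhaustive list plus a simple default rule:

```python
# Hypothetical table: which element of a noun-noun compound carries the main prominence.
COMPOUND_PROMINENCE = {
    ("park", "street"): 1,      # PARK Street  -> prominence on the first word
    ("park", "avenue"): 2,      # Park AVENUE  -> prominence on the second word
    ("bass", "guitar"): 2,
}

def prominence(noun1, noun2):
    """Return 1 or 2 for the prominent word; default to the first word if unlisted."""
    return COMPOUND_PROMINENCE.get((noun1, noun2), 1)

print(prominence("park", "avenue"))   # 2, from the table
print(prominence("coffee", "cup"))    # 1, falls back to the default rule
```

The work lies almost entirely in collecting the table entries, not in any clever modelling.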

Table 6.3 illustrates the various definitions of these features as used in the different calculative approaches. The fracture mechanics approach in the USA is contained in the ASME Code, Section XI (ASME, 2010b), Appendix G. Recently there have been risk-informed probabilistic analyses performed (Gamble et al., 2009) that have been reduced to the same form as shown in Eq. 6.7, with values of α = 1, β = 61 °C and γ = 1. These risk-informed values have been included in the 2011 edition of the ASME Code as an alternative to the traditional deterministic method. The risk-informed approach evolved out of the risk-informed development of the US alternative PTS Rule. [Pg.145]

The aim of the deterministic approach should be to address plant behaviour under specific predetermined operational states and accident conditions and to apply a specific set of rules in judging design adequacy. [Pg.34]

Two other approaches treat a spatially distributed system as consisting of a grid or lattice. The cellular automaton technique looks at the numbers of particles, or values of some other variables, in small regions of space that interact by set rules that specify the chemistry. It is a deterministic and essentially macroscopic approach that is especially useful for studying excitable media. Lattice gas automata are mesoscopic (between microscopic and macroscopic). Like their cousins, the cellular automata, they use a fixed grid, but differ in that individual particles can move and react through probabilistic rules, making it possible to study fluctuations. [Pg.140]
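A minimal sketch of a deterministic CA for an excitable medium, in the spirit of the classic Greenberg-Hastings rule (my own illustration, not a model from the source): each cell is resting (0), excited (1) or refractory (2); a resting cell becomes excited when a neighbour is excited, and excited and refractory cells relax in turn, so a single seed launches an outward-travelling wave.

```python
def step(grid):
    """One synchronous update of a Greenberg-Hastings-style excitable-medium CA."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            s = grid[r][c]
            if s == 1:
                new[r][c] = 2                      # excited -> refractory
            elif s == 2:
                new[r][c] = 0                      # refractory -> resting
            else:
                # resting -> excited if any von Neumann neighbour is excited
                neighbours = [grid[(r + dr) % rows][(c + dc) % cols]
                              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                new[r][c] = 1 if 1 in neighbours else 0
    return new

grid = [[0] * 9 for _ in range(9)]
grid[4][4] = 1                                     # single excited cell as the seed
for _ in range(4):
    grid = step(grid)
for row in grid:
    print("".join(str(x) for x in row))
```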

In the molecular dynamics approach, one represents the system by a set of particles randomly distributed in space and then solves Newton's equations for the motion of each particle in the system. Particles react according to the rules assumed for the kinetics when they approach each other sufficiently closely. While the particle motion is deterministic, a stochastic aspect is present in the choice of the initial particle positions and velocities. Again, an average over replicate runs will reduce the statistical fluctuations. By necessity, only a very small system over a very short time (picoseconds) can be simulated (Kawczynski and Gorecki, 1992). We will not treat stochastic methods in any further detail here but, instead... [Pg.141]
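The following sketch (my own one-dimensional toy, with a stand-in harmonic force and the reaction step omitted) illustrates the recipe: stochastic initial positions and velocities, followed by deterministic velocity-Verlet integration of Newton's equations.

```python
import random

def md_run(n=10, steps=1000, dt=1e-3, k=1.0, m=1.0, seed=0):
    """Toy 1-D molecular dynamics: random initial state, deterministic time stepping."""
    rng = random.Random(seed)                       # stochastic initial conditions
    x = [rng.uniform(0.0, 10.0) for _ in range(n)]
    v = [rng.gauss(0.0, 1.0) for _ in range(n)]

    def forces(x):
        # stand-in force: each particle is pulled towards the mean position
        mean = sum(x) / len(x)
        return [-k * (xi - mean) for xi in x]

    f = forces(x)
    for _ in range(steps):                          # velocity-Verlet integration
        x = [xi + vi * dt + 0.5 * fi / m * dt * dt for xi, vi, fi in zip(x, v, f)]
        f_new = forces(x)
        v = [vi + 0.5 * (fi + fni) / m * dt for vi, fi, fni in zip(v, f, f_new)]
        f = f_new
    return x, v

positions, velocities = md_run()
print(positions[:3])
```

Averaging such runs over many random seeds is what reduces the statistical fluctuations mentioned above.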

In the qualitative case [15], they are applied in a non-deterministic way. Hence, the approach has been extended by associating with each action a quantity, for instance the cost of an action or the benefit associated with a step [16][6]. This allows us to evaluate each strategy with respect to a certain measure and to select the rule to apply according to that value. In particular, we take the results of threat analysis as input values to the controlling rules, so that they can be combined in the most appropriate way to maximize the result. [Pg.248]
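A minimal sketch of this selection step (action names and numbers are hypothetical, not from the source): each applicable rule carries a quantity, the threat-analysis output weights it, and the controller deterministically picks the maximum.

```python
# Hypothetical candidate actions with an associated benefit and a threat-relevance
# weight supplied by threat analysis; all values are invented for illustration.
candidates = [
    {"rule": "isolate_subnet",  "benefit": 8.0, "threat_relevance": 0.9},
    {"rule": "patch_service",   "benefit": 6.0, "threat_relevance": 0.5},
    {"rule": "log_and_monitor", "benefit": 2.0, "threat_relevance": 1.0},
]

def select_rule(candidates):
    """Deterministic selection: score each candidate and take the maximum."""
    return max(candidates, key=lambda c: c["benefit"] * c["threat_relevance"])

print(select_rule(candidates)["rule"])   # -> isolate_subnet (8.0 * 0.9 is the largest score)
```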

In the past, the preparation of these standards and guidelines was characterized by deterministic approaches developed on the basis of operational experience in conventional plant engineering. Striving to ensure maximum safety, these deterministic approaches, as a rule, started from conservative safety assumptions and were oriented to what is technically feasible rather than to safety-related requirements (e.g. leak-rate tests, containment, Castor contamination). [Pg.143]

Until now, the commonly applied approach to decision-making on nuclear facilities has been based on Deterministic Safety Assessment (DSA), in which a set of rules and requirements is defined in order to ensure a high level of safety. This was done by applying the defense in depth principles and adequate criteria for safety margins (IAEA 2009). [Pg.621]

In France, the reference law for seismic hazard studies is the RFS 2001-01. It provides the details of the methodology specifically for seismic hazard analysis. The RFS 2001-01 is based on the deterministic approach, the most commonly used methodology in the seventies and eighties. The rule is based on a definition of the characteristics of "Maximum Historically Probable Earthquakes", considered to be the most penalizing earthquakes liable to occur over a period comparable to the historical period, or about 1,000 years. It then defines the "Safe Shutdown Earthquakes". In the last few years the probabilistic approach has been used and accepted for re-evaluations of the seismic hazard of existing sites. [Pg.214]

