Big Chemical Encyclopedia


Chance node

There are two types of nodes in the decision tree: decision nodes (rectangular) and chance nodes (circular). Decision nodes branch into a set of possible actions, while chance nodes branch into all possible results or situations. [Pg.179]

The probabilities of each branch from chance nodes are then estimated and noted on the diagram. [Pg.180]

For chance nodes it is not possible to foretell the outcome, so each result is considered with its corresponding probability. The value of a chance node is the statistical (weighted) average of all its results. [Pg.180]

In the example, the first decision is whether or not to appraise. If one appraises, then there are three possible outcomes represented by the chance node: the high, medium, or low STOMP. On the branches from the chance node, the estimated probability of these outcomes is noted (0.33 in each case). The sum of the probabilities on the branches... [Pg.180]
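As a minimal sketch of the weighted-average rule stated above, the value of a chance node is the sum of each outcome's payoff times its probability. The STOMP payoff values below are hypothetical illustrations, not taken from the text:

```python
# Value of a chance node: the probability-weighted average of its outcomes.
# Payoff values are hypothetical; the last probability is 0.34 so that the
# probabilities on the branches sum to 1.
branches = [
    ("high STOMP",   0.33, 120.0),
    ("medium STOMP", 0.33,  60.0),
    ("low STOMP",    0.34,  10.0),
]
chance_node_value = sum(p * payoff for _, p, payoff in branches)
print(chance_node_value)
```

This weighted average (here 62.8) is the number that would be written at the chance node when rolling the tree back.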

The decision problem is represented by the decision tree in Figure 5, in which open circles represent chance nodes, squares represent decision nodes, and the black circle is a value node. The first decision node is the selection of the sample size n used in the experiment, and c represents the cost per observation. The experiment will generate random data values y that have to be analyzed by an inference method a. The difference between the true state of nature, represented by the fold changes θ = (θ1, ..., θG), and the inference will determine a loss L(·) that is a function of the two decisions n and a, the data, and the experimental costs. There are two choices in this decision problem: the optimal sample size and the optimal inference. [Pg.126]

The solutions are found by averaging out and folding back (Raiffa and Schlaifer, 1961), so that we compute the expected loss at the chance nodes (open circles), given everything to the left of the node. We determine the best actions by minimizing the expected loss at the decision nodes. The first decision is the choice of the inference method a, and the optimal decision a* (or Bayes action) is found by minimizing the expected loss E[L(n, θ, y, a, c)], where the expectation is with respect to the conditional distribution of θ given n and y. The expected loss evaluated at the Bayes action a* is called the Bayes risk and we denote it by... [Pg.126]
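The averaging-out-and-folding-back procedure can be sketched as a recursive rollback of the tree. The tree structure and payoffs below are hypothetical; the sketch maximizes value at decision nodes, whereas for a loss function (as in the text) one would minimize instead:

```python
# Averaging out and folding back: evaluate a decision tree right to left.
# Chance nodes are averaged out (expectation over branches); decision nodes
# are folded back (best branch chosen). All numbers are hypothetical.

def rollback(node):
    kind = node["kind"]
    if kind == "terminal":
        return node["value"]
    if kind == "chance":
        # average out: probability-weighted expectation over the branches
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":
        # fold back: pick the action with the best value (max for payoffs;
        # use min instead when working with expected losses)
        return max(rollback(child) for child in node["options"])
    raise ValueError(f"unknown node kind: {kind}")

# A decision between a gamble (chance node) and a certain payoff.
tree = {"kind": "decision", "options": [
    {"kind": "chance", "branches": [
        (0.5, {"kind": "terminal", "value": 100.0}),
        (0.5, {"kind": "terminal", "value": 20.0}),
    ]},
    {"kind": "terminal", "value": 50.0},
]}
print(rollback(tree))
```

Here the chance node averages out to 60.0, which beats the certain 50.0, so folding back selects the gamble.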

C = {C1, ..., Cn} is a set of chance nodes which represent the relevant uncertain factors of the decision problem. Chance nodes are represented by circles. [Pg.1242]

Arcs in A have different meanings according to their targets. We can distinguish conditional arcs (into chance and value nodes), where those that have a chance node as target represent probabilistic dependencies, and informational arcs (into decision nodes), which imply time precedence. [Pg.1242]

The numerical component (or quantitative component) consists in evaluating the different links in the graph. Namely, each conditional arc which has a chance node C as target is quantified by a conditional probability distribution of C in the context of its parents. Such conditional probabilities should respect the probabilistic normalization constraints. Chance nodes represent uncertain variables characterizing the decision problem. Each decision alternative may have several consequences according to the uncertain variables. The set of consequences is characterized by a utility function. In IDs, consequences are represented by the different combinations of the value node's parents. Hence, each value node is quantified by a utility function, denoted by U, in the context of its parents. The definition of the numerical component is in general done by experts and decision makers. [Pg.1242]
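A small sketch of this numerical component, with entirely hypothetical numbers: a chance node with one decision parent is quantified by a conditional distribution that satisfies the normalization constraint, and the value node by a utility table over its parents' combinations:

```python
# Numerical component of an influence diagram (hypothetical numbers):
# a chance node C with decision parent D gets P(C | D); the value node,
# with parents (D, C), gets a utility table U(D, C).

# Conditional probability table P(C | D)
cpt = {
    "act":  {"good": 0.7, "bad": 0.3},
    "wait": {"good": 0.4, "bad": 0.6},
}
# Normalization constraint: each conditional distribution sums to 1.
for dist in cpt.values():
    assert abs(sum(dist.values()) - 1.0) < 1e-9

# Utility of the value node in the context of its parents (D, C)
utility = {("act", "good"): 90.0, ("act", "bad"): -10.0,
           ("wait", "good"): 40.0, ("wait", "bad"): 20.0}

# Expected utility of each decision alternative
eu = {d: sum(p * utility[(d, c)] for c, p in dist.items())
      for d, dist in cpt.items()}
best = max(eu, key=eu.get)
print(best, eu[best])
```

With these numbers, "act" has expected utility 60.0 against 28.0 for "wait", so it would be the preferred alternative.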

Find a chance node i which is a direct predecessor to the value node such that it has no decision node as successor. [Pg.1242]

Find a chance node j which is a direct successor of i such that there is no other directed path between i and j, and reverse the arc between i and j. If i has any other successors, repeat step 6. [Pg.1242]

Remove the chance node i with the arc reversal transformation (probability table transformation). [Pg.1242]
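The arc reversal transformation used in the steps above rests on Bayes' theorem: reversing the arc i → j replaces P(i) and P(j | i) with the marginal P(j) and the reversed conditional P(i | j). A minimal two-node sketch, with hypothetical probabilities:

```python
# Arc reversal (sketch): given P(i) and P(j | i), compute P(j) and P(i | j)
# by Bayes' theorem. All probabilities are hypothetical.
p_i = {"yes": 0.2, "no": 0.8}                     # prior P(i)
p_j_given_i = {"yes": {"pos": 0.9, "neg": 0.1},   # P(j | i)
               "no":  {"pos": 0.3, "neg": 0.7}}

# New marginal P(j), summing out i
p_j = {j: sum(p_i[i] * p_j_given_i[i][j] for i in p_i)
       for j in ("pos", "neg")}

# Reversed conditional P(i | j) = P(i) * P(j | i) / P(j)
p_i_given_j = {j: {i: p_i[i] * p_j_given_i[i][j] / p_j[j] for i in p_i}
               for j in ("pos", "neg")}
print(p_j, p_i_given_j)
```

After the reversal, node i depends on j rather than the other way around, which is what lets step 7 delete a chance node whose only remaining successor was the value node.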

Decision tree technique A typical decision tree is shown in Fig. 11/4.3.8-1. Here there are two major node types: one is the decision node and the other is the probabilistic or chance node. The figure shows the decision regarding cost versus risk. [Pg.151]

Branch point/chance node Referring to Fig. V/2.1.1-1, it can be noted that there is a branching point in the event tree. This is usually designated by a circle (not shown) at the end of a branch, indicating the occurrence of an unknown event. This is also called a chance node. [Pg.309]

Branch A possible event is represented by a line segment, preceded by a branch point or chance node, that is designated as a branch. It is a subset of the sample space for all possible outcomes associated with a random variable. These are represented by thick lines in Fig. V/2.1.1-1 (and Fig. V/2.1.1-2). [Pg.310]

Chance nodes—Accidental situation, Action effect, Structural damage, Geotechnical conditions, and Structural properties. [Pg.2238]


See other pages where Chance node is mentioned: [Pg.181]    [Pg.420]    [Pg.2188]    [Pg.2188]    [Pg.1242]    [Pg.1242]    [Pg.1243]    [Pg.1244]    [Pg.1244]    [Pg.1244]    [Pg.1244]    [Pg.1244]    [Pg.1244]   
See also in source: [Pg.307, Pg.309]




© 2024 chempedia.info