
State sequence network

The scheduling problem that is considered in this chapter can be stated as follows. Given... [Pg.14]


In order to facilitate understanding of this concept, a literature problem (Ierapetritou and Floudas, 1998) is presented in Fig. 2.2. [Pg.16]

For comparison, the STN and SSN corresponding to this example are shown in Fig. 2.3a, b, respectively. [Pg.16]

In their formulation, Ierapetritou and Floudas (1998) separated task and unit events by assigning corresponding binary variables to tasks, wv(i, p), and to units, yv(j, p), respectively. This led to an overall number of binary variables of P(Ni + Nj), where P is the number of time points, whilst Ni and Nj are the numbers of tasks and units, respectively. [Pg.16]


Very similar to the STN is the state sequence network (SSN) that was proposed by Majozi and Zhu (2001). The fundamental, and perhaps subtle, distinction between the SSN and the STN is that the tasks are not explicitly declared in the SSN, but indirectly inferred by the changes in states. A change from one state to another, which is simply represented by an arc, implies the existence of a task. Consequently, the mathematical formulation that is founded on this recipe representation involves only states and not tasks. The strength of the SSN lies in its ability to utilize information pertaining to tasks and even the capacity of the units in which the tasks are conducted by simply tracking the flow of states within the network. Since this representation and its concomitant mathematical formulation constitute the cornerstone of this textbook, it is presented in detail in the next chapter. [Pg.10]
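As a concrete (and deliberately tiny) illustration of this idea, the sketch below represents an SSN as nothing more than a set of states and a set of arcs, and derives the implied tasks from the arcs. The recipe and all names are hypothetical; this is not the formulation developed in the following chapters.

```python
# A minimal sketch of the SSN idea: only states are declared, and every arc
# between two states is read as an implied task. All names are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Arc:
    source: str      # state consumed
    target: str      # state produced

# States of a toy recipe: a feed, an intermediate and a product
states = {"feed_A", "intermediate_B", "product_C"}

# The network itself is nothing more than states plus arcs between them
ssn = [Arc("feed_A", "intermediate_B"), Arc("intermediate_B", "product_C")]

def implied_tasks(network):
    """Each change of state (arc) implies the existence of a task."""
    return [f"task: {a.source} -> {a.target}" for a in network]

print(implied_tasks(ssn))
# ['task: feed_A -> intermediate_B', 'task: intermediate_B -> product_C']
```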

Fig. 1.10 (a) State task network vs (b) state sequence network for the illustrative example... [Pg.11]

In this chapter, the state sequence network (SSN) representation has been presented. Based on this representation, a continuous-time formulation for the scheduling of multipurpose batch processes is developed. The representation involves only states, which are characteristic of the units and tasks present in the process. Owing to the elimination of the tasks and units encountered in formulations based on the state task network (STN), the SSN-based formulation leads to a much smaller number of binary variables and fewer constraints. This, in turn, leads to much shorter CPU times, as substantiated by both examples presented in this chapter, and the advantage becomes more apparent as the problem size increases. In the second literature example, which involved a multipurpose plant producing two products, this formulation required 40 binary variables and gave a performance index of 1513.35, whilst other continuous-time formulations required between 48 (Ierapetritou and Floudas, 1998) and 147 binary variables (Zhang, 1995). [Pg.37]

The presented mathematical formulation is an extension of the scheduling model proposed by Majozi and Zhu (2001), which uses the state sequence network (SSN) representation. The formulation is based on an uneven discretization of the time horizon (see Chapter 2, Fig. 2.4), as shown in Fig. 10.1. A time point corresponds to the beginning of a particular task and is not necessarily equidistant from the preceding and the succeeding time points, unlike in discrete-time formulations. The... [Pg.222]
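The sketch below shows one way such an uneven, continuous-time grid can be declared in a MILP, assuming the open-source PuLP library purely for illustration. This is not the authors' model; all names, durations and constraints are hypothetical and only meant to show that the time points are decision variables rather than a fixed grid.

```python
# Minimal sketch of an uneven (continuous-time) grid; hypothetical data.
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

H = 12.0                                  # scheduling horizon (h), assumed
P = 4                                     # number of time points, assumed
states = ["s1", "s2", "s3"]               # hypothetical SSN states
tau = {"s1": 3.0, "s2": 2.0, "s3": 1.5}   # hypothetical processing times (h)

prob = LpProblem("uneven_time_grid_sketch", LpMinimize)

# Time points are continuous decision variables within the horizon, so their
# spacing is chosen by the solver rather than fixed in advance.
t = [LpVariable(f"t_{p}", lowBound=0, upBound=H) for p in range(P)]
for p in range(P - 1):
    prob += t[p] <= t[p + 1]

# Binary variables attached to states at time points (the SSN notion of
# tracking states rather than tasks): y[s, p] = 1 if the task producing the
# change to state s starts at time point p.
y = {(s, p): LpVariable(f"y_{s}_{p}", cat=LpBinary)
     for s in states for p in range(P - 1)}

for s in states:
    prob += lpSum(y[s, p] for p in range(P - 1)) >= 1   # each task starts somewhere
for p in range(P - 1):
    for s in states:
        # the next time point cannot occur before a task started at p finishes,
        # which is what makes the grid uneven
        prob += t[p + 1] >= t[p] + tau[s] * y[s, p]

prob += t[P - 1]                          # placeholder objective: finish early
prob.solve()
print([tp.value() for tp in t])           # non-equidistant time points
```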

Chapter 2 introduces the reader to the basis of all the mathematical techniques presented in this textbook. The mathematical techniques are founded on a recipe representation known as the state sequence network (SSN), which allows states to dominate the analysis, thereby reducing the binary dimension. [Pg.291]

State Task Network (STN) representation of operating sequence for binary and multicomponent batch distillation... [Pg.404]

Keywords: sequence synthesis, multicomponent, rectification body method, state task network [Pg.91]

The same separation steps (same feed and product compositions) can occur in different sequences. This can be seen in Fig. 1, where the first and the second sequence have the first separation in common. This property is used in a superstructure to reduce the complexity of multicomponent systems. The state task network [3] is applied. In this superstructure, every possible composition that can be attained is called a state. The states represent the feed, the possible intermediate products and the products of the separation sequence. [Pg.92]

The application of the state task network requires the specification of linearly independent products. If the products are linearly dependent, which in this case means that they could be obtained by mixing other products, the state task network formulation is not applicable, because the tasks would then not be independent of the sequence. The main advantage of the superstructure is that the number of tasks grows only with the third power of the number of products, compared to the exponential growth of the number of sequences (Fig. 1) [2]. [Pg.93]
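To make the two growth rates concrete, the short sketch below (my own illustration of the counting argument, not code from [2]) assumes sharp splits of a mixture whose N products are ranked, e.g. by relative volatility: states are then contiguous component ranges, tasks are binary splits of a state, and a sequence is a complete tree of splits. Under these assumptions the task count grows roughly as N³/6, whereas the number of distinct sequences is the Catalan number C(N−1), which grows exponentially.

```python
# Counting states, tasks and sequences for sharp splits of an ordered
# N-component mixture (illustrative assumption; see lead-in above).
from math import comb

def n_states(n: int) -> int:
    """Contiguous component ranges i..j of an ordered n-component feed."""
    return n * (n + 1) // 2

def n_tasks(n: int) -> int:
    """Each state containing L components admits L-1 sharp splits: ~n**3/6."""
    return sum((n - length + 1) * (length - 1) for length in range(2, n + 1))

def n_sequences(n: int) -> int:
    """Distinct sharp-split sequences = Catalan(n-1): exponential growth."""
    return comb(2 * (n - 1), n - 1) // n

for n in (3, 5, 8, 10):
    print(f"N={n:2d}  states={n_states(n):3d}  tasks={n_tasks(n):4d}  "
          f"sequences={n_sequences(n):6d}")
```

Even at N = 10 there are only 165 tasks in this counting, against 4862 possible sequences, which is precisely the motivation for working with the task superstructure.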

Given the word and phone sequence, we can construct an HMM network to recognise just those words. Recognition is obviously performed with perfect accuracy, but in doing the recognition search we also determine the most likely state sequence, and this gives us the phone and word boundaries. This operation is often called forced alignment. [Pg.479]
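As a concrete illustration of the decoding step, the sketch below (my own minimal example, not code from the source) runs the Viterbi algorithm on a toy left-to-right HMM; the returned state index per frame is the most likely state sequence, from which phone and word boundaries can be read off. All probabilities and dimensions are invented for illustration.

```python
# Viterbi decoding on a toy left-to-right HMM (forced-alignment style).
import numpy as np

def viterbi(log_A, log_B):
    """log_A: (S,S) transition log-probs; log_B: (T,S) emission log-probs per
    frame. Returns the most likely state index for each frame."""
    T, S = log_B.shape
    delta = np.full((T, S), -np.inf)      # best log-prob ending in each state
    back = np.zeros((T, S), dtype=int)    # argmax predecessor per state
    delta[0] = log_B[0]
    delta[0, 1:] = -np.inf                # force the path to start in state 0
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # (prev, current)
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[t]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):         # trace the best path backwards
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Toy 3-state left-to-right model (e.g. three phones of one word), 5 frames
log_A = np.log(np.array([[0.6, 0.4, 0.0],
                         [0.0, 0.6, 0.4],
                         [0.0, 0.0, 1.0]]) + 1e-12)
log_B = np.log(np.array([[0.8, 0.1, 0.1],
                         [0.7, 0.2, 0.1],
                         [0.1, 0.8, 0.1],
                         [0.1, 0.1, 0.8],
                         [0.1, 0.1, 0.8]]))
print(viterbi(log_A, log_B))   # state index per frame -> boundaries
```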

Due to the nature of the problem, there is not a one-to-one match between separation tasks and the columns that perform a given separation. Moreover, a given feasible sequence of separation tasks can be performed by different sequences of thermodynamically equivalent columns. In this paper we present a task-based superstructure with characteristics intermediate between the pure State Task Network (STN) (Yeomans and Grossmann, 1999), in which all the separation tasks are explicitly enumerated, and a superstructure in which the equipment is determined beforehand. Figure 2 shows the superstructure for a mixture of 5 components. Although the picture by itself... [Pg.60]

The two conditions stated above do not assure the occurrence of gelation. The final and sufficient condition may be expressed in several ways, not unrelated to one another. First, let structural elements be defined in an appropriate manner. These elements may consist of primary molecules, or of chains as defined above, or they may consist of the structural units themselves. The necessary and sufficient condition for infinite network formation may then be stated as follows: the expected number of elements united to a given element selected at random must exceed two. Stated alternatively, in a manner which recalls the method used in deriving the critical conditions expressed by Eqs. (7) and (11), the expected number of additional connections for an element known to be joined to a previously established sequence of elements must exceed unity. However the condition is stated, the issue is decided by the frequency of occurrence and functionality of branching units (i.e., units which are joined to more than two other units) in the system, on the one hand, as against terminal chain units (joined to only one unit), on the other. [Pg.361]
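In Flory's standard notation, with f-functional branch units and branching probability α (symbols assumed here; they are not defined in the excerpt above), the second form of the condition can be written compactly as:

```latex
% Assumed notation (not from the excerpt): alpha = probability that a given
% functional group of a branch unit leads, through a chain, to another branch
% unit; f = functionality of the branch unit.
\[
  \underbrace{\alpha\,(f-1)}_{\substack{\text{expected additional connections of an element}\\
                                        \text{already joined to the growing network}}}
  \;>\; 1
  \qquad\Longrightarrow\qquad
  \alpha_c \;=\; \frac{1}{f-1}.
\]
```

That is, gelation becomes possible once the branching probability exceeds the critical value α_c = 1/(f − 1).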

The spectrometer supports phase cycling, asynchronous sequence implementation, and parameter-array experiments. Thus, most standard solid-state NMR experiments are feasible, including CPMAS, multiple-pulse 1H decoupling such as TPPM, 2D experiments, multiple-quantum NMR, and so on. In addition, the focus of development is on extension of, or modification to, the hardware and/or the software, in the spirit of enabling users to put their own new ideas into practice. In this paper, several such examples have been described. They include the compact NMR and MRI systems, active compensation of RF pulse transients, implementation of a network analyzer, dynamic receiver-gain increment [31], and so on. [Pg.391]

