
Process model parameter verification

The process of field validation and testing of models was presented at the Pellston conference as a systematic analysis of errors (6). In any model calibration, verification, or validation effort, the model user is continually faced with the need to analyze and explain differences (i.e., errors, in this discussion) between observed data and model predictions. This requires assessments of the accuracy and validity of observed model input data, parameter values, system representation, and observed output data. Figure 2 schematically compares the model and the natural system with regard to inputs, outputs, and sources of error. Clearly, there are possible errors associated with each of the categories noted above, i.e., input, parameters, system representation, and output. Differences in each of these categories can have dramatic impacts on the conclusions of the model validation process. [Pg.157]
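
As a minimal illustration of this kind of error analysis, the sketch below compares hypothetical observed data with model predictions and computes a few common residual statistics (the values are invented for illustration, not taken from the cited study).

```python
import numpy as np

# Hypothetical observed field data and corresponding model predictions
observed = np.array([12.1, 14.3, 15.0, 13.8, 16.2, 17.5])
predicted = np.array([11.5, 14.9, 14.2, 14.6, 15.8, 18.1])

residuals = observed - predicted          # prediction errors
bias = residuals.mean()                   # systematic over-/under-prediction
rmse = np.sqrt((residuals ** 2).mean())   # overall magnitude of error
rel_error = np.abs(residuals / observed)  # relative error per observation

print(f"bias = {bias:.3f}, RMSE = {rmse:.3f}, "
      f"max relative error = {rel_error.max():.1%}")
```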

Physical Models versus Empirical Models. In developing a dynamic process model, two distinct approaches can be taken. The first involves models based on first principles, called physical or first-principles models; the second involves empirical models. The conservation laws of mass, energy, and momentum form the basis for developing physical models. The resulting models typically involve sets of differential and algebraic equations that must be solved simultaneously. Empirical models, by contrast, involve postulating the form of a dynamic model, usually as a transfer function, which is discussed below. This transfer function contains a number of parameters that need to be estimated from data. For both physical and empirical models, the most expensive step normally involves verification of their accuracy in predicting plant behavior. [Pg.6]
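
A minimal sketch of the first-principles route, assuming a simple surge tank whose liquid level follows a mass balance (the tank dimensions and valve relation are illustrative, not taken from the source); the empirical route would instead postulate a transfer function and fit its parameters to plant data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# First-principles model of a surge tank: mass balance on the liquid holdup,
# A * dh/dt = F_in - F_out, with the outlet flow F_out = c * sqrt(h).
A, c = 2.0, 0.5          # cross-section [m^2] and valve coefficient (assumed values)
F_in = 0.6               # inlet flow [m^3/min]

def tank(t, y):
    h = max(y[0], 0.0)
    return [(F_in - c * np.sqrt(h)) / A]

sol = solve_ivp(tank, (0.0, 60.0), [1.0])
print("level after 60 min:", round(sol.y[0, -1], 3))

# An empirical alternative would postulate, e.g., a first-order transfer
# function K/(tau*s + 1) and estimate K and tau from step-test data.
```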

The process of research in chemical systems is one of developing and testing different models for process behavior. Whether empirical or mechanistic models are involved, the discipline of statistics provides data-based tools for discriminating between competing candidate models, estimating parameters, and verifying the model. Where empirical models are used, techniques associated with linear regression (linear least squares) are applied, whereas mechanistic modeling contexts most often require nonlinear regression (nonlinear least squares). In either case, the statistical tools are applied most fruitfully in iterative strategies. [Pg.207]
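
A minimal sketch of the two regression settings, using invented data: an empirical model that is linear in its parameters, fitted by linear least squares, and a mechanistic rate expression fitted by nonlinear least squares (the model forms and numbers are illustrative only).

```python
import numpy as np
from scipy.optimize import curve_fit

x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # e.g. substrate concentration
y = np.array([0.9, 1.5, 2.2, 2.9, 3.3])   # observed rate (invented data)

# Empirical model, linear in its parameters: y = a + b*ln(x)
X = np.column_stack([np.ones_like(x), np.log(x)])
(a, b), *_ = np.linalg.lstsq(X, y, rcond=None)

# Mechanistic model, nonlinear in its parameters: y = vmax*x/(Km + x)
def mechanistic(x, vmax, Km):
    return vmax * x / (Km + x)

(vmax, Km), _ = curve_fit(mechanistic, x, y, p0=[3.0, 1.0])
print(f"empirical: a={a:.2f}, b={b:.2f}; mechanistic: vmax={vmax:.2f}, Km={Km:.2f}")
```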

For a process already in operation, there is an alternative approach based on experimental dynamic data obtained from plant tests. The experimental approach is sometimes used when the process is thought to be too complex to model from first principles. More often, however, we use it to find the values of some model parameters that are unknown. Although many of the parameters can be calculated from steady-state plant data, some must be found from dynamic tests (e.g., holdups in nonreactive systems). Additionally, we employ dynamic plant experiments to confirm the predictions of a theoretical mathematical model. Verification is a critical step in a model's development and application. [Pg.545]
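
As a sketch of estimating an unknown dynamic parameter from a plant test, the example below applies the classical 63.2% rule to hypothetical step-response data to extract a first-order time constant (the data and the assumed input step size are invented).

```python
import numpy as np

# Hypothetical step-test data: outlet temperature response to a step in steam flow.
t = np.array([0, 1, 2, 3, 5, 8, 12, 20, 30])                          # min
T = np.array([50.0, 53.1, 55.6, 57.6, 60.5, 63.2, 64.8, 65.8, 66.0])  # degC

dT_final = T[-1] - T[0]            # total change reached at the new steady state
gain = dT_final / 2.0              # assuming a step of 2 units in the input
T_63 = T[0] + 0.632 * dT_final     # 63.2% response level
tau = np.interp(T_63, T, t)        # first-order time constant from the 63.2% rule

print(f"process gain ~ {gain:.1f}, time constant ~ {tau:.1f} min")
```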

Certain techniques for the application of thermodynamics in separation technology are introduced in Chapter 11, for example, the concept of residue curve maps, a general procedure for the choice of suitable solvents for the separation of azeotropic systems, the verification of model parameters prior to process simulation and the identification of separation problems. [Pg.4]

Verification of Model Parameters Prior to Process Simulation... [Pg.492]

In most commercial process simulators, model parameters for pure component properties and binary parameters can be found for a large number of compounds and binary systems. However, the simulator providers repeatedly warn in their software documentation and user manuals that these default parameters should be applied only after careful examination by the company's thermodynamic experts prior to process simulation. For the verification of the model parameters, again a large factual data bank such as the DDB is the ideal tool. The DDB allows all the parameters used for the description of the pure component properties as a function of temperature, and the binary parameters of a multicomponent system, to be checked against the stored experimental data. On the basis of the results for the different pure component properties and phase equilibria, excess enthalpies, activity coefficients at infinite dilution, separation factors, and so on, the experienced chemical engineer can decide whether all the data and parameters are sufficiently reliable for process simulation. [Pg.492]
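
A minimal sketch of this kind of parameter check, using the Antoine equation for a pure component: vapor pressures calculated from commonly tabulated constants for water are compared with a few experimental-style data points (the comparison values are illustrative and are not taken from the DDB).

```python
import numpy as np

# Antoine equation check for a pure component (water):
# log10(P/mmHg) = A - B/(C + T/degC), constants as commonly tabulated for ~1-100 degC.
A, B, C = 8.07131, 1730.63, 233.426

T_exp = np.array([40.0, 60.0, 80.0, 100.0])       # degC
P_exp = np.array([55.3, 149.4, 355.1, 760.0])     # mmHg, illustrative experimental points

P_calc = 10 ** (A - B / (C + T_exp))
dev = 100.0 * (P_calc - P_exp) / P_exp            # percentage deviation per point

for T, d in zip(T_exp, dev):
    print(f"T = {T:5.1f} degC: deviation = {d:+.2f}%")
# Systematic deviations of more than a few percent would suggest that the stored
# parameters should not be trusted for process simulation without refitting.
```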

Verification is the complement of calibration: model predictions are compared to field observations that were not used in calibration or fidelity testing. This is usually the second half of split-sample testing procedures, where the universe of data is divided (either in space or time), with a portion of the data used for calibration/fidelity checking and the remainder used for verification. In essence, verification is an independent test of how well the model (with its calibrated parameters) represents the important processes occurring in the natural system. Although field and environmental conditions are often different during the verification step, the parameters determined during calibration are not adjusted for verification. [Pg.156]
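
A minimal sketch of split-sample testing with synthetic data: a single model parameter is calibrated on the first half of the record and then, without readjustment, the model is verified against the held-out second half.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "universe of data": response y = k * x plus noise, true k = 2.5
x = np.linspace(1.0, 10.0, 20)
y = 2.5 * x + rng.normal(0.0, 0.8, x.size)

# Split in time: first half for calibration, second half for verification
x_cal, y_cal = x[:10], y[:10]
x_ver, y_ver = x[10:], y[10:]

# Calibration: least-squares estimate of the single parameter k
k = (x_cal @ y_cal) / (x_cal @ x_cal)

# Verification: k is NOT re-adjusted; compare predictions with held-out data
rmse_ver = np.sqrt(np.mean((y_ver - k * x_ver) ** 2))
print(f"calibrated k = {k:.3f}, verification RMSE = {rmse_ver:.3f}")
```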

More complex schemes have been proposed, such as second-order Markov chains with four independent parameters (corresponding to a copolymerization with a penultimate effect, that is, an effect of the penultimate member of the growing chain), the nonsymmetric Bernoulli or Markov chains, or even non-Markov models; a few of these will be examined in a later section. Verification of these models calls for knowledge of the distribution of sequences that become longer, the more complex the proposed mechanism. Considering only Bernoulli and Markov processes, it may be said that at the dyad level all models fit the experimental data and hence none can be verified; at the triad level the Bernoulli process can be verified or rejected, while all Markov processes fit; at the tetrad level the validity of a first-order Markov chain can be confirmed, at the pentad level that of a second-order Markov chain, and so on (10). [Pg.23]
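
The discrimination idea can be illustrated with a short simulation (a toy chain, not real sequence data): a sequence generated by a first-order Markov process is analyzed at the triad level and compared with the prediction of a Bernoulli model based on the overall composition.

```python
from collections import Counter
import numpy as np

rng = np.random.default_rng(1)

# Toy copolymer chain generated by a first-order Markov process with
# transition probabilities P(next unit | current unit); comonomers 'A' and 'B'.
p_AA, p_BA = 0.7, 0.5        # P(A|A) and P(A|B), illustrative values
chain = ["A"]
for _ in range(20000):
    p_next_A = p_AA if chain[-1] == "A" else p_BA
    chain.append("A" if rng.random() < p_next_A else "B")

def triad_fractions(seq):
    counts = Counter("".join(seq[i:i + 3]) for i in range(len(seq) - 2))
    total = sum(counts.values())
    return {k: v / total for k, v in sorted(counts.items())}

observed = triad_fractions(chain)

# Bernoulli prediction: every unit placed independently with probability pA
pA = chain.count("A") / len(chain)
predicted_AAA = pA ** 3

print(f"observed AAA triad fraction: {observed['AAA']:.3f}")
print(f"Bernoulli prediction:        {predicted_AAA:.3f}")
# A clear discrepancy at the triad level is the kind of evidence that lets
# the Bernoulli model be rejected in favour of a Markov description.
```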

The design or simulation of FCC units involves numerically solving the above 21 equations and relations (7.25) to (7.45). The solution process will be discussed later. For the simulation of industrial units and the verification of this model against industrial data, the majority of these 21 equations are used to calculate various parameters in the 10 equations numbered (7.29) to (7.38). Specifically, equations (7.33) to (7.35) compute the concentration and temperature profiles in the bubble phase of the reactor and equation (7.38) computes the temperature profile in the regenerator. This leaves the main equations (7.29) to (7.32), (7.36), and (7.38) as six coupled equations in the six state variables x1D, x2D, YRD, ..., and YQD. [Pg.439]
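
The solution step itself amounts to solving a set of coupled nonlinear equations for the state variables; the sketch below uses a two-variable toy system (not the actual FCC equations) only to illustrate the numerical procedure.

```python
import numpy as np
from scipy.optimize import fsolve

# Toy stand-in for a set of coupled steady-state model equations f(x) = 0.
# The real FCC model couples many more equations in the dimensionless
# concentrations and temperatures; this only illustrates the solution step.
def residuals(vars):
    x1, y1 = vars
    eq1 = x1 - 0.8 * np.exp(-2.0 / y1)   # conversion / temperature coupling
    eq2 = y1 - 1.0 - 0.5 * x1            # energy balance closure
    return [eq1, eq2]

solution = fsolve(residuals, x0=[0.5, 1.2])
print("x1 =", round(solution[0], 4), " y1 =", round(solution[1], 4))
```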

Much more work on the experimental verification and quantification of this physical adsorption process has been conducted in our own laboratory for the chloroplatinic acid (H2PtCl6, or CPA)/alumina system [3,4]. The Revised Physical Adsorption (RPA) model [4], with which all known sets of Pt/alumina adsorption data can be satisfactorily simulated with no adjustable parameters, is a result of these efforts. The basis of the RPA model is the a priori calculation of adsorption equilibrium constants, seen as the center regime of Fig. 1. [Pg.45]

The comprehensive analysis of the physical, chemical, and electrochemical processes occurring in solid-electrolyte gas sensors makes it possible to verify that the mathematical models adequately describe the real gas sensors. Processing the results of multiple experimental measurements of the gas sensors consists of identifying the type of the experimental data distribution, estimating the parameters of the established distribution, and verifying the adequacy of the mathematical model to the real sensor. [Pg.83]
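
A minimal sketch of the statistical processing described here, using synthetic repeated sensor readings: a distribution type is postulated, its parameters are estimated, and the adequacy of the assumed model is tested (the data and the choice of a normal distribution are illustrative).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic repeated measurements of a gas-sensor signal (arbitrary units)
measurements = rng.normal(loc=412.0, scale=6.5, size=80)

# Step 1: postulate a distribution type and estimate its parameters
mu, sigma = measurements.mean(), measurements.std(ddof=1)

# Step 2: test the adequacy of the postulated model (Kolmogorov-Smirnov test;
# a simplified check, since estimating the parameters from the same data
# makes the test only approximate)
statistic, p_value = stats.kstest(measurements, "norm", args=(mu, sigma))

print(f"estimated mu = {mu:.1f}, sigma = {sigma:.1f}")
print(f"KS statistic = {statistic:.3f}, p-value = {p_value:.3f}")
# A small p-value would indicate that the assumed distribution (and hence the
# statistical model of the sensor) is not adequate for the measured data.
```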

Experimental Verification of Adsorption Isotherms and Linear Least-Squares Analysis. If gas A is exposed to a very high surface area solid catalyst (i.e., ≈100 m²/g) in a closed chamber, then a sensitive electronic balance should provide measurements of the increase in catalyst mass at a given gas pressure pA as active sites become occupied. A flow control valve is necessary to maintain constant pressure pA while measurements are made, because adsorption of gas molecules on the catalytic surface will cause a decrease in gas pressure if additional gas is not introduced into the system. Knowledge of the gas density at STP conditions and the additional mass of gas from the flow control valve required to maintain constant pressure pA allows one to calculate the volume of adsorbed gas per initial mass of catalyst, vA. Experiments are repeated at different gas pressures. The raw data correspond to (pA, vA) pairs that can be modeled via the Langmuir isotherm to extract two important parameters of the adsorption process. [Pg.386]
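
A minimal sketch of the linear least-squares analysis, using invented (pA, vA) data: the Langmuir isotherm is linearized so that a straight-line fit yields the monolayer capacity and the adsorption equilibrium constant.

```python
import numpy as np

# Illustrative (pA, vA) pairs: gas pressure [kPa] vs adsorbed volume per gram
# of catalyst [cm^3/g] (invented data, roughly Langmuir-shaped).
pA = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
vA = np.array([8.3, 13.8, 20.6, 27.4, 32.7, 36.2])

# Langmuir isotherm: vA = vm*K*pA / (1 + K*pA)
# Linearized form:   pA/vA = 1/(vm*K) + pA/vm  -> straight line in pA
y = pA / vA
slope, intercept = np.polyfit(pA, y, 1)

vm = 1.0 / slope            # monolayer capacity
K = slope / intercept       # adsorption equilibrium constant

print(f"vm ~ {vm:.1f} cm^3/g, K ~ {K:.3f} kPa^-1")
```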

The term microkinetics is understood to mean the kinetics of a reaction that are not masked by transport phenomena and to refer to a series of reaction steps. For the investigation of intermediary metabolism, idealized conditions are chosen that often do not correspond to the real conditions of engineering processes. This fact makes it difficult to transfer microkinetic data to technical processes. For the purposes of technologically oriented research and the development of a process to technical ripeness, it is often sufficient to know quantitatively how a process runs without necessarily knowing why. (Macrokinetics, however, must be avoided, as they are scale-dependent.) Mathematical formulations are needed that reproduce the kinetics adequately for the purpose but are as simple as possible and contain as few parameters as possible. Today, even though electronic computers greatly reduce the labor of computation, the criterion of simplicity remains important because of the problem of experimental verification. The iterative nature of the process of building an adequate model is an important point that will be considered in greater detail in Sect. 2.4. [Pg.45]

