
Planning of simulation experiments

By definition, simulation models are descriptive rather than normative. Hence, to reveal relevant information for decision makers, simulation experiments have to be conducted. How to plan experiments and how to extract concise information from experimental results is the subject of the broad field of experimental design. The general aim is to extract as much information as possible about the simulated system with as little computational effort as possible. As these two aims are contradictory, general assumptions about the information to be extracted are made, and the best way to extract this information with respect to the associated computational effort is sought. This is typically done by assuming a specific type of relation between the variables of the studied system, constituting a mathematical model (e.g. a linear regression model). [Pg.170]
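As a minimal sketch of this idea (not taken from the cited source), the following fits a linear regression metamodel to the outputs of a small 2² factorial set of simulation runs; the simulator `run_simulation` and its two factors are hypothetical stand-ins for an expensive model.

```python
# Sketch: fit a linear regression metamodel to simulation output from a
# 2^2 factorial design. The simulator and factor names are hypothetical.
import itertools
import numpy as np

def run_simulation(arrival_rate, buffer_size, seed=0):
    """Stand-in for an expensive simulation model (hypothetical)."""
    rng = np.random.default_rng(seed)
    # e.g. a response that depends roughly linearly on both factors
    return 2.0 + 1.5 * arrival_rate - 0.8 * buffer_size + rng.normal(0, 0.1)

# Coded factor levels (-1 / +1) for a full 2^2 factorial design
levels = [-1.0, 1.0]
design = list(itertools.product(levels, levels))

# Run the simulation once per design point and collect the responses
y = np.array([run_simulation(x1, x2, seed=i) for i, (x1, x2) in enumerate(design)])

# Design matrix with intercept, main effects and interaction term
X = np.array([[1.0, x1, x2, x1 * x2] for x1, x2 in design])

# Least-squares estimates of the regression coefficients
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, effect(x1), effect(x2), interaction:", beta)
```

The fitted coefficients summarize how each factor (and their interaction) affects the response, which is exactly the kind of concise information such a regression metamodel is meant to extract from few runs.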

At the beginning of the planning process, it has to be clarified which questions should be answered by the study. In most cases, the research questions are qualitative, unspecific statements rather than precise algebraic formulations. Hence, such statements have to be operationalized into mathematically manageable terms, i.e. measures have to be defined that describe the aspects of the system under study that are to be investigated. Such measures are called variables. In the context of experimental studies, variables have a number of attributes depending on their purpose. Table 4.9 gives an overview of the most common attributes of variables. [Pg.170]
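As a loose illustration of such operationalization, variables of a study might be recorded together with a few attributes; the attribute set below is an assumption for the sketch, not a reproduction of Table 4.9.

```python
# Sketch: recording variables of a simulation study with some common
# attributes (role, measurement scale, controllability). The attribute
# set is illustrative, not taken from Table 4.9.
from dataclasses import dataclass

@dataclass
class Variable:
    name: str
    role: str           # "response" or "explanatory"
    scale: str          # e.g. "nominal", "ordinal", "interval", "ratio"
    controllable: bool  # can the experimenter set it, or is it only observed?

variables = [
    Variable("mean_throughput", role="response", scale="ratio", controllable=False),
    Variable("buffer_size", role="explanatory", scale="ratio", controllable=True),
    Variable("dispatch_rule", role="explanatory", scale="nominal", controllable=True),
]

responses = [v for v in variables if v.role == "response"]
print([v.name for v in responses])
```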

Basically, variables are distinguished into response variables and explanatory variables, depending on the role defined by the study's questions and hypotheses. Explanatory [Pg.170]

From the underlying questions of a study and their operationalization, it follows how variables are defined and which role they can take. When designing and analysing simulation experiments, typically multiple response and explanatory variables exist. When [Pg.171]


Most FMSs have some part-buffering capability. This may not be for scheduling reasons but for technological, that is, process-planning, reasons (e.g., the part must cool before an accurate inspection procedure is performed). Some level of buffering is also useful and necessary for reliability reasons. (The actual number of buffer store locations should be established on the basis of simulation and experience.)...
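A minimal sketch of what such a sizing simulation could look like, under simplified assumptions: a discrete-time buffer with hypothetical arrival and service probabilities, used to estimate how often parts are blocked for different buffer sizes.

```python
# Sketch: crude Monte Carlo estimate of how often a part buffer of a given
# size overflows, as one input to choosing the number of buffer locations.
# Arrival and processing probabilities are hypothetical.
import random

def simulate_buffer(buffer_size, n_steps=100_000, p_arrival=0.45, p_service=0.5, seed=1):
    random.seed(seed)
    queue = 0
    blocked = 0
    for _ in range(n_steps):
        if random.random() < p_arrival:                 # a part arrives at the buffer
            if queue < buffer_size:
                queue += 1
            else:
                blocked += 1                            # buffer full: part is blocked
        if queue > 0 and random.random() < p_service:   # machine removes a part
            queue -= 1
    return blocked / n_steps

for size in (2, 4, 8, 16):
    print(f"buffer size {size:2d}: blocking fraction ~ {simulate_buffer(size):.4f}")
```

In practice the blocking estimate would be combined with experience of the actual process, as the excerpt notes.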

Diffusion of the probe in the gas and polymer phases, and adsorption on the support and on the polymer surface (both types of adsorption have nonlinear isotherms), simultaneously play an important role in IGC experiments and must be accounted for properly. An extensive computational program is planned to simulate the individual processes and to assess their influence on chromatographic behavior. In a recent paper, simulated behavior of three types of system was described (9). In the simplest case, only diffusion in the gas phase was operative. This case corresponds to elution of an ideal marker. Simultaneous effects of gaseous diffusion and partitioning of the probe between the phases were simulated next, assuming an instantaneous... [Pg.35]
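As a hedged illustration of the simplest case described there (gas-phase diffusion only, i.e. elution of an ideal marker), the following one-dimensional advection-diffusion finite-difference calculation is a sketch; the column length, carrier velocity, diffusivity and grid are all assumed values, not those of the cited study.

```python
# Sketch: explicit finite-difference solution of 1-D advection-diffusion,
# mimicking elution of an ideal (non-interacting) marker through a column.
# Column length, velocity, diffusivity and grid are hypothetical values.
import numpy as np

L, n = 0.3, 300                  # column length (m), grid points
dx = L / n
u, D = 0.01, 1e-5                # carrier velocity (m/s), gas diffusivity (m^2/s)
dt = 0.2 * min(dx / u, dx**2 / (2 * D))   # conservative stable time step

c = np.zeros(n)
c[:5] = 1.0                      # narrow injected pulse at the inlet

t, outlet = 0.0, []
while t < 60.0:                  # simulate one minute of elution
    # upwind advection + central-difference diffusion, explicit update
    adv = -u * (c - np.roll(c, 1)) / dx
    dif = D * (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
    c = c + dt * (adv + dif)
    c[0] = 0.0                   # clean carrier gas keeps entering
    outlet.append((t, c[-1]))    # detector signal at the column outlet
    t += dt

peak_time = max(outlet, key=lambda p: p[1])[0]
print(f"peak elutes at ~ {peak_time:.1f} s (ideal marker: L/u = {L/u:.1f} s)")
```

Adding partitioning and nonlinear adsorption terms to the update, as in the cases described above, would broaden and delay the simulated peak.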

Process simulation can guide and minimise experimental research, but not eliminate it. In fact, the calibration of models requires accurate experimental data: it is the experiment that proves the model, and not the opposite. Statistical planning of experiments is nowadays to a large extent obsolete. Instead, experimental research should profit from the power of the rigorous models incorporated in simulation packages, particularly in the field of thermodynamics. For instance, simple vapour-liquid equilibrium (VLE) experiments in the laboratory can be used to increase the reliability of a feasibility study of innovative processes. Conversely, industrial VLE measurements can be used to calibrate the thermodynamic models incorporated in a simulator when experimental information is not available. [Pg.36]
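As a hedged illustration of calibrating a thermodynamic model against VLE data: the sketch below fits a one-parameter Margules activity-coefficient model to binary pressure-composition data by least squares. The data points, pure-component vapour pressures and the choice of model are assumptions made for the example, not taken from the text.

```python
# Sketch: calibrating a one-parameter Margules activity-coefficient model
# against binary VLE data by least squares. Data points, vapour pressures
# and the model choice are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical isothermal binary VLE data: liquid mole fraction x1 and
# measured total pressure P (kPa) at a fixed temperature.
x1_data = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
P_data = np.array([48.0, 62.0, 70.0, 74.0, 73.0])
P1_sat, P2_sat = 75.0, 40.0      # pure-component vapour pressures (kPa), assumed

def total_pressure(A, x1):
    """Modified Raoult's law with one-parameter Margules activity coefficients."""
    x2 = 1.0 - x1
    g1 = np.exp(A * x2**2)
    g2 = np.exp(A * x1**2)
    return x1 * g1 * P1_sat + x2 * g2 * P2_sat

def objective(A):
    return np.sum((total_pressure(A, x1_data) - P_data) ** 2)

res = minimize_scalar(objective, bounds=(-2.0, 2.0), method="bounded")
print(f"fitted Margules parameter A ~ {res.x:.3f}")
```

The same pattern, with a richer model and more data, is essentially what a simulator's regression tools do when binary interaction parameters are fitted to measured VLE.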

Just as much care is needed in interpreting the results of laboratory studies as of field studies. This is partly because of the difficulty of simulating field conditions, and partly because of a past lack of awareness of factors likely to have significant effects on phosphatase activity. Hopefully, the evidence reviewed here will provide a useful guide to what needs to be considered when planning experiments. A further problem is harder to overcome: the fact that many strains maintained in collec-... [Pg.232]

OC curves for standard acceptance-sampling plans are derived under the assumption that the quality of items can be modeled as independent and identically distributed (i.i.d.) Bernoulli random variables. Although this model is often plausible, the quality of items produced by some processes exhibits statistical dependence. The goal of this simulation experiment is to estimate the OC curve for sampling plan (10, 1) when item quality is dependent. [Pg.2471]
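A minimal sketch of such a simulation experiment: item quality is modeled as a two-state Markov chain with stationary fraction defective p and lag-one correlation rho (the specific form of dependence and the value of rho are assumptions, not given in the text), and the acceptance probability of plan (n=10, c=1) is estimated over many simulated lots.

```python
# Sketch: estimating by simulation the OC curve of single-sampling plan
# (n=10, c=1) when item quality follows a two-state Markov chain rather
# than i.i.d. Bernoulli trials. The dependence parameter rho is an assumption.
import numpy as np

def acceptance_prob(p, rho, n=10, c=1, n_lots=20_000, seed=42):
    """Estimated probability of accepting a lot with long-run fraction
    defective p, when successive items are correlated with parameter rho."""
    rng = np.random.default_rng(seed)
    # Transition probabilities giving stationary P(defective) = p and
    # lag-one correlation rho (0 <= rho < 1).
    p_dd = p + rho * (1 - p)      # P(defective | previous item defective)
    p_gd = p * (1 - rho)          # P(defective | previous item good)
    accepted = 0
    for _ in range(n_lots):
        defective = rng.random() < p             # first sampled item
        count = int(defective)
        for _ in range(n - 1):
            prob = p_dd if defective else p_gd
            defective = rng.random() < prob
            count += int(defective)
        if count <= c:
            accepted += 1
    return accepted / n_lots

for p in (0.02, 0.05, 0.10, 0.20, 0.30):
    print(f"p={p:.2f}  Pa(iid)~{acceptance_prob(p, 0.0):.3f}  "
          f"Pa(rho=0.5)~{acceptance_prob(p, 0.5):.3f}")
```

Plotting the estimated acceptance probabilities against p gives the simulated OC curve, which can then be compared with the standard i.i.d. curve (rho = 0).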

The installation of this system has reduced the time necessary for reactor inspections and eliminated the need for manned access during these inspections. The computer model can be updated as required with experience, or applied to another type of reactor or to other areas of the reactor. When linked with the proposed teach-and-repeat system (early 1985), the use of the simulation option of the display system for the pre-planning of future reactor inspections should further reduce inspection times. [Pg.368]

The Nimrod tools referred to above do not provide statistical analysis, but rather a comparison of model-based simulated data with experimental data. However, it is planned to introduce into the Nimrod computer grid a statistical form of analysis based on Bayesian probabilities, which will be applied to each FT AC experimental data set. In time, major improvements in the reporting of parameters such as EP, IP, a, R and CpL should emerge when the full power of e-science is routinely introduced into the analysis of voltammetric experiments. [Pg.36]
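To give a flavour of the kind of Bayesian comparison described (this is not the Nimrod implementation; the one-parameter model, noise level and grid are all assumptions), a grid posterior for a single model parameter can be computed from the mismatch between simulated and experimental traces:

```python
# Sketch: grid-based Bayesian estimate of a single model parameter by
# comparing simulated and "experimental" data. Model, parameter and noise
# level are illustrative assumptions.
import numpy as np

t = np.linspace(0.0, 1.0, 200)

def model(k, t):
    """Hypothetical one-parameter response, e.g. an exponential decay."""
    return np.exp(-k * t)

# Synthetic "experimental" data: true k = 3.0 plus Gaussian noise
rng = np.random.default_rng(0)
sigma = 0.02
y_exp = model(3.0, t) + rng.normal(0.0, sigma, size=t.size)

# Uniform prior over a grid of candidate parameter values
k_grid = np.linspace(0.5, 6.0, 551)
log_like = np.array([-0.5 * np.sum((y_exp - model(k, t))**2) / sigma**2
                     for k in k_grid])
post = np.exp(log_like - log_like.max())
post /= post.sum()                         # discrete posterior over the grid

k_mean = np.sum(k_grid * post)
print(f"posterior mean of k ~ {k_mean:.2f}")
```

The same scheme extends to several parameters at once, at the cost of a larger grid or a sampling method, which is where grid-computing resources become useful.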

In planning a field polymer flood, it would therefore not be necessary to carry out 2-D floods in order to reproduce the recovery mechanism if improved vertical sweep were the objective of the flood; numerical simulation would simply be used to investigate these mechanisms. However, it is interesting to study experimental results from scaled layered systems, since certain issues can be illustrated in particularly instructive ways. For example, the fluid dynamics of viscous slug breakdown, or the flow patterns during the placement of viscous polymer slugs, can be visualised very directly using such experiments (Sorbie et al., 1990). [Pg.287]
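As a toy illustration of the vertical-sweep mechanism referred to here (not the simulations of the cited work): piston-like displacement in two non-communicating layers of different permeability shows how a more viscous injected slug evens out the front positions. The permeabilities, oil viscosity and piston-displacement assumption are all hypothetical.

```python
# Sketch: piston-like displacement in two non-communicating layers, to show
# how a more viscous injected slug improves vertical sweep at breakthrough.
# Permeabilities, oil viscosity and the displacement model are illustrative.
def sweep_at_breakthrough(mu_inj, mu_oil=10.0, k=(1000.0, 100.0), dt=1e-4):
    x = [0.0, 0.0]                      # fractional front position in each layer
    while max(x) < 1.0:
        # Equal pressure drop over both layers; each layer's resistance is a
        # length-weighted average of injected-fluid and oil viscosity over k.
        q = [ki / (xi * mu_inj + (1.0 - xi) * mu_oil) for ki, xi in zip(k, x)]
        x = [min(xi + dt * qi, 1.0) for xi, qi in zip(x, q)]
    # Vertical sweep = average swept fraction when the fast layer breaks through
    return sum(x) / len(x)

for mu in (1.0, 5.0, 20.0):             # water flood vs. increasingly viscous slugs
    print(f"injected viscosity {mu:4.1f} cP -> sweep at breakthrough "
          f"{sweep_at_breakthrough(mu):.2f}")
```

Raising the injected viscosity slows the front in the high-permeability layer as it fills with viscous fluid, so the low-permeability layer catches up and the sweep at breakthrough improves.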

