Big Chemical Encyclopedia


Complexes computer modelling

Complex computer models are now available to assist in defining the optimum anode distribution. [Pg.157]

The Workbook is intended to be self-sufficient for sizing calculations for the more straightforward applications. The emphasis of the Workbook is on the use of simple (yet adequate) equations, suitable for solving with a pocket calculator, rather than on more complex computer models. [Pg.5]

An experiment involving a complex computer model or code may have tens or even hundreds of input variables and, hence, the identification of the more important variables (screening) is often crucial. Methods are described for decomposing a complex input-output relationship into effects. Effects are more easily understood because each is due to only one or a small number of input variables. They can be assessed for importance either visually or via a functional analysis of variance. Effects are estimated from flexible approximations to the input-output relationships of the computer model. This allows complex nonlinear and interaction relationships to be identified. The methodology is demonstrated on a computer model of the relationship between environmental policy and the world economy. [Pg.308]
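The screening idea described above can be sketched with a variance-based (Sobol-type) main-effect estimate: the fraction of output variance attributable to each input alone. The toy model and all numbers below are hypothetical stand-ins for a complex computer code, and the pick-freeze estimator shown is one standard choice, not necessarily the method of the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Hypothetical stand-in for a complex computer model:
    # one strong input, one weak input, and one inert input.
    return np.sin(x[:, 0]) + 0.1 * x[:, 1] ** 2

n, d = 200_000, 3
a = rng.uniform(-np.pi, np.pi, size=(n, d))
b = rng.uniform(-np.pi, np.pi, size=(n, d))
ya = model(a)
f0, var_y = ya.mean(), ya.var()

# First-order Sobol index S_i: share of output variance explained by
# input i alone, estimated with the classic pick-freeze trick of
# re-evaluating the model with every input resampled except input i.
s = []
for i in range(d):
    c = b.copy()
    c[:, i] = a[:, i]          # "freeze" input i, resample the rest
    s.append((np.mean(ya * model(c)) - f0 ** 2) / var_y)

print([round(v, 2) for v in s])   # input 0 dominates, input 2 is inert
```

For an additive model like this one the indices sum to roughly 1, so a screening study would keep inputs 0 and 1 and drop input 2 from further analysis.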

Key words Complexity, Computational models, In vitro models, Pathways of toxicity, Pharmacodynamics, Pharmacokinetics, Prediction, Omics data integration, Quantitative extrapolation, Toxicity, Variability... [Pg.531]

In parallel with their own research programs, the manufacturers, through the FPP, also jointly funded research to study the atmospheric chemistry of CFCs in order to assess the extent of any risk they might pose. Independent research workers at universities and research institutes worldwide were contracted to measure the rates of reactions, which were essential input data for the complex computer models needed to predict the rate of ozone depletion. This value could not be measured directly in the 1970s because the large daily and seasonal fluctuations in stratospheric ozone concentrations swamped the modest depletion expected from CFCs. [Pg.466]

There have been many debates concerning how much carbon dioxide is needed to cause a measurable change in the temperature of the earth. Complex computer models have been used to calculate the... [Pg.11]

Aside from other fundamental climate cycles stretching over thousands or tens of thousands of years (such as ice ages, believed to be caused in part by changes in sunspots and therefore beyond man's ability to influence), Earth's climate has been reasonably stable for 10,000 years or so. But this equilibrium is being upset by man-made carbon emissions. The question is how much. Opinions, basic assumptions about the future course of the climate and the amount of expected heat increase, closely related assumptions about global economic development, and faith in the complex computer models that attempt to forecast climate developments vary widely even among the majority of experts who believe that our planet is facing an unprecedented crisis.9... [Pg.4]

Predictions of climatic change have been made using complex computer models, which incorporate many assumptions. The future situation is still by no means clear. [Pg.414]

There are, however, constraints that can be put on the inverse problem to make it somewhat more tractable. Estimates of tissue conductivity, knowledge of how currents flow in biological tissues, and knowledge of the normal biocurrent time course can simplify the problem. Complex computer models have been developed to derive medical diagnostic and other electrophysiological data from maps of surface potentials. These are still the subject of research. [Pg.406]

More complex computational models using Monte Carlo methods have attempted to predict bubble size distributions for a combination of breakup and coalescence. These models typically treat bubble coalescence by analogy with the kinetic theory where bubbles are assumed to act as solid particles [18,19]. They use a binary collision rate (probability) and a collision efficiency factor to account for collisions that do not lead to coalescence. Since collision is assumed to be a random process in these models, turbulence of the same scale as the bubbles or smaller would increase collisions and, therefore, also increase the coalescence rate. [Pg.407]
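The collision-and-efficiency mechanism described above can be illustrated with a minimal Monte Carlo step. This is a sketch under simplifying assumptions (random pairing, a fixed per-pair collision probability, and a fixed efficiency factor), not an implementation of the cited models [18,19], and the numeric values are illustrative.

```python
import random

random.seed(1)

def coalescence_step(diameters, p_collision=0.3, efficiency=0.5):
    # Bubbles are treated like colliding particles: a random pair
    # collides with probability p_collision, and a collision leads to
    # coalescence with probability `efficiency` (both values illustrative).
    random.shuffle(diameters)
    out = []
    it = iter(diameters)
    for d1 in it:
        d2 = next(it, None)
        if d2 is None:
            out.append(d1)                      # odd bubble out
        elif random.random() < p_collision * efficiency:
            # volume-conserving merge: diameters combine as d1^3 + d2^3
            out.append((d1 ** 3 + d2 ** 3) ** (1 / 3))
        else:
            out.extend([d1, d2])
    return out

bubbles = [1.0] * 1000          # monodisperse initial population
for _ in range(10):
    bubbles = coalescence_step(bubbles)
print(len(bubbles), max(bubbles))   # fewer, larger bubbles over time
```

Repeated steps shift the size distribution toward fewer, larger bubbles, which is the qualitative behaviour such models aim to balance against breakup.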

Evidence synthesis is a term used for the synthesis of results from diverse sources and covers a wide range of analysis approaches (Sutton and Abrams, 2001). Bayesian Evidence Synthesis (here denoted BES) is a statistical framework for explicitly modeling several related and connected sources of data, in which uncertainty in model parameters is incorporated (Jackson et al., 2013). BES can be seen as a complex meta-analysis (Sutton and Abrams, 2001), where complex means considering multiple effects from an intervention. Classical meta-analyses are usually based on studies that have directly observed the effect of an intervention. A broader view of meta-analysis allows for studies of effects at a lower level, which are combined with quantitative modelling to assess the effect of an intervention at a higher level. In this view, a risk assessment can be seen as a meta-analysis (Linkov et al., 2009). Allowing a quantitative assessment (or complex computer) model to measure effects makes it possible to synthesize evidence for effects that are difficult, if not impossible, to observe empirically. In the PVA example, there is for example no possibility of... [Pg.1593]

A BES is formulated as a cost-efficiency problem that uses a quantitative system model to evaluate chosen measures of cost-efficiency. Based on the state of knowledge of the parameters, the quantitative system model is simulated to produce a distribution for the model outputs. The system model depends on one or several parameters, which relate to phenomena in the real world (Fig. 3). In order to avoid confusion with other types of models in the framework, we refer to the quantitative system model (or complex computer model) as a simulator. The parameters of the real world may or may not be observed directly. However, their influence on what has been, or can be, observed is modelled. In this way, empirical observations contribute to evidence. A strong link between the quantitative system model and the models for informing the parameters makes BES a methodological framework that, in a transparent and defensible way, enables conclusions to be drawn from the available evidence to support decision-making. [Pg.1594]
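The simulation loop described above, drawing parameters from their state-of-knowledge distributions and pushing each draw through the simulator to obtain an output distribution, can be sketched as follows. The population-growth simulator, its parameter names, and all distributions are hypothetical placeholders (loosely echoing the PVA setting), not the framework's actual model.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulator(growth_rate, mortality, years=20, n0=100.0):
    # Hypothetical "system model": deterministic population projection
    # given one draw of the uncertain parameters.
    n = n0
    for _ in range(years):
        n *= 1.0 + growth_rate - mortality
    return n

draws = 10_000
# State-of-knowledge distributions for the parameters (illustrative).
growth = rng.normal(0.05, 0.01, draws)
mortality = rng.normal(0.03, 0.01, draws)

# Propagate parameter uncertainty through the simulator.
outputs = np.array([simulator(g, m) for g, m in zip(growth, mortality)])

# Decision-relevant summaries of the output distribution,
# e.g. the risk of the population declining below its starting size.
print(np.median(outputs), np.mean(outputs < 100.0))
```

The output is a distribution rather than a point value, so decision measures (here, a decline probability) carry the parameter uncertainty through to the conclusion.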

A detailed treatment of the axial power distribution, local heat transfer, two-phase mixture dynamics, and coupling with the rest of the reactor coolant system requires the use of complex computer models. Figure 3.2-1 compares the predictions based on Eq. (3.2-1) with code calculations for a Zion station blackout scenario compounded by failure of turbine-driven auxiliary feedwater (the so-called TMLB scenario). As indicated by the comparison, the exponentially decreasing function defined by Equations 3.2-1 and 3.3-2 is a reasonable approximation for the water level in the core region during this stage of the accident. [Pg.304]







