
Historical simulation

Building on its historical simulation environment Aspen, AspenTech offers an operations manager suite that builds on software investments already present in a company [105]. The difference from conventional tools is real-time performance management, which gives the user direct real-time access to production data obtained from a local process control system. Keywords are scheduling, logistics, capacity utilization, and sales. [Pg.574]

The Severe Nuclear Accident Program (SNAP) model runs with input from the available operational HIRLAM. At present, the version runs at 20 km horizontal resolution and provides a sufficiently large domain for SNAP. The most important meteorological inputs are the 3D wind, precipitation, and temperature fields. The time resolution of the meteorological input is 3 h in operational applications and 1 h in historical simulations. [Pg.148]

Historically, simulations using the microcanonical ensemble were among the earliest ones reported. The algorithm is easy to implement both conceptually... [Pg.92]

The simulation is performed off-line in BEMFlow. Afterwards, the simulation results are visualized with the help of BEMView, which animates the flow conditions in 3D (Fig. 1.20). The simulations have been run with different screw speeds in order to compare the respective flow conditions. The simulation expert, who took advantage of historical simulation data stored in TRAMP, provides his own contributions by recording animations as video clips, categorizing and annotating them, and storing the annotated videos in the TRAMP database. [Pg.54]

The historical simulation approach uses the historical distribution of returns from the instruments in a portfolio to simulate the portfolio's VaR. VaR is always defined for a certain probability α and time horizon h. Alternatively, we could refer to the 1 - α quantile (or confidence level) of the loss distribution. For instance, we could say that for a particular portfolio the one-day (time horizon) 5% (α) probability VaR is 100,000, or that the one-day 95% (1 - α) confidence level VaR is 100,000, which would mean that there is a 5% chance that the portfolio will lose 100,000 in one day. [Pg.790]
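As a minimal sketch of this calculation (the NumPy-based function, the variable names, and the simulated return series are illustrative, not from the source):

```python
import numpy as np

def historical_var(returns, alpha=0.05, portfolio_value=1.0):
    """One-period historical-simulation VaR at probability alpha.

    returns: array of historical one-period portfolio returns.
    The loss is the negative return; VaR is the (1 - alpha) quantile
    of the empirical loss distribution, scaled by portfolio value.
    """
    losses = -np.asarray(returns) * portfolio_value
    return np.quantile(losses, 1.0 - alpha)

# Illustration: 500 days of simulated daily returns.
rng = np.random.default_rng(0)
daily_returns = rng.normal(0.0, 0.01, size=500)
var_95 = historical_var(daily_returns, alpha=0.05, portfolio_value=2_000_000)
print(f"One-day 95% VaR: {var_95:,.0f}")  # roughly 33,000 for these inputs
```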

For calculating tracking error, historical simulations can be used by considering a position that is long the portfolio and short the index. The difference in returns between the portfolio and the index is therefore the variable for which the VaR is calculated. To get a tracking error... [Pg.790]
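Reusing the historical_var sketch above, tracking-error VaR amounts to applying the same quantile calculation to the return differences (the return series here are again invented for illustration):

```python
import numpy as np

# historical_var as defined in the previous sketch.
rng = np.random.default_rng(1)
portfolio_returns = rng.normal(0.0004, 0.011, size=500)  # illustrative data
index_returns = rng.normal(0.0003, 0.010, size=500)

# Long portfolio, short index: the P&L driver is the active return.
active_returns = portfolio_returns - index_returns
tracking_var_95 = historical_var(active_returns, alpha=0.05,
                                 portfolio_value=2_000_000)
```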

To conclude, historical simulation with all its variants is another method that relies on past data to predict the future. It has problems coping with complex instruments, with instruments that have no history, and with cases where the number of observations is limited. In the next section we look at a method that uses numerous computer simulations to overcome this: Monte Carlo simulation. [Pg.794]

The approach is similar to the historical simulation method, except that it creates the hypothetical changes in prices by random draws from a stochastic process. It consists of simulating various outcomes of a state variable (or more than one in the case of multifactor models), whose distribution has to be assumed, and pricing the portfolio with each of the results. A state variable is the factor underlying the price of the asset that we want to estimate. It could be specified as a macroeconomic variable, the short-term interest rate, or the stock price, depending on the economic problem. [Pg.794]
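A minimal sketch of the one-factor case, assuming for illustration that the state variable is a stock price following geometric Brownian motion (the drift, volatility, and one-share "portfolio" are stand-in assumptions, not from the source):

```python
import numpy as np

def monte_carlo_var(s0, mu, sigma, horizon_days, n_paths,
                    alpha=0.05, seed=1):
    """One-factor Monte Carlo VaR: draw terminal values of the state
    variable from an assumed GBM distribution, reprice the position,
    and take the (1 - alpha) quantile of the loss distribution."""
    rng = np.random.default_rng(seed)
    dt = horizon_days / 252.0
    z = rng.standard_normal(n_paths)
    s_t = s0 * np.exp((mu - 0.5 * sigma**2) * dt
                      + sigma * np.sqrt(dt) * z)
    # Here the "portfolio" is a single share; in practice each draw of
    # the state variable is fed through a full pricing model.
    losses = s0 - s_t
    return np.quantile(losses, 1.0 - alpha)

var_1d_95 = monte_carlo_var(s0=100.0, mu=0.05, sigma=0.2,
                            horizon_days=1, n_paths=100_000)
```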

Major banks have traditionally used historical simulations to assess firmwide portfolio risk, and relied heavily on empirical data and historical distributions. Monte Carlo simulations, despite being computation-... [Pg.801]

However, both methods suffer from the same assumption: whether you utilize a historical return distribution (historical simulation) or choose to model an arbitrary distribution (Monte Carlo), depending on such a distribution prevailing in the future can be dangerous, if not disastrous (WorldCom, Enron, need we go on?). The resolution lies in analytics that make no such assumptions and are flexible in modelling various scenarios or outcomes. [Pg.802]

Pritsker, M. (2006). The hidden dangers of historical simulation. Journal of Banking & Finance, 30(2), 561-582. [Pg.274]

Computer simulations also constitute an important basis for the development of the molecular theory of fluids. They can be regarded as quasiexperimental procedures to obtain datasets that connect the fluid's microscopic parameters (related mainly to the structure of the system and the molecular interactions) to its macroscopic properties (such as the equation of state, dynamic coefficients, etc.). In particular, some of the first historical simulations were performed using two-dimensional fluids to test adaptations of the commonly used computer simulation methods [14,22]: Monte Carlo (MC) and molecular dynamics (MD). In fact, the first reliable simulation results were obtained by Metropolis et al. [315], who applied the MC method to the study of hard-sphere and hard-disk fluids. [Pg.495]
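For illustration only, a minimal sketch of Metropolis-style MC for a two-dimensional hard-disk fluid; with a hard-core potential the acceptance rule reduces to rejecting any move that creates an overlap (the box setup, unit disk diameter, and step size are illustrative choices, not from the source):

```python
import numpy as np

def hard_disk_mc(n=64, density=0.5, sweeps=200, max_step=0.1, seed=0):
    """Metropolis Monte Carlo for hard disks of unit diameter in a
    periodic square box. The Boltzmann factor is 0 (overlap) or 1
    (no overlap), so a trial move is accepted iff it creates no overlap."""
    rng = np.random.default_rng(seed)
    box = np.sqrt(n / density)              # side length of the periodic box
    side = int(np.ceil(np.sqrt(n)))
    # Start from a square lattice to guarantee no initial overlaps.
    pos = np.array([[(i % side) + 0.5, (i // side) + 0.5]
                    for i in range(n)]) * (box / side)

    def overlaps(i, trial):
        d = pos - trial
        d -= box * np.round(d / box)        # minimum-image convention
        r2 = np.sum(d * d, axis=1)
        r2[i] = np.inf                      # ignore the moved disk itself
        return np.any(r2 < 1.0)            # overlap if separation < diameter

    accepted = 0
    for _ in range(sweeps * n):
        i = rng.integers(n)
        trial = (pos[i] + rng.uniform(-max_step, max_step, 2)) % box
        if not overlaps(i, trial):
            pos[i] = trial
            accepted += 1
    return pos, accepted / (sweeps * n)     # configuration, acceptance ratio
```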

It is important to know how much each well produces or injects in order to identify productivity or injectivity changes in the wells, the cause of which may then be investigated. Also, for reservoir management purposes (Section 14.0) it is necessary to understand the distribution of volumes of fluids produced from and injected into the field. These data are input to the reservoir simulation model and are used to check whether the actual performance agrees with the prediction, and to update the historical data in the model. Where actual and predicted results do not agree, an explanation is sought, which may lead to an adjustment of the model (e.g., redefining pressure boundaries or volumes of fluid in place). [Pg.221]

All these observations tend to favour the Verlet algorithm in one form or another, and we look closely at this in the following sections. For historical reasons only, we mention the more general class of predictor-corrector methods, which have been optimized for classical mechanics simulations [40, 41]; further details are available elsewhere [7, 42, 43]. [Pg.2250]
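As a reference point, a minimal sketch of the widely used velocity form of the Verlet algorithm (the harmonic-oscillator force and the parameters below are illustrative, not from the source):

```python
def velocity_verlet(x, v, force, mass, dt, steps):
    """Velocity-Verlet integration for one degree of freedom: each step
    needs only one new force evaluation, reused in the next step."""
    f = force(x)
    for _ in range(steps):
        x = x + v * dt + 0.5 * (f / mass) * dt * dt
        f_new = force(x)
        v = v + 0.5 * (f + f_new) / mass * dt
        f = f_new
    return x, v

# Harmonic oscillator with k = m = 1: energy is conserved to O(dt^2).
x, v = velocity_verlet(x=1.0, v=0.0, force=lambda x: -x,
                       mass=1.0, dt=0.01, steps=1000)
```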

Historically, sequential-modular simulators were developed first, primarily in industry, and they continue to be widely used. In terms of unit operations, each module can be made as simple or complex as needed, and new modules can be added as needed. Equation-oriented simulators, on the other hand, are able to handle arbitrary specifications and limitations for the entire process flow sheet more flexibly and conveniently than sequential-modular simulators, and process optimization can also be carried out with less computer effort. [Pg.74]
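A toy sketch of the sequential-modular idea: unit modules are called in a fixed sequence, and a torn recycle stream is converged by direct substitution (the flowsheet, the unit models, and the tolerance are invented for illustration):

```python
def mixer(feed, recycle):
    return feed + recycle

def reactor(inlet, conversion=0.6):
    return inlet * (1.0 - conversion)       # unreacted material leaves

def splitter(stream, recycle_fraction=0.5):
    return stream * recycle_fraction, stream * (1.0 - recycle_fraction)

# Tear the recycle stream and call the modules in sequence until the
# guessed and computed recycle flows agree (direct substitution).
feed, recycle = 100.0, 0.0
for iteration in range(100):
    outlet = reactor(mixer(feed, recycle))
    new_recycle, product = splitter(outlet)
    if abs(new_recycle - recycle) < 1e-8:
        break
    recycle = new_recycle
```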

Historically, azeotropic distillation processes were developed on an individual basis, using experimentation to guide the design. The use of residue curve maps as a vehicle to explain the behavior of entire sequences of heterogeneous azeotropic distillation columns, as well as the individual columns that make up the sequence, provides a unifying framework for design. This process can be applied rapidly and produces an excellent starting point for detailed simulations and experiments. [Pg.190]

The model contains a surface energy method for parameterizing winds and turbulence near the ground. Its chemical database library has physical properties (seven types, three temperature dependent) for 190 chemical compounds obtained from the DIPPR database. Physical property data for any of the over 900 chemicals in DIPPR can be incorporated into the model as needed. The model computes hazard zones and related health consequences. An option is provided to account for the accident frequency and chemical release probability from transportation of hazardous material containers. When coupled with preprocessed historical meteorology and population densities, it provides quantitative risk estimates. The model is not capable of simulating dense-gas behavior. [Pg.350]

Unfortunately, the address of the gateway in the control computer used for the data transfer was the same as that used to connect to the distributed control system (DCS). As a result, data flowed from the simulator through the control computer to the DCS and replaced the current input data with historical data. Some conditions on the plant started to change, but fortunately this was soon noticed by alert operators, and the plant was brought back under control. [Pg.362]

While CAM-6 is somewhat limited in its ability to perform large-scale simulations of physical systems (it is a much less capable system than its follow-on, the CAM-8; see the discussion below), its fundamental historical importance cannot be overstated. CAM-6 allowed researchers to directly experience, for the first time and in real time, the evolution of CA systems theretofore understood only as purely conceptual models. Margolus and Toffoli recall that when Pomeau, one of... [Pg.713]

We review here results of computer simulations of monolayers, with an emphasis on models that include significant molecular detail in the surfactant molecule. We start, with a historical focus, on hydrocarbon chains and simple head groups (typically a COOH group in either the neutral or the ionized state). A less comprehensive review follows on simulations of surfactants of other types, with either nonhydrocarbon chains or different head groups. More detailed descriptions of the general simulation techniques discussed here are available in books dedicated to simulation techniques, for example, Allen and Tildesley [338] or Frenkel and Smit [339]. [Pg.118]

A. W. King, W. R. Emanuel, and W. M. Post, Projecting future concentrations of atmospheric CO2 with global carbon cycle models: the importance of simulating historical changes. Environmental Management 16:91 (1992). [Pg.138]

A rather crude, but nevertheless efficient and successful, approach is the bond fluctuation model with potentials constructed from atomistic input (Sect. 5). Despite the lattice structure, it has been demonstrated that a rather reasonable description of many static and dynamic properties of dense polymer melts (polyethylene, polycarbonate) can be obtained. If the effective potentials are known, the implementation of the simulation method is rather straightforward, and also the simulation data analysis presents no particular problems. Indeed, a wealth of results has already been obtained, as briefly reviewed in this section. However, even this conceptually rather simple approach of coarse-graining (which historically was also the first to be tried out among the methods described in this article) suffers from severe bottlenecks - the construction of the effective potential is neither unique nor easy, and still suffers from the important defect that it lacks an intermolecular part, thus allowing only simulations at a given constant density. [Pg.153]

Contain a stochastic climate generator capable of simulating daily precipitation and other weather parameters that are similar in amount and statistical variability to historical weather records for the site. [Pg.1064]
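One common construction for such a generator, sketched here under illustrative assumptions, couples a first-order Markov chain for wet/dry occurrence with a gamma distribution for wet-day amounts; in practice both would be fitted to the site's historical record (all parameter values below are placeholders):

```python
import numpy as np

def daily_precip(n_days, p_wet_given_dry=0.25, p_wet_given_wet=0.6,
                 gamma_shape=0.8, gamma_scale=8.0, seed=0):
    """Generate daily precipitation (mm): a first-order Markov chain
    decides wet/dry occurrence, and wet-day amounts are drawn from a
    gamma distribution. Parameters would be estimated from the site's
    historical weather records."""
    rng = np.random.default_rng(seed)
    precip = np.zeros(n_days)
    wet = False
    for day in range(n_days):
        p = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p
        if wet:
            precip[day] = rng.gamma(gamma_shape, gamma_scale)
    return precip

one_year = daily_precip(365)
```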

