Big Chemical Encyclopedia



Simulation reliable

In the above simulation, only production of CD4 molecules at the divertor plates is considered, although higher hydrocarbon molecules such as C2D are also produced by chemical sputtering at the actual divertor plates, and they can also contribute to the CD band emission [17]. The reason for the discrepancy between the observed CD band intensity and the simulated one might be that only CD4 production is considered in the simulation. Reliable data for all relevant hydrocarbon molecules are required for more accurate simulations. [Pg.128]

This paper presents the results of a study to investigate and establish the reliability of both the description and performance data estimates from numerical reservoir simulators. Using optimal control theory, an algorithm was developed to perform automated matching of field observed data and reservoir simulator calculated data, thereby estimating reservoir parameters such as permeability and porosity. Well known statistical and probability methods were then used to establish individual confidence limits as well as joint confidence regions for the parameter estimates and the simulator predicted performance data. The results indicated that some reservoir input data can be reliably estimated from numerical reservoir simulators. Reliability was found to be inversely related to the number of unknown parameters in the model and the level of measurement error in the matched field observed data. [Pg.57]
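As a rough illustration of the workflow described above (not the authors' simulator or field data), the sketch below fits two stand-in parameters of a toy forward model to synthetic "observed" data and derives approximate individual confidence limits from the linearized covariance. The forward model, parameter names, noise level, and confidence formula are all assumptions for demonstration only.

```python
# Minimal sketch of automated history matching with confidence limits.
# The forward model, parameter names, and noise level are illustrative
# stand-ins, not the simulator or field data used in the paper.
import numpy as np
from scipy.optimize import least_squares

def forward_model(params, t):
    """Toy 'reservoir simulator': a pressure decline driven by two unknowns
    standing in for permeability- and porosity-like parameters."""
    k, phi = params
    return 3000.0 * np.exp(-k * t) + 500.0 * phi

# Synthetic "field observed" data (in practice: measured pressures or rates).
t_obs = np.linspace(0.0, 10.0, 25)
true_params = np.array([0.15, 2.0])
rng = np.random.default_rng(0)
p_obs = forward_model(true_params, t_obs) + rng.normal(0.0, 20.0, t_obs.size)

# Automated matching: minimize the misfit between observed and simulated data.
res = least_squares(lambda p: forward_model(p, t_obs) - p_obs, x0=[0.1, 1.0])

# Individual confidence limits from the linearized covariance matrix.
dof = t_obs.size - res.x.size
s2 = np.sum(res.fun**2) / dof                   # residual variance
cov = s2 * np.linalg.inv(res.jac.T @ res.jac)   # approximate parameter covariance
stderr = np.sqrt(np.diag(cov))
for name, est, se in zip(["k", "phi"], res.x, stderr):
    print(f"{name} = {est:.3f} +/- {1.96 * se:.3f} (approx. 95% limit)")
```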

Li, Y.-F., Zio, E. & Lin, Y.-H. 2012. A Multistate Physics Model of Component Degradation Based on Stochastic Petri Nets and Simulation. IEEE Transactions on Reliability, 61, 921-931. [Pg.782]

Kleyner, A. & Sandborn, P. 2004. A warranty forecasting model based on piecewise statistical distribution and stochastic simulation. Reliability Engineering and System Safety, 88(3), 207-214. [Pg.1876]

In many cases faults will only restrict fluid flow, or they may be open, i.e. non-sealing. Despite considerable efforts to predict fault sealing potential, a reliable method to do so has not yet emerged. Fault seal modelling is further complicated by the fact that some faults may leak fluids or pressures at a very small rate, thus effectively acting as a seal on a production time scale of only a couple of years. As a result, the simulation of reservoir behaviour in densely faulted fields is difficult, and predictions should be regarded as crude approximations only. [Pg.84]

The amount of detail input and the type of simulation model depend upon the issues to be investigated and the amount of data available. At the exploration and appraisal stage it would be unusual to create a simulation model, since the lack of data makes simpler methods cheaper and just as reliable. Simulation models are typically constructed at the field development planning stage of field life, and are continually updated and refined in detail as more information becomes available. [Pg.206]

The most reliable way of generating production profiles, and investigating the sensitivity to well location, perforation interval, surface facilities constraints, etc., is through reservoir simulation. [Pg.209]

A systematic comparison of two sets of data requires a numerical evaluation of their likeness. TOF-SARS and SARIS produce one- and two-dimensional data plots, respectively. Comparison of simulated and experimental data is accomplished by calculating a one- or two-dimensional reliability (R) factor [33], respectively, based on the R-factors developed for LEED [34]. The R-factor between the experimental and simulated data is minimized by means of a multiparameter simplex method [33]. [Pg.1812]
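The sketch below illustrates the general idea rather than the actual code of [33]: a normalized R-factor between an experimental and a simulated one-dimensional scan is minimized with a Nelder-Mead simplex. The particular R-factor definition, the toy "simulation", and the synthetic data are illustrative assumptions.

```python
# Sketch of an R-factor comparison between experimental and simulated 1-D data,
# minimized with a Nelder-Mead simplex, in the spirit of the passage above.
# The R-factor definition and the toy "simulation" are illustrative only.
import numpy as np
from scipy.optimize import minimize

angles = np.linspace(0.0, 90.0, 181)

def simulate(params):
    """Toy simulated angular scan; params stand in for structural parameters."""
    peak, width = params
    return np.exp(-0.5 * ((angles - peak) / width) ** 2)

# "Experimental" scan (normally read from the instrument).
rng = np.random.default_rng(1)
experiment = simulate([42.0, 6.0]) + rng.normal(0.0, 0.02, angles.size)

def r_factor(params):
    """Normalized sum-of-squares reliability factor (one common choice)."""
    sim = simulate(params)
    return np.sum((experiment - sim) ** 2) / np.sum(experiment ** 2)

result = minimize(r_factor, x0=[30.0, 10.0], method="Nelder-Mead")
print("best-fit parameters:", result.x, " R =", result.fun)
```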

Near critical points, special care must be taken, because the inequality ξ ≪ L (correlation length much smaller than the box size) will almost certainly not be satisfied; in addition, critical slowing down will be observed. In these circumstances a quantitative investigation of finite size effects and correlation times, with some consideration of the appropriate scaling laws, must be undertaken. Examples of this will be seen later; one of the most encouraging developments of recent years has been the establishment of reliable and systematic methods of studying critical phenomena by simulation. [Pg.2242]
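As a small illustration of the kind of quantitative check recommended here, the sketch below estimates the integrated autocorrelation time of a simulation observable's time series. The synthetic AR(1) test series and the simple truncation rule are illustrative choices, not a prescription for production analysis.

```python
# Sketch: estimate the integrated autocorrelation time of an observable's
# time series, one of the quantitative checks recommended near critical points.
# The synthetic series and the crude truncation rule are illustrative choices.
import numpy as np

def integrated_autocorr_time(x):
    x = np.asarray(x) - np.mean(x)
    n = x.size
    acf = np.correlate(x, x, mode="full")[n - 1:] / (np.var(x) * n)
    tau = 0.5
    for t in range(1, n):
        if acf[t] < 0.0:          # crude cutoff: stop at the first negative value
            break
        tau += acf[t]
    return tau

# Synthetic correlated series: an AR(1) process with a known correlation time.
rng = np.random.default_rng(2)
phi = 0.9
x = np.zeros(10_000)
for i in range(1, x.size):
    x[i] = phi * x[i - 1] + rng.normal()

print("estimated tau_int:       ", integrated_autocorr_time(x))
print("exact tau_int for AR(1): ", (1 + phi) / (2 * (1 - phi)))
```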

The examples discussed in this chapter show a strong synergy between fundamental physical chemistry and device processing methods. This is expected only to become richer as shrinking dimensions place ever more stringent demands on process reliability. Selecting key aspects of processes for fundamental study in simpler environments will not only enable finer control over processes, but also enable more sophisticated simulations that will reduce the cost and time required for process optimization. [Pg.2939]

As simulations reach into longer time spans, the inaccuracies of force fields become more apparent: on the one hand, properties based on free energies, which were never used for parametrization, are computed more accurately and discrepancies show up; on the other hand, longer simulations, particularly of proteins, reveal more subtle discrepancies that only appear after nanoseconds. Thus force fields are under constant revision as far as their parameters are concerned, and this process will continue. Unfortunately, the form of the potentials is hardly reconsidered, and the refinement leads to an increasing number of distinct atom types, a proliferating number of parameters, and a severe deterioration of transferability. The increased use of quantum mechanics to derive potentials will not really improve this situation: ab initio quantum mechanics is not reliable enough at the level of kT, and on-the-fly use of quantum methods to derive forces, as in the Car-Parrinello method, is not likely to be applicable to very large systems in the foreseeable future. [Pg.8]

Summarizing, from a mathematical point of view, both forward and backward analysis lead to the insight that long-term trajectory simulation should be avoided, even with symplectic discretizations. Rather, in the spirit of multiple shooting as opposed to single shooting (cf. Bulirsch [4, 18]), only short-term trajectories should be used to obtain reliable information. [Pg.101]

For long-term simulations, it turns out that reproducing the conservation properties of the underlying system is the most important requirement for ensuring reliable results. [Pg.399]
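A minimal illustration of this point, using a toy harmonic oscillator rather than any of the systems discussed here: the symplectic velocity Verlet scheme keeps the energy bounded over a long run, while explicit Euler lets it drift.

```python
# Sketch: energy behaviour of a symplectic integrator (velocity Verlet) versus
# explicit Euler for a harmonic oscillator, illustrating why reproducing
# conservation properties matters for long-term reliability. Toy system only.
import numpy as np

def energy(x, v):
    return 0.5 * v**2 + 0.5 * x**2            # unit mass and spring constant

def euler(x, v, dt):                           # not symplectic: energy drifts
    return x + dt * v, v - dt * x

def velocity_verlet(x, v, dt):                 # symplectic: energy stays bounded
    v_half = v - 0.5 * dt * x
    x_new = x + dt * v_half
    v_new = v_half - 0.5 * dt * x_new
    return x_new, v_new

def run(integrator, steps=100_000, dt=0.01):
    x, v = 1.0, 0.0
    for _ in range(steps):
        x, v = integrator(x, v, dt)
    return energy(x, v)

print("initial energy:             ", energy(1.0, 0.0))
print("explicit Euler, 1e5 steps:  ", run(euler))
print("velocity Verlet, 1e5 steps: ", run(velocity_verlet))
```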

Luckhurst G R, R A Stephens and R W Phippen 1990. Computer Simulation Studies of Anisotropic Systems XIX. Mesophases Formed by the Gay-Berne Model Mesogen. Liquid Crystals 8:451-464.
Luque F J, F Illas and M Orozco 1990. Comparative Study of the Molecular Electrostatic Potential Obtained from Different Wavefunctions - Reliability of the Semi-Empirical MNDO Wavefunction. Journal of Computational Chemistry 11:416-430. [Pg.268]

Polymer simulations can be mapped onto the Flory-Huggins lattice model. For this purpose, DPD can be considered an off-lattice version of the Flory-Huggins simulation. It uses a Flory-Huggins χ (chi) parameter. The best way to obtain χ is from vapor pressure data. Molecular modeling can also be used to determine χ, but it is less reliable. In order to run a simulation, a bead size for each bead type and a χ parameter for each pair of bead types must be known. [Pg.274]
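As a hedged sketch of how χ values feed into a DPD run, the snippet below uses the commonly quoted Groot-Warren style mapping a_ij ≈ a_ii + 3.27 χ at bead number density ρ = 3. Both constants depend on the density and parametrization conventions and should be checked against the DPD code actually in use; the bead types and χ values below are placeholders.

```python
# Sketch: mapping Flory-Huggins chi parameters onto DPD repulsion parameters.
# Uses the commonly quoted relation a_ij ~ a_ii + 3.27*chi at bead density
# rho = 3; treat both constants as assumptions to check against your DPD
# code's conventions. The bead types and chi values are placeholders.
A_LIKE = 25.0          # like-like repulsion often used for rho = 3
CHI_FACTOR = 3.27      # assumed slope of a_ij versus chi at rho = 3

def dpd_repulsion(chi):
    """Unlike-bead repulsion parameter from a Flory-Huggins chi value."""
    return A_LIKE + CHI_FACTOR * chi

# chi for each bead pair: ideally from vapor pressure (activity) data, as noted
# above; the values below are placeholders for demonstration.
chi_pairs = {("A", "A"): 0.0, ("B", "B"): 0.0, ("A", "B"): 0.8}

for pair, chi in chi_pairs.items():
    print(pair, "a_ij =", round(dpd_repulsion(chi), 2))
```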

Material properties can be further classified into fundamental properties and derived properties. Fundamental properties are a direct consequence of the molecular structure, such as van der Waals volume, cohesive energy, and heat capacity. Derived properties are not readily identified with a certain aspect of molecular structure. Glass transition temperature, density, solubility, and bulk modulus would be considered derived properties. The way in which fundamental properties are obtained from a simulation is often readily apparent. The way in which derived properties are computed is often an empirically determined combination of fundamental properties. Such empirical methods can give more erratic results, reliable for one class of compounds but not for another. [Pg.311]

The most obvious feature of these 13C chemical shifts is that the closer the carbon is to the electronegative chlorine, the more deshielded it is. Peak assignments will not always be this easy, but the correspondence with electronegativity is so pronounced that spectrum simulators are available that allow reliable prediction of chemical shifts from structural formulas. These simulators are based on arithmetic formulas that combine experimentally derived chemical shift increments for the various structural units within a molecule. [Pg.550]
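The additive-increment idea behind such simulators can be sketched as follows; the base shift and increment values below are illustrative placeholders, not a published parameter set.

```python
# Sketch of the additive-increment idea behind 13C chemical shift simulators:
# start from a base shift and add experimentally derived increments for each
# structural unit. The base value and increments below are illustrative
# placeholders, not a published parameter set.
BASE_SHIFT = -2.3                      # illustrative base value (ppm)

# Increment per substituent, keyed by how many bonds away it sits.
INCREMENTS = {
    ("Cl", "alpha"): 31.0,             # placeholder numbers for demonstration
    ("Cl", "beta"): 10.0,
    ("CH3", "alpha"): 9.0,
    ("CH3", "beta"): 9.4,
}

def predict_shift(substituents):
    """Sum the increments for the structural units around one carbon."""
    return BASE_SHIFT + sum(INCREMENTS[s] for s in substituents)

# Example: a carbon bearing one chlorine and one methyl directly (alpha),
# with another methyl one bond further away (beta).
print(predict_shift([("Cl", "alpha"), ("CH3", "alpha"), ("CH3", "beta")]))
```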

In general, the longer the simulation, the more reliable the calculated properties. You can also average the data from several simulations. [Pg.76]
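A minimal sketch of averaging a property over several independent simulations and reporting a standard error; the per-run values are placeholders.

```python
# Sketch: combine a property from several independent simulations into a mean
# with a standard error, as suggested above. The per-run values are placeholders.
import statistics

runs = [1.012, 0.987, 1.034, 0.995, 1.008]   # e.g. a density from five runs

mean = statistics.fmean(runs)
stderr = statistics.stdev(runs) / len(runs) ** 0.5
print(f"property = {mean:.3f} +/- {stderr:.3f} (standard error over {len(runs)} runs)")
```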

Production. When the simulation is equilibrated, the dynamic simulation is considered reliable. From this point on, the trajectory generated is stored for further analysis. Typical production runs take from several hundred picoseconds up to tens of nanoseconds (depending on the size of the system and the available computer power). [Pg.51]
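A generic sketch of this equilibration/production split follows: frames generated during equilibration are discarded, and only production frames are stored for later analysis. The step() function and the step counts are placeholders, not the API of any particular MD package.

```python
# Sketch of the equilibration/production split described above. The step()
# function and counts are generic placeholders, not a specific MD package's API.
import random

def step(state):
    """Stand-in for one MD integration step returning the new state."""
    return state + random.gauss(0.0, 0.01)

state = 0.0
equilibration_steps = 5_000
production_steps = 20_000
save_every = 100

trajectory = []                       # only production frames are kept
for i in range(equilibration_steps + production_steps):
    state = step(state)
    in_production = i >= equilibration_steps
    if in_production and (i - equilibration_steps) % save_every == 0:
        trajectory.append(state)      # store this frame for later analysis

print(f"stored {len(trajectory)} production frames")
```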

