Big Chemical Encyclopedia


Design Code assumptions

P(4) Testing may also be undertaken where the rules for design by calculation given in this EUROCOMP Design Code would lead to uneconomic results. However, the conservative assumptions in the specified calculation models, which are intended to account for unfavourable influences not explicitly considered in the models, shall not be bypassed. [Pg.230]

A proper surface treatment is probably the single most important factor in ensuring the reliability and durability of a bond. The importance of the surface treatment prior to bonding therefore cannot be overemphasised. One of the basic design assumptions is that no adhesive failure is encountered. To guarantee that this assumption is valid on all occasions, all bond surfaces shall be treated according to the EUROCOMP Design Code. [Pg.458]

In the design procedure, the lap length is determined on the assumption that it has no effect on the magnitude of the maximum shear stress when (Pc)/t > 25. Figure 5.37 of the EUROCOMP Design Code has been constructed in accordance with this limit value. [Pg.478]
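As an illustration of this kind of geometric limit check, here is a minimal sketch; the function name, the example values and the reading of the ratio as lap length over adherend thickness are assumptions made for illustration, not taken from the EUROCOMP Design Code:

def lap_length_beyond_limit(lap_length, thickness, limit=25.0):
    """Return True when the lap-length-to-thickness ratio exceeds the limit
    value quoted above, i.e. the regime in which the maximum shear stress is
    assumed to be independent of the lap length. Inputs in consistent units."""
    return lap_length / thickness > limit

# Example: a 120 mm lap on a 4 mm laminate gives a ratio of 30 (> 25).
print(lap_length_beyond_limit(120.0, 4.0))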

Design criteria, along with their intents and assumptions, shall be clearly written to document the basis for the overall design of the incineration system. The design codes and standards used for the systems and components shall be clearly documented and shall include addenda as applicable. [Pg.53]

The Pickering A Risk Assessment (PARA) (Ontario Hydro, 1995) is also a level 3 PSA, for one of the four units at Pickering. A difference between PARA and DPSE is that sequences beyond the design basis were modeled using the MAAP-CANDU codes with best-estimate assumptions. Other parts of the analysis used licensing-type conservative assumptions. [Pg.406]

The worst operating condition in common design practice consists of overly conservative assumptions on the hot-channel input. These assumptions must be realistically evaluated in a subchannel analysis with the help of in-core instrumentation measurements. In the early subchannel analysis codes, the core inlet flow conditions and the axial power distribution were preselected off-line, and the most conservative values were used as inputs to the code calculations. In more recent, improved codes, the operating margin is calculated on-line, and the hot-channel power distributions are calculated by using ex-core neutron detector signals for core control. Thus the state parameters (e.g., core power, core inlet temper-... [Pg.431]
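To make the contrast between the early off-line treatment and the on-line margin evaluation concrete, here is a purely hypothetical minimal sketch; the function names, the crude peaking synthesis from detector signals and all numerical values are assumptions for illustration and do not come from any actual subchannel code:

def hot_channel_peaking(excore_signals):
    """Crude stand-in for synthesising a hot-channel power peaking factor
    from ex-core neutron detector signals (maximum normalised signal)."""
    mean = sum(excore_signals) / len(excore_signals)
    return max(excore_signals) / mean

def operating_margin(limit_peaking, peaking):
    """Margin to the assumed thermal limit, as a fraction of the limit."""
    return (limit_peaking - peaking) / limit_peaking

# Early practice: a preselected, conservatively high hot-channel peaking.
margin_conservative = operating_margin(limit_peaking=2.5, peaking=2.3)

# Improved practice: peaking synthesised on-line from measured signals.
margin_online = operating_margin(2.5, hot_channel_peaking([0.95, 1.20, 1.05, 0.90]))

print(margin_conservative, margin_online)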

The manner in which toxicological knowledge must work together with knowledge of human behavior, fire dynamics, and chemistry to produce an acceptable level of fire safety is proposed. A hypothetical example illustrates what must be done with adequate accuracy in order to design fire safety to a performance code. The example may give the impression that this can already be done. In fact, each computer code used contains dozens of assumptions, some very crude, so that the accuracy of present predictions is unacceptably low. [Pg.67]

One approach to determine the reliability of geochemical codes is to take well-defined input data and compare the output from several different codes. For comparison of speciation results, Nordstrom et al. (1979) compiled a seawater test case and a river-water test case, i.e., seawater and river-water analyses that were used as input to 14 different codes. The results were compared and contrasted, demonstrating that the thermodynamic databases, the number of ion pairs and complexes, the form of the activity coefficients, the assumptions made for redox species, and the assumptions made for equilibrium solubilities of mineral phases were prominent factors in the results. Additional arsenic, selenium, and uranium redox test cases were designed for testing of... [Pg.2318]
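The benchmarking approach described above can be sketched in a few lines: the same well-defined analysis is fed to several codes and a computed quantity is compared across them. Everything below (the code names, the results and the numbers) is an illustrative placeholder, not output from real geochemical codes:

# One well-defined input analysis (placeholder values, mol/kg).
seawater_test_case = {"Ca": 0.0103, "Mg": 0.0528, "SO4": 0.0282, "pH": 8.22}

# In practice each entry would come from running the test case through a
# different speciation code; here they are simply placeholder numbers.
free_ca_activity = {"code_A": 2.1e-3, "code_B": 2.4e-3, "code_C": 1.8e-3}

mean = sum(free_ca_activity.values()) / len(free_ca_activity)
spread = (max(free_ca_activity.values()) - min(free_ca_activity.values())) / mean
print(f"Relative spread across codes: {spread:.1%}")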

The code FORMOSA-P has been developed over a number of years at North Carolina State University [2-4] for the purpose of automating the process of determining the family of near-optimum fuel and BP LPs, while taking into account, with a minimum of assumptions, the complexities of the reload design problem. FORMOSA-P couples the stochastic optimization technique of Simulated Annealing (SA) [5] with a computationally efficient neutronics solver based on second-order accurate, nodal generalized perturbation theory (GPT) [6-7] for evaluating core physics characteristics over the cycle. [Pg.207]
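For readers unfamiliar with the optimisation technique, the following is a generic simulated-annealing loop of the kind FORMOSA-P couples to its neutronics solver. It is a minimal sketch only, with assumed placeholder functions for the loading-pattern perturbation and the core-physics objective; it is not the actual FORMOSA-P implementation:

import math
import random

def simulated_annealing(initial_lp, objective, perturb,
                        t_start=1.0, t_end=1e-3, cooling=0.95, steps_per_t=50):
    """Minimise objective(lp) by randomly perturbing the loading pattern and
    accepting worse candidates with a temperature-dependent probability."""
    current, current_obj = initial_lp, objective(initial_lp)
    best, best_obj = current, current_obj
    temperature = t_start
    while temperature > t_end:
        for _ in range(steps_per_t):
            candidate = perturb(current)
            candidate_obj = objective(candidate)   # e.g. a core-physics evaluation
            accept = (candidate_obj < current_obj or
                      random.random() < math.exp((current_obj - candidate_obj) / temperature))
            if accept:
                current, current_obj = candidate, candidate_obj
                if candidate_obj < best_obj:
                    best, best_obj = candidate, candidate_obj
        temperature *= cooling                     # geometric cooling schedule
    return best, best_obj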

In the latter part of the last section we silently made the assumption that the repulsion integrals will be computed and stored rather than computed as they are needed. This assumption will be continued in what follows, but in designing any codes it should be kept in mind that the other option may be more appropriate in different circumstances, and the codes should be flexible enough to accommodate both. The only way to ensure this is to push the details of the acquisition of the repulsion integrals out of sight into a primitive segment which will have different implementations for the two basic possibilities: "compute and store" or "compute on demand". [Pg.82]
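A minimal sketch of this "primitive segment" idea, assuming a Python-style interface; the class names, the get signature and the compute_fn placeholder are illustrative assumptions, not an actual quantum-chemistry code:

class StoredIntegrals:
    """Compute every repulsion integral (ij|kl) once and keep it in memory
    (permutational symmetry is ignored here for brevity)."""
    def __init__(self, compute_fn, n_basis):
        self.table = {
            (i, j, k, l): compute_fn(i, j, k, l)
            for i in range(n_basis) for j in range(n_basis)
            for k in range(n_basis) for l in range(n_basis)
        }
    def get(self, i, j, k, l):
        return self.table[(i, j, k, l)]

class OnDemandIntegrals:
    """Recompute each repulsion integral whenever it is requested."""
    def __init__(self, compute_fn):
        self.compute_fn = compute_fn
    def get(self, i, j, k, l):
        return self.compute_fn(i, j, k, l)

# The rest of the code only ever calls provider.get(i, j, k, l) and is
# unaware of which acquisition strategy is in use.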

In the permissible stress approach, the loads are specified exactly, the response analysis is carried out on the basis of elastic theory, and the structure is assessed as safe if the calculated stresses are less than the specified permissible stress. There is no separate consideration of system and parameter uncertainty, the nature of the structure, or the consequences of failure. The loads are usually specified by other codes of practice, which recommend, for example in Britain, a mixture of fair average estimates for dead loads in B.S. 648, extreme maximal estimates for imposed loads in C.P. 3 Chapter V Part 1, and statistical estimates for wind load in C.P. 3 Chapter V Part 2. The uncertainty is catered for informally by the safe, conservative assumptions of the designer's theoretical model and formally by an appropriate choice of loads and permissible stress values. [Pg.62]
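A minimal sketch of the check this approach reduces to, assuming a simple elastic bending case; the load values, section modulus and permissible stress in the example are illustrative, not taken from the codes of practice cited above:

def permissible_stress_check(dead_moment_kNm, imposed_moment_kNm, wind_moment_kNm,
                             section_modulus_mm3, permissible_stress_MPa):
    """Elastic check: the stress from the total specified moment must not
    exceed the permissible stress; uncertainty is not treated separately."""
    total_moment_Nmm = (dead_moment_kNm + imposed_moment_kNm + wind_moment_kNm) * 1.0e6
    calculated_stress_MPa = total_moment_Nmm / section_modulus_mm3
    return calculated_stress_MPa <= permissible_stress_MPa

# Example: 40 + 25 + 15 kN·m on a section modulus of 6.0e5 mm^3 gives about
# 133 MPa, which would fail against a permissible stress of, say, 125 MPa.
print(permissible_stress_check(40.0, 25.0, 15.0, 6.0e5, 125.0))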

Verifiable: Results of a combination of analysis techniques and reviews/inspections, including structured code walkthroughs and inspections against the coding standard's verifiability criteria; static code analysis that shows determinism; and verification results of the code against requirements. (Note: some developers and certification authorities adopt the approach that if the requirement/design was verified, this implies it is verifiable, and that the code was also verified. The limitation with this approach is that the inference is made via an assumption about the extensiveness of verification, and is implicit, rather than resting on specific evidence in this regard.) [Pg.301]

Several authors postulate that corporate memories have a lifecycle of about 30 years (Ref 10, Ref 11). Designs rely on past experiences, codes, standards and the like. A problem with codes and standards is that they are strong on the "what" but not necessarily helpful with the "why". With limited understanding, compromises may be made. The boundaries of accepted wisdom are stretched and subtle changes are made that on their own may not amount to much but over time can dangerously accumulate. Flawed assumptions are made. It may take a disaster to cause a re-think. [Pg.238]

