Big Chemical Encyclopedia


Nuclear reactor safety, probability

The notion that methods of statistical analysis should be applied to reactor safety standards was first put forward by Siddall of Atomic Energy of Canada Ltd., Chalk River, Ontario, in 1959 (57). This early paper is of interest because it invokes the notion of a balance between the increased wealth of the community that may be expected to accrue from the advent of nuclear power on the credit side and the risks of injuries and deaths from the hazards of the nuclear process on the other; it goes on to suggest money costs (economic criteria) as the avenue through which to achieve such a balance. The details given in the paper are only generally relevant today, but some of the introductory sentences have a modern sound to them and are worth quoting as an introduction to the basic philosophy of the probability approach to reactor safety: "The study of nuclear-reactor safety (i.e., in 1959, some 15 years ago in the life of an industry now only 20 years of age) is in an unsatisfactory state. Some aspects of the problem have received..." [Pg.55]

In 1967, F. R. Farmer of the United Kingdom proposed that the probabilities as well as the consequences of potential accidents need to be estimated to assess the associated risk. Farmer used I-131 release as a surrogate for consequences. By plotting the probability and consequence of each postulated accident, one could distinguish those with high risk from those with low risk. He proposed a boundary line as a criterion for acceptable risk. Farmer's work was a conceptual breakthrough in nuclear reactor safety analysis, and he deserves full credit as the originator and pioneer of PRA. [Pg.645]
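
Farmer's criterion amounts to a limit line on log-log axes of accident frequency versus consequence, with points below the line judged acceptable. The sketch below illustrates the idea only; the slope and anchor point are invented placeholders, not Farmer's published values.

```python
# Illustrative Farmer-type limit line on log-log frequency/consequence
# axes. Slope and anchor values are assumptions for this sketch.

def acceptable(frequency_per_year, release_curies, slope=-1.5, anchor=(1e-3, 1e3)):
    """True if the (frequency, consequence) point lies on or below the
    limit line passing through `anchor` with the given log-log slope."""
    f0, c0 = anchor
    # Frequency the line permits at this consequence level
    limit_frequency = f0 * (release_curies / c0) ** slope
    return frequency_per_year <= limit_frequency

# A large release must be correspondingly rare to be acceptable:
print(acceptable(1e-7, 1e5))   # very rare, large release
print(acceptable(1e-2, 1e5))   # same release, far too frequent
```

The negative slope encodes the basic trade-off Farmer proposed: higher-consequence accidents are tolerable only at proportionally lower frequencies.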

The frequency of fire-induced core melt, calculated by averaging the observed frequency of the Browns Ferry type of fire over the experience of U.S. commercial nuclear power plants, was found to be 1E-5 per reactor-year, or about 20% of the total core-melt probability estimated in the Reactor Safety Study. Kazarians and Apostolakis (1978) performed the same type of calculations under different assumptions and concluded that the frequency of core melt could be higher by a factor of 10. [Pg.196]
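
The averaging described above is simple arithmetic: observed events divided by accumulated reactor-years of experience, multiplied by a conditional core-melt probability. In this sketch the reactor-year total and the conditional probability are assumed values chosen only to reproduce the quoted order of magnitude; neither number comes from the source.

```python
# Point estimate of fire-induced core-melt frequency from operating
# experience. The 400 reactor-years and the conditional probability
# are assumptions for illustration; only the ~1E-5 per reactor-year
# order of magnitude is from the text.

def event_frequency(n_events, reactor_years):
    """Naive frequency estimate: observed events per reactor-year."""
    return n_events / reactor_years

fire_frequency = event_frequency(1, 400.0)     # one severe cable fire (assumed experience)
p_melt_given_fire = 4.0e-3                     # assumed conditional core-melt probability
core_melt_frequency = fire_frequency * p_melt_given_fire
print(f"{core_melt_frequency:.0e} per reactor-year")

# Different assumptions (as in Kazarians and Apostolakis, 1978) can
# shift such an estimate by a factor of 10:
print(f"{core_melt_frequency * 10:.0e} per reactor-year")
```

The factor-of-10 spread shows how sensitive these single-event point estimates are to the assumed denominator and conditional probability.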

To emphasize the difference in scale, the different classes of problems are here classified as confined problems and open-ended problems. Confined problems are those where the probability and magnitude of the risks can be quantitatively studied and are found to be limited in scope. Reactor safety and nuclear waste disposal are in this category. [Pg.78]

In this paper, we discuss several categories of decay data which have contributed to low-energy nuclear physics, indicate some of the ways they are useful in solving problems in other areas and identify needs for further measurements. Illustrations include half-life and emission-probability data of actinide nuclides important for reactor technology and useful as reference standards for nuclear-data measurements. Decay data of highly neutron-rich fission-product nuclides are important in such diverse areas as astrophysics and reactor-safety research. Some of these data needs and experimental approaches suitable for satisfying them are presented. [Pg.101]

For a nuclear coal gasification plant, the above safety-distance relation is probably not practicable. It must therefore be demonstrated by appropriate studies that the load on the nuclear reactor building from an outdoor explosion is covered by the design limits. [Pg.55]

Sourced from Draft IEC 300-3-8 (Dependability Management), DEF STAN 00-56 Iss 2 Part 2 Table 4, and ACJ25.1309. This guidance was subsequently removed in DEF STAN 00-56 Iss 3 and AMC25.1309. Examples obtained from Clemens, P.L., Human Factors and Operator Errors, J.E. Jacobs presentation, second edition, Feb 2002. Human error probabilities are also contained in WASH-1400 (NUREG-75/014), Reactor Safety Study—An Assessment of Accident Risks in U.S. Commercial Nuclear Power Plants, 1975. [Pg.341]

Two concurrent evaluations were made for all probable (and a few improbable) fuel configurations that could be identified. One was made by reactor physics personnel using reactor design codes, while the other was made by nuclear criticality safety specialists using criticality safety codes. In addition to evaluating specific fuel configurations, basic data such as k-infinity vs. fuel-to-water ratio were tabulated for different enrichments, fuel particle sizes, water temperatures, boron levels, etc. [Pg.679]

This section describes related engineering and analytical processes used generally in nuclear engineering for the design and operation of nuclear facilities. Chapters 19 and 20 describe the safety evaluations that are used for nuclear facilities. Chapter 19 introduces the risk assessment and safety analysis process that is used for nuclear reactors licensed in the United States by the Nuclear Regulatory Commission (NRC). This process has evolved from the relatively simple safety analysis used in the 1950s to the detailed risk assessment process used today. Chapter 20 describes the process used in the United States by the Department of Energy for safety analysis of its facilities. It is more prescriptive and less probability and risk based than the process used by the NRC. [Pg.635]

Conformance of the design with the criteria was required and achieved, except for specific exceptions allowed after careful evaluations by the responsible authority. The exceptions authorized will not adversely affect nuclear safety, but they probably will result in a reduction of the potential maximum operating efficiency of the plant. Additional reactor shutdowns will be required in some instances to ensure reactor safety. [Pg.169]

Government safety regulation traditionally has been reactive and very prescriptive. Most safety regulation was born out of a specific accident or series of accidents in an industry. Though the famous Reactor Safety Study WASH-1400 for the commercial nuclear power industry was written in 1975, it anticipated the scenarios of the near failure of the reactor core at Three Mile Island in 1979. The scenarios were correct, but the probability of human failure was underestimated. [Pg.8]

You don't need to be reminded of the most recent nuclear accidents, principally Fukushima Daiichi in Japan in 2011. Before the Three Mile Island accident of the late 1970s, the U.S. Atomic Energy Commission had developed WASH-1400, The Reactor Safety Study. The WASH-1400 report laid the foundation for the use of probabilistic risk assessments (called probabilistic safety assessments in Europe). According to Henley and Kumamoto (1991), probabilistic risk assessment involves studying accident scenarios and numerically "rank[ing] them in order of their probability of occurrence," and then "assess[ing] their potential consequence to the public." Event trees, fault trees, and other risk-consequence tools are applied in developing and studying these scenarios. These techniques are extremely useful for the engineer but very expensive. The nuclear industry has been the leader in probabilistic safety analyses. [Pg.57]
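
For independent events, the event-tree and fault-tree arithmetic mentioned above reduces to simple AND/OR probability combinations. The gates below are standard; the failure probabilities and the two-branch scenario are invented solely to illustrate the ranking step.

```python
# Minimal fault-tree gates under an independence assumption; all
# probabilities and the scenario itself are hypothetical.

def and_gate(*probs):
    """Top event requires every input to fail: product of probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Top event occurs if any input fails: 1 - prod(1 - p_i)."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

p_pump = 1e-2     # hypothetical failure probability of one pump
p_power = 1e-4    # hypothetical power-supply failure probability
# Cooling is lost if both redundant pumps fail OR power fails:
p_loss_of_cooling = or_gate(and_gate(p_pump, p_pump), p_power)

# A two-branch event tree on a hypothetical initiating event, ranked
# by risk (frequency x consequence) as in a PRA:
initiator_frequency = 1e-2
scenarios = {
    "leak, cooling succeeds": (initiator_frequency * (1 - p_loss_of_cooling), 1.0),
    "leak, cooling fails":    (initiator_frequency * p_loss_of_cooling, 1e4),
}
for name, (freq, consequence) in sorted(
        scenarios.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True):
    print(f"{name}: risk = {freq * consequence:.2e}")
```

Even this toy tree shows the characteristic PRA result: the rare branch can dominate the risk ranking once consequences are weighed in.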

The Reactor Safety Study was prompted in part by a request from Senator John Pastore for a comprehensive assessment of reactor safety. The AEC's first response to this request was the WASH-1250 report, The Safety of Nuclear Power Reactors (Light Water-Cooled) and Related Facilities, which was published in final form in July 1973. However, WASH-1250 did not provide the probabilistic assessment of risk requested in Senator Pastore's letter. At the time, relevant probabilistic estimates were quite limited in scope and/or highly subjective. For example, in a policy paper dated November 15, 1971, to the commissioners proposing an approach to the preparation of environmental reports, the regulatory staff estimated that the probability of accidents leading to substantial core meltdown was 1E-6 per reactor-year. In retrospect, this was a highly optimistic estimate, but it typifies the degree to which meltdown accidents were considered "not credible."... [Pg.51]

An anticipated transient without scram (ATWS) is defined as an abnormal transient followed by failure of a reactor scram. Since the Super LWR is a simplified LWR, the probability of an ATWS is expected to be of the same order as that of LWRs. An ATWS of the Super LWR is classified as a beyond design basis event (BDBE). A deterministic evaluation of an ATWS is a global requirement because it is a potential safety issue that may lead to core damage under postulated conditions. Also, it is expected that the inherent safety characteristics of nuclear reactors, not only reactivity feedback but also reactor system dynamics, can be clearly identified at ATWS conditions because of the scram failure. Therefore, deterministic ATWS analyses are carried out for the Super LWR in Sect. 6.7, as are analyses of the abnormal transients and accidents. [Pg.365]

Risk assessment of nuclear power plants is based on evaluation of the core damage frequency (CDF). We therefore consider the first and second task categories: task category 1 defines all initiating events that can damage the reactor core, and task category 2 assesses the occurrence probability of those initiating events and the malfunction probability of the safety-related systems. [Pg.1108]
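
Combining the two task categories, the CDF is a sum over initiating events of occurrence frequency times the conditional probability that the safety systems fail. The sketch below shows only the arithmetic; the event list and every number in it are invented placeholders.

```python
# CDF as described above: sum over initiating events of
# (occurrence frequency) x (probability mitigation fails | event).
# All names and values are hypothetical placeholders.

initiating_events = {
    # name: (frequency per reactor-year, P(safety systems fail | event))
    "large LOCA":            (1e-4, 1e-2),
    "loss of offsite power": (1e-1, 1e-5),
    "general transient":     (1.0,  1e-6),
}

cdf = sum(freq * p_fail for freq, p_fail in initiating_events.values())
print(f"CDF = {cdf:.1e} per reactor-year")
```

Note that in this toy table each event contributes equally: a frequent initiator with reliable mitigation can matter as much as a rare one with weak mitigation, which is why both task categories are needed.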



