Big Chemical Encyclopedia


Long-term averaging

ISCLT3 (Industrial Source Complex - Long Term): the ISC3 Long Term dispersion model is used to model emissions with long-term averaging periods. [Pg.329]

Hundreds of computations are needed to generate the long-term average ambient concentration at various receptor sites due to one or more continuously discharging stacks. [Pg.359]
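An ISCLT-style long-term average is, in essence, a frequency-weighted sum of steady-state plume concentrations over all meteorological categories. The sketch below is a minimal illustration of that idea, not the ISCLT3 algorithm itself; the emission rate, stack height, dispersion coefficients, and the three-category frequency distribution are all invented for the example.

```python
import math

def plume_concentration(q, u, sigma_y, sigma_z, h):
    """Ground-level centerline concentration (g/m^3) from a simple
    Gaussian plume: C = Q / (pi * u * sy * sz) * exp(-h^2 / (2 sz^2))."""
    return q / (math.pi * u * sigma_y * sigma_z) * math.exp(-h**2 / (2 * sigma_z**2))

# Hypothetical joint frequency distribution of meteorological categories:
# each entry is (frequency, wind speed m/s, sigma_y m, sigma_z m).
star = [
    (0.3, 2.0, 80.0, 40.0),   # light winds, unstable
    (0.5, 5.0, 60.0, 25.0),   # moderate winds, neutral
    (0.2, 8.0, 40.0, 15.0),   # strong winds, stable
]

Q = 50.0   # emission rate, g/s (assumed)
H = 30.0   # effective stack height, m (assumed)

# Long-term average at one receptor = frequency-weighted sum over all
# meteorological categories; a real run repeats this for every receptor.
c_avg = sum(f * plume_concentration(Q, u, sy, sz, H) for f, u, sy, sz in star)
print(f"long-term average concentration: {c_avg:.2e} g/m^3")
```

With several hundred receptors, many wind-direction sectors, and multiple stability classes, the count of individual plume evaluations quickly reaches the "hundreds of computations" noted above.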

There are severe limits on the paleoclimate information recoverable by this method (MacAyeal et al., 1993). Reconstructions are possible only for long-term average temperatures and are restricted to the last glacial period and more recent times. [Pg.474]

The power of this technique lies in the fact that the temperature-depth profile is a direct remnant of paleotemperatures at the ice-sheet surface. It provides a quantitatively accurate measure of long-term average temperatures, which allows the stable isotope records to be calibrated for major climate events (Cuffey et al., 1995). [Pg.474]

Water scarcity is defined as a situation where insufficient water resources are available to satisfy long-term average requirements. It refers to long-term water imbalances, where the availability is low compared to the demand for water, and means that water demand exceeds the water resources exploitable under sustainable conditions. [Pg.130]
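One common way to quantify such an imbalance is a water exploitation index: abstraction divided by long-term average renewable resources. The figures below are hypothetical, and the ~40% "severe stress" reading is a commonly used rule of thumb, not something stated in the text above.

```python
# Hypothetical figures (hm^3/yr) for an illustrative river basin.
long_term_avg_resources = 1200.0  # long-term average renewable water resources
annual_abstraction = 540.0        # total annual water abstraction

# Water exploitation index: abstraction as a share of long-term average
# resources. Values above roughly 0.4 are commonly read as severe water
# stress, i.e. demand approaching what is sustainably exploitable.
wei = annual_abstraction / long_term_avg_resources
print(f"WEI = {wei:.2f}")
```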

The Leggett (1992) model was developed to predict tissue doses and whole-body dose to people who may be exposed to americium. The model is considered an updated version of the ICRP (1989) model for americium, which has been used to establish risk-based limits of intake of ²⁴¹Am (ICRP 1989). The Leggett (1992) and ICRP (1989) models predict similar long-term average doses of americium to the liver and skeleton for an injection exposure and would be expected to predict similar radiation risks and risk-based intake limits (Leggett 1992). Descriptions of applications of the Leggett (1992) model in risk assessment have not been reported. [Pg.97]

The additional wood demand of 17.1 Mio m³ corresponds to 27% of the total German wood harvest in 2006 (62.3 Mio m³). The wood harvest of 2007, at 76.7 Mio m³, cannot be used as a typical reference because of the windfall caused by the storm Kyrill. The harvested wood volume in 2006 had already doubled in comparison with the long-term average of the 1990s (approximately 34 Mio m³) [9]. [Pg.403]

We have alluded above to the fact that dietary reconstruction from bone can be no more than a relatively long-term average, since in life bone is constantly remodelled. In general, a dietary reconstruction based on bone collagen is likely to represent the average diet of that individual over the last few years of life, perhaps up to as much as ten years before death, depending on the particular bone used. An extension of the isotopic dietary method is to use the differential information available within a single skeleton to study human lifetime mobility. This technique has been developed and exploited most clearly on historic material from South Africa (Sealy et al., 1995; Sealy, 2001; Cox et al., 2001). [Pg.366]

Special tubes are available for measuring long-term average concentrations rather than instantaneous ones. [Pg.78]

The long-term average total extinction coefficient at downtown Los Angeles is 6.62 × 10⁻⁴ m⁻¹ (19), as inferred from daytime measurements of prevailing visibility. Elemental carbon, present at about 9 μg m⁻³ with a specific absorption of 11.9 m² g⁻¹, would account for about 17% of total light extinction at downtown Los Angeles. [Pg.244]
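The quoted fraction can be checked directly: absorption by elemental carbon is its mass concentration times its specific absorption, divided by the total extinction coefficient. A quick check using the numbers above, converted to SI units:

```python
b_ext = 6.62e-4      # long-term average total extinction coefficient, m^-1
ec_mass = 9e-6       # elemental carbon concentration, g/m^3 (9 ug/m^3)
sigma_abs = 11.9     # specific absorption of elemental carbon, m^2/g

b_abs = ec_mass * sigma_abs   # absorption coefficient due to elemental carbon
fraction = b_abs / b_ext      # share of total light extinction
print(f"fraction of extinction: {fraction:.0%}")  # ~16%, consistent with the quoted ~17%
```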

It is difficult to determine how much "dirtiness" is too much. Nearly everyone would agree that if more than half of the daily exposures exceed the standard (e > ½), or if the long-term average exposure is greater than the standard (x > 1), then the workplace is too dirty. There is much less agreement on when an environment is clean enough to be considered acceptable. [Pg.472]

Figure 1. Nine charts showing how the probability density function, pdf(x), and the long-term average exposure, x, vary as a function of e, the fraction of daily exposures that exceed the standard, and GSD, the variability of the work environment.
To put this into practical terms, recall that the only data available to an industrial hygienist are a small fraction of all possible samples no exposure is directly observable. The average of several industrial hygiene samples is a good estimate of the long-term average exposure, but the median and mode of sample data underestimate the median and mode of the true exposures. [Pg.475]
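The relationship between e, GSD, and the long-term average exposure discussed here follows from assuming daily exposures are lognormally distributed. The sketch below makes that assumption explicit, with the PEL normalized to 1; it is a minimal illustration, not a reproduction of the paper's charts.

```python
import math
from statistics import NormalDist

def long_term_mean(e, gsd, pel=1.0):
    """Long-term average exposure for a lognormal exposure distribution,
    given the fraction e of daily exposures exceeding the PEL and the
    geometric standard deviation (GSD).

    P(X > PEL) = e  =>  median = PEL * gsd**(-z), with z = Phi^-1(1 - e);
    the lognormal mean is then median * exp(0.5 * ln(gsd)^2).
    """
    z = NormalDist().inv_cdf(1 - e)
    ln_g = math.log(gsd)
    median = pel * math.exp(-z * ln_g)
    return median * math.exp(0.5 * ln_g**2)

# A workplace where 20% of daily exposures exceed the PEL, with GSD = 2:
print(round(long_term_mean(0.2, 2.0), 2))
```

Note that the mean exceeds the median here: for a skewed lognormal distribution, a few very high days pull the long-term average up, which is exactly why the sample mean, rather than the sample median or mode, tracks the long-term average exposure.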

Figure 2. Contour plot showing the long-term average exposure, x, as a function of e and GSD. The heavy line, where x = 0.95 PEL, is the proposed Average Exposure Limit, AEL. Acceptable workplaces lie in the region to the left of the AEL. Each symbol marks the location of one of the charts from Figure 1.
In Table I, the NIOSH decision criteria are shown to have poor efficiency for the three dirty workplaces for which e = 0.6. In these cases, a worker would be exposed above the standard three days out of five, and his long-term average exposure would be greater than the 8-hour PEL. Very few people would disagree with the decision to call these workplaces NOT OK. Nevertheless, these workplaces will be declared NOT OK by the NIOSH Action Level decision criteria only about 60% of the time. This inefficiency is further illustrated by the fact that only one of the three average workplaces with e = 0.2 has P( ) < 0.75. That one is (0.2, 1.13), and it also illustrates the conservativeness of the NIOSH criteria, since on those infrequent occasions when a decision is made, the odds are 21 to 1 to decide NOT OK. However, Table I most clearly illustrates the conservativeness of the NIOSH criteria by the fact that P(OK) < 0.1 for the three clean workplaces where e = 0.024. [Pg.479]

Mutagenic activity can increase or decrease, depending on ozonation rate (11). Data on chlorination usually indicate an increase in mutagenic activity (12,13,14). However, the long-term average increase in mutagenic activity has not been well-defined. For this reason, this experiment was performed for a 1-year period to determine net changes in water quality. [Pg.608]



© 2024 chempedia.info