Aaberg receiving


In reality, the blending index of a compound varies according to its concentration and the nature of the product receiving it; it is not, therefore, an intrinsic characteristic. In spite of this problem, refiners have long used the concept of blending index to predict and establish their refining flow sheets based on data drawn from their own experience. This approach is disappearing except in certain cases, for example, concerning the addition of oxygenates. Thus, Table 5.11 gives estimated blending values for different alcohols and ethers when they are added in small quantities to an unleaded fuel close to the specifications for Eurosuper (RON 95, MON 85). Taking into account the diversity of situations encountered with regard to the composition of the receiving product stream, one does not retain a unique blending value but rather a margin of possible variation.  [c.203]
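As a minimal sketch of how such a blending value is used in practice, assuming simple linear volumetric blending and purely illustrative numbers (not the values of Table 5.11):

```python
# Minimal sketch of blending-value arithmetic (illustrative numbers only;
# as the text emphasises, actual blending values depend on the base fuel
# and on the oxygenate concentration).

def blend_octane(base_ron, base_fraction, components):
    """Linear volumetric blending: RON_blend = sum(v_i * blending_value_i)."""
    ron = base_ron * base_fraction
    for volume_fraction, blending_value in components:
        ron += volume_fraction * blending_value
    return ron

# Hypothetical example: 10 vol% of an ether with an assumed blending RON of 118
# added to a RON 93 base stock to approach the Eurosuper RON 95 target.
print(blend_octane(93.0, 0.90, [(0.10, 118.0)]))  # ~95.5
```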

A technique that is more sensitive to corrosion has been developed by inducing eddy currents normal to the surface. These eddy currents, rather than flowing around the corrosion, enter the irregularities of the material surface left by corrosion (Fig. 1). Using a suitable receiver coil arrangement, one obtains an impedance change that is a function of the local rate of change, i.e. the gradient of the metal not yet removed by corrosion. Since corrosion leaves behind a very jagged metal surface, this approach is inherently more sensitive for detecting it.  [c.283]

Forming an image by scanning the laser spot across the sample, or vice versa, minimizes the light dose received by each molecule and reduces photobleaching. The tradeoff is that it requires some time to gather an image. A fluorescent image can be obtained much more rapidly by irradiating a larger area in the transverse plane and imaging the emission from the entire area at once onto a two-dimensional photodetector. This approach is most useful for highly photostable molecules at low temperatures [34, 35 and 36]. Photobleaching can be further reduced by employing an automatic positioning system with feedback to locate and centre the excitation on a single molecule as rapidly as possible [32], and also by excluding oxygen [38] and/or working at very low temperatures where most chromophores are more stable, although the latter adds considerable complexity to the experimental configuration [39, 40 and 41].  [c.2489]

While the classical approach to simulation of slow activated events, as described above, has received extensive attention in the literature and the methods are in general well established, the methods for quantum-dynamical simulation of reactive processes in complex systems in the condensed phase are still under development. We briefly consider electron and proton quantum dynamics.  [c.15]

The effect of ligands on the endo-exo selectivity of Lewis-acid catalysed Diels-Alder reactions has received little attention. Interestingly, Yamamoto et al. reported an aluminium catalyst that produces mainly the exo Diels-Alder adduct. The endo-approach of the diene, which is normally preferred, is blocked by a bulky group in the ligand.  [c.91]

Equilibrium chemistry often receives a significant emphasis in the introductory analytical chemistry course. While an important topic, its overemphasis can cause students to confuse analytical chemistry with equilibrium chemistry. Although attention to solving equilibrium problems is important, it is equally important for students to recognize when such calculations are impractical, or when a simpler, more qualitative approach is all that is needed. For example, in discussing the gravimetric analysis of Ag+ as AgCl, there is little point in calculating the equilibrium solubility of AgCl since the concentration of Cl- at equilibrium is rarely known. It is important, however, to qualitatively understand that a large excess of Cl- increases the solubility of AgCl due to the formation of soluble silver-chloro complexes. To balance the presentation of a rigorous approach to solving equilibrium problems, this text also introduces the use of ladder diagrams as a means of providing a qualitative picture of a system at equilibrium. Students are encouraged to use the approach best suited to the problem at hand.  [c.814]
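The effect of excess Cl- can be illustrated with a rough calculation; the equilibrium constants below are approximate values used only for illustration and are not taken from the text:

```python
# Rough illustration of why a large excess of Cl- increases AgCl solubility:
# at high [Cl-] the soluble complex AgCl2- dominates, so the total dissolved
# silver rises again.  Ksp and beta2 are approximate, illustrative values.
Ksp = 1.8e-10      # AgCl solubility product (approximate)
beta2 = 1.8e5      # cumulative formation constant of AgCl2- (approximate)

def total_dissolved_silver(cl):
    free_ag = Ksp / cl          # [Ag+] fixed by the solubility product
    agcl2 = beta2 * Ksp * cl    # [AgCl2-] = beta2*[Ag+][Cl-]^2 = beta2*Ksp*[Cl-]
    return free_ag + agcl2

for cl in (1e-4, 1e-3, 1e-2, 1e-1, 1.0):
    print(f"[Cl-] = {cl:7.0e} M  ->  S = {total_dissolved_silver(cl):.2e} M")
```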

In some countries, e.g., Finland, Hungary, Peru, Poland, and Bulgaria, one must petition the appropriate government agency and receive permission to manufacture or sell a flavor or a flavored product. Newly emerging countries, although they have no specific regulations, often accept other countries' legislation. This is often the case when dealing with the countries of Africa.  [c.19]

When an atom or molecule receives sufficient thermal energy to escape from a liquid surface, it carries with it the heat of vaporization at the temperature at which evaporation took place. Condensation (return to the liquid state accompanied by the release of the latent heat of vaporization) occurs upon contact with any surface that is at a temperature below the evaporation temperature. Condensation occurs preferentially at all points that are at temperatures below that of the evaporator, and the temperatures of the condenser areas increase until they approach the evaporator temperature. There is a tendency for isothermal operation and a high effective thermal conductance. The steam-heating system for a building is an example of this widely employed process.  [c.511]
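A minimal sketch of the heat transport implied by this evaporation-condensation cycle, with illustrative figures (the heat carried is simply the evaporated mass flow times the latent heat of vaporization):

```python
# Back-of-the-envelope view of the evaporation/condensation heat transport
# described above (values are illustrative, not from the text).
h_fg_water = 2.26e6   # J/kg, latent heat of vaporization of water near 100 C
mass_flow = 0.01      # kg/s of vapour moving from evaporator to condenser

heat_transported = mass_flow * h_fg_water
print(f"Heat carried by the vapour: {heat_transported / 1000:.1f} kW")  # ~22.6 kW
```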

Nonchemical or traditional practices, such as weed seed removal, optimal crop seeding rates, crop selection, enhanced crop competitiveness, crop rotation, and mechanical weed control, are all important components of an effective weed management program (458,459). In the context of modern intensive chemical herbicide application, nonchemical practices may even represent an innovative approach to weed management and should receive careful consideration.  [c.55]

The U.S. GAO has also reviewed pesticide standards and regulations among member countries of the expanded European Union and the Organization for Economic Cooperation and Development (OECD) (Table 5) (72). A high degree of uniformity exists among the surveyed nations, including the United States, with regard to the kinds of test data required to register food-use pesticides. However, similar data requirements do not necessarily mean that countries receive the same information about a pesticide product or evaluate it in a similar manner. For example, there is a divergence of scientific opinion concerning what regulatory approach is most appropriate for dealing with substances that show some oncogenic effects (tumors) only at very high, near-lethal doses as compared to those that cause cancer through a genotoxic mechanism (63). Also, most other countries do not require  [c.149]

Because of their nutritive value, phosphates have been implicated in promoting the growth of algae in lakes. Problems apparently caused by sewage-borne phosphates are mostly localized to areas that have traditionally employed lakes as receiving waters for sewage effluents. It is believed that much of the phosphate is precipitated in an insoluble form and trapped in sediments, where it is ultimately converted to an apatite. Considerable controversy has centered on the contribution of phosphate-built detergents to excessive algae growth and subsequent eutrophication of natural receiving water. Legislation against the use of phosphates in detergents has resulted in a patchwork of restrictions worldwide. Home laundry detergents have been the most regulated. Societal pressure has resulted in the voluntary reduction or elimination of phosphates in many cleaning products by the manufacturers. It is open to question, however, whether a ban on phosphate detergents and cleaners can indeed sufficiently reduce phosphorus input to the low levels needed to control algal growth, when, in fact, natural wastes and fertilizers provide most of the phosphorus input to receiving waters. A more logical but also more costly approach is phosphorus removal during sewage treatment. Excellent reviews of this area are available (36,37).  [c.345]

V. L. Anderson and R. A. McLean, Design of Experiments—A Realistic Approach, Marcel Dekker, New York, 1974. This book provides an extensive exposition of experimental design at a relatively elementary level. It includes most of the standard material, as well as detailed discussions of such subjects as nested and split-plot experiments. Restrictions on randomization receive special emphasis.  [c.524]

The payback-period method takes no account of cash flows or profits received after the breakeven point has been reached. The method is based on the premise that the earlier the fixed capital is recovered, the better the project. However, this approach can be misleading.  [c.808]
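A minimal sketch of the payback-period criterion, with hypothetical cash flows, showing why ignoring post-breakeven returns can mislead:

```python
# Minimal sketch of the payback-period criterion and why it can mislead:
# both hypothetical projects below recover the fixed capital in year 3, but
# project B earns far more after breakeven, which the payback metric ignores.
def payback_period(fixed_capital, annual_cash_flows):
    cumulative = 0.0
    for year, cash in enumerate(annual_cash_flows, start=1):
        cumulative += cash
        if cumulative >= fixed_capital:
            return year
    return None  # capital never recovered within the horizon

project_a = [40, 40, 40, 10, 10]      # cash flows, arbitrary money units
project_b = [40, 40, 40, 100, 100]

print(payback_period(120, project_a))  # 3
print(payback_period(120, project_b))  # 3 -- identical, despite B's larger later returns
```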

One attractive approach to the problem of a lake or reservoir receiving large nutrient loads from dispersed sources or from diffuse non-point sources is to manage the hydrography of the system. Provided that the waterbody is deep enough to stratify naturally, that the volume of bottom water is at least as great as that of the surface mixed layer, and that light penetrates to less than half of the fully-mixed volume, destratification is likely to be effective.  [c.38]

In the early years of molecular dynamics simulations of biomolecules, almost all scientists working in the field received specialized training (as graduate students and/or postdoctoral fellows) that provided a detailed understanding of the power and limitations of the approach. Now that the methodology is becoming more accessible (in terms of ease of application of generally distributed programs and the availability of the required computational resources) and better validated (in terms of published results), many people are beginning to use simulation technology without training in the area. Molecular dynamics simulations are becoming part of the tool kit used by everyone, even experimentalists, who wish to obtain an understanding of the structure and function of biomolecules. To be able to do this effectively, a person must have access to sources from which he or she can obtain the background required for meaningful applications of the simulation methodology. This volume has an important role to play in the transition of the field from one limited to specialists (although they will continue to be needed to improve the methodology and extend its applicability) to the mainstream of molecular biology. The emphasis on an in-depth description of the computational methodology will make the volume useful as an introduction to the field for many people who are doing simulations for the first time. They will find it helpful also to look at two earlier volumes on macro-molecular simulations [3,4], as well as the classic general text on molecular dynamics [6]. Equally important in the volume is the connection made with X-ray, neutron scattering, and nuclear magnetic resonance experiments, areas in which molecular dynamics simulations are playing an essential role. A number of well-chosen special topics involving applications of simulation methods are described. Also, several chapters broaden  [c.516]

The three intersections were the result of the mathematical approach when a low heat-removal condition existed: assuming steady state first and then solving the algebraic equations yielded three solutions. If, instead, the transient differential equations for material and temperature were integrated in the time domain until the time derivatives vanished, only the upper and lower solutions were obtained. Which one was the actual operating point depended on which end was considered to be the starting point of the experiment. With a low-temperature start, the lower point became stabilized. Once ignition occurred and the temperature moved to the higher range, lowering the temperature resulted in stabilization in the upper state.  [c.198]
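A minimal sketch of this behaviour, using a made-up S-shaped heat-generation curve and a linear heat-removal line rather than the original system's equations: the steady-state balance has three roots, but time integration only ever settles on the outer two.

```python
# Illustrative sketch (hypothetical curves, not the original system).
import numpy as np
from scipy.optimize import brentq
from scipy.integrate import solve_ivp

def G(T):                      # heat generation, arbitrary units (S-shaped)
    return 100.0 / (1.0 + np.exp(-(T - 350.0) / 10.0))

def R(T):                      # heat removal, arbitrary units (linear)
    return 1.0 * (T - 300.0)

# Steady states: solve G(T) - R(T) = 0 in three brackets.
roots = [brentq(lambda T: G(T) - R(T), a, b)
         for a, b in [(300, 320), (340, 360), (380, 410)]]
print("algebraic steady states:", [round(r, 1) for r in roots])  # low, middle, high

# Transient integration of dT/dt = G - R: only the low and high states are reached.
for T0 in (305.0, 380.0):
    sol = solve_ivp(lambda t, T: [G(T[0]) - R(T[0])], (0, 200), [T0])
    print(f"start {T0:.0f} -> settles at {sol.y[0, -1]:.1f}")
```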

In the basic continuous adsorption cycle illustrated previously in Fig. 5b two adsorbent beds are heated and cooled out of phase in order to provide continuous heating or cooling. It is possible to recover some of the adsorption heat rejected by each bed and use it to provide some of the heat required by the other bed. This might be done by the use of a circulating heat transfer fluid or a heat pipe. Meunier [5] first systematically looked at the potential gain in COP that might be obtained by such heat recovery, both as a function of the approach temperature of the beds donating and receiving heat and of the number of beds. The number of beds is not limited to two and the COP increases with the number of beds that it is possible to transfer heat between. There is of course a practical limitation, but it is possible to calculate the theoretical benefit of employing heat exchange between any number of beds.  [c.323]

ESP-r is a building and plant energy simulation environment based on a numerical approach in which building and plant energy flows and their interconnections are represented. In particular, the system is able to integrate heat, air, moisture, daylighting, and power flows, while including special components such as photovoltaic cells, advanced glazings, and renewable technologies. For advanced lighting applications, ESP-r is able to pass (automatically) its building representation to the RADIANCE daylighting simulation code and receive back, for example, the internal illuminance distribution as an input to a user-specified lighting controller.  [c.1097]

One that got away was the magnetic bubble memory, also described by Yeack-Scranton in 1994. A magnetic bubble is a self-contained, cylindrical magnetic domain polarised in the direction opposite to the magnetization of the surrounding thin magnetic film, typically a rare-earth iron garnet deposited on a nonmagnetic substrate. Information is stored by creating and moving strings of such bubbles, each about a micrometre across, accessed by means of a magnetic sensor. This very ingenious and unconventional approach received a great deal of research attention for several years in the 1970s; it was popular for a time for such devices as handheld calculators, but in the end the achievable information density and speed of operation were insufficiently attractive. In the field of magnetic memories, competition is red in tooth and claw.  [c.287]

Seismic surveys involve the generation of artificial shock waves which propagate through the overburden rock to the reservoir targets and beyond, being reflected back to receivers where they register as a pressure pulse (in hydrophones - offshore) or as acceleration (in geophones - onshore). The signals from reflections are digitised and stored for processing, and the resulting data reconstruct an acoustic image of the subsurface for later interpretation. The objective of seismic surveying is to produce an acoustic image of the subsurface, with as much resolution as possible, where all the reflections are correctly positioned and focused and the image is as close to a true geological picture as can be. This of course is an ideal, but modern (3D and 4D) techniques allow us to approach this ideal.  [c.17]
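As a minimal illustration of how a recorded reflection is positioned in depth, assuming (purely for illustration) a constant overburden velocity:

```python
# Simple positioning of a reflector from its two-way travel time: for a
# hypothetical constant overburden velocity, depth is half the two-way time
# multiplied by the velocity.
def reflector_depth(two_way_time_s, avg_velocity_m_s):
    return avg_velocity_m_s * two_way_time_s / 2.0

# e.g. a reflection arriving after 2.0 s through rock with an assumed average
# velocity of 2500 m/s corresponds to a reflector roughly 2500 m deep.
print(reflector_depth(2.0, 2500.0))  # 2500.0
```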

A wide range of methods has been proposed for solving the 3D reconstruction problem. One approach reduces cone-beam reconstruction to a series of 2D reconstruction procedures carried out in pseudo-parallel planes. This results in some loss of quality in the reconstructed images, especially as the size of the object increases. However, the algorithms of this group are, as a rule, favourable in terms of the time needed for reconstruction. Examples of such methods are the algorithms of Feldkamp [5], Herman [7], and Grangeat [10].  [c.217]

In order to excite a single wavelength, an interdigital transducer would require an infinite number of fingers and so would be infinitely large. Practical transducers have a finite number of fingers, and it is desirable to limit the wavelength bandwidth which they excite. Fig 2a shows a possible electrode pattern for a transducer designed to excite the a1 mode at its non-dispersive point in a 1 mm thick aluminium plate. It comprises two alternating sets of fingers which are driven differentially, the spacing between successive fingers connected to the same supply rail being one wavelength (2.4 mm) and the width of each finger being 1 mm. The wavelength excitation bandwidth may be reduced further by apodisation. In SAW devices this is frequently achieved by varying the finger length [8]. However, it was thought that this approach might increase the spreading of the Lamb wave beam in the applications of interest here, so the width of the fingers was varied while keeping the length constant, as shown in Fig 2b. The 15 dB down wavenumbers for the transducer of Fig 2b are 2.17 and 3.06 rad/mm, corresponding to wavelengths of 2.9 and 2.1 mm respectively. If similar transducers are used for transmission and reception, the overall sensitivity at these wavelengths will be 30 dB down from the peak sensitivity at a wavelength of 2.4 mm. The transducer will therefore excite and receive modes at significant amplitude over the region between the λ/d = 2.9 and λ/d = 2.1 lines in Fig 1a. For a constant input, the apodisation will reduce the amplitude of the wave generated at the centre wavelength since the total electrode area is reduced. However, this is unlikely to be a serious problem in most applications.  [c.716]
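The quoted wavelengths follow directly from the relation between wavelength and wavenumber, λ = 2π/k, as this short check shows:

```python
# The wavelengths quoted above follow directly from lambda = 2*pi/k.
import math

for k in (2.17, 3.06):                 # 15 dB down wavenumbers, rad/mm
    print(f"k = {k} rad/mm  ->  wavelength = {2 * math.pi / k:.2f} mm")
# k = 2.17 rad/mm -> 2.90 mm;  k = 3.06 rad/mm -> 2.05 mm (~2.1 mm)
```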

Studies of wave packet motion in excited electronic states of molecules with three and four atoms were conducted by Schinke, Engel and collaborators, among others, mainly in the context of photodissociation dynamics from the excited state [142, 143 and 144] (for an introduction to photodissociation dynamics, see [7], and also more recent work [145, 146, 147, 148 and 149] with references cited therein). In these studies, the dissociation dynamics is often described by a time-dependent displacement of the Gaussian wave packet in the multidimensional configuration space. As time goes on, this wave packet will occupy different manifolds (from where the molecule possibly dissociates), and this is identified with IVR. The dynamics may be described within the Gaussian wave packet method [150], and the vibrational dynamics is then of the classical IVR type (CIVR [M]). The validity of this approach depends on the dissociation rate on the one hand, and the rate of delocalization of the wave packet on the other. The occurrence of DIVR often receives less attention in the discussions of photodissociation dynamics mentioned above. In [148], for instance, details of the wave packet motion by means of snapshots of the probability density are missing, but a delocalization of the wave packet probably takes place, as may be concluded from inspection of figure 5 therein.  [c.1063]

We recently received a preprint from Dellago et al. [9] that proposed an algorithm for path sampling, which is based on the Langevin equation (and is therefore in the spirit of approach (A) [8]). They further derive formulas to compute rate constants that are based on correlation functions. Their method of computing rate constants is an alternative approach to the formula for the state conditional probability derived in the present manuscript.  [c.265]

River and Pond Sediments. Much of the work on polychlorinated biphenyls has focused on the remediation of aquatic sediments, particularly from rivers, estuaries, and ponds. As noted above, a few of the most lightly chlorinated compounds are mineralized under aerobic conditions, but the more chlorinated species seem completely resistant to aerobic degradation, even by white rot fungi (54). On the other hand, there is extensive dechlorination of highly chlorinated forms under anaerobic conditions, particularly methanogenic conditions. Bioremediation thus requires anaerobic and aerobic regimes. Intrinsic biodegradation of polychlorinated biphenyls can be recognized by the changing "fingerprint" of the individual isomers as biodegradation proceeds (41,55). The anaerobic dechlorination of the most recalcitrant congeners can apparently be primed by adding a readily dehalogenated congener, such as 2,5,3',4'-tetrachlorobiphenyl (56), but whether this is a realistic approach for in situ bioremediation remains to be seen. Harkness and co-workers (57) have successfully stimulated aerobic biodegradation in large caissons in the Hudson River by adding inorganic nutrients, biphenyl, and hydrogen peroxide, but found that repeated addition of a polychlorinated-biphenyl degrading bacterium (Alcaligenes eutrophus H850) had no beneficial effect. Essentially no biodegradation occurred in the stirred control caissons, but losses on the order of 40% were seen in the caissons that received nutrients and peroxide, regardless of whether the stirring was aggressive or rather gentle. Whether this approach can be scaled up for large-scale use, with a net environmental benefit, remains to be seen.  [c.34]

Smolder-Resistant Upholstery Fabric. Chemical finishing to improve the smolder resistance of cotton fabric received some attention during the late 1970s. It was thought that regulatory activity would impact the market position of cotton upholstery fabric, so research on semidurable and durable finishes capable of withstanding occasional scrubbing was initiated for cotton upholstery fabric. Two chemical treatments are of particular interest. In one system, a formulation comprised of borax, an acidic compound, and TMM was applied by conventional padding as well as by low wet add-on techniques (112,113). Another successful approach consisted of the application of various polymers as backcoatings to upholstery fabric (114). The most effective polymers for this purpose were copolymers in which one of the monomers contained halogen. One particularly effective copolymer contained butadiene, styrene, and vinylidene chloride. Although the use of synthetic barrier fabrics has largely supplanted the need for this type of finishing, these finishes did provide an effective means for making cotton smolder resistant.  [c.490]

The hazard analysis critical control point (HACCP) concept is a systematic approach to the identification, assessment, prevention, and control of hazards. The system offers a rational approach to the control of microbiological, chemical, environmental, and physical hazards in foods, avoids the many weaknesses inherent in the inspectional quality control approach, and circumvents the shortcomings of reliance on microbiological testing (33,34). The food industry and government regulatory agencies are placing greater emphasis on the HACCP system to provide greater assurance of food safety. In the 1970s and early 1980s, the HACCP approach was adopted by large food companies and began to receive attention from segments of the food industry other than manufacturing. Reports by the International Commission on Microbiological Specifications for Foods (ICMSF) revealed a growing international awareness of the HACCP concept and its usefulness in dealing with food safety (35).  [c.33]

For radiation doses <0.5 Sv, there is no clinically observable increase in the number of cancers above those that occur naturally (57). There are two risk hypotheses: the linear and the nonlinear. The former implies that as the radiation dose decreases, the risk of cancer goes down at roughly the same rate. The latter suggests that the risk of cancer actually falls much faster as radiation exposure declines. Because the risk of cancer and other health effects is quite low at low radiation doses, the incidence of cancer cannot clearly be ascribed to occupational radiation exposure. Thus, the regulations have adopted the more conservative or restrictive approach, i.e., the linear hypothesis. Whereas nuclear industry workers are allowed to receive up to 0.05 Sv/yr, the ALARA practices result in much lower actual radiation exposure.  [c.243]
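A purely illustrative comparison of the two hypotheses (the coefficients are arbitrary and not from the text):

```python
# Purely illustrative comparison of the two risk hypotheses mentioned above.
# The coefficients are arbitrary; the point is only that a linear model
# predicts more excess risk at low doses than one that falls off faster,
# which is why regulation based on it is the more conservative choice.
def linear_risk(dose_sv, k=0.05):
    return k * dose_sv          # risk falls in direct proportion to dose

def nonlinear_risk(dose_sv, k=0.05):
    return k * dose_sv ** 2     # risk falls much faster at low doses

for dose in (0.5, 0.05, 0.005):
    print(f"{dose:6.3f} Sv:  linear {linear_risk(dose):.2e}   nonlinear {nonlinear_risk(dose):.2e}")
```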

Positive imaging can be accomplished by several different techniques. In positive imaging, the density produced by developed silver or dye molecules decreases with increasing exposure. If an original negative is on a transparent film base, a 1:1 positive copy can be produced by contact printing on a negative print material, or an optically magnified positive copy can be prepared by using an enlarger. In addition to such negative—positive approaches to producing positive images, positive or reversal imaging in the originally exposed coating is also possible. One way to generate reversal images in color films is to use specially designed processing sequences. If black-and-white development is used to develop all of the exposed grains, then a positive dye image can be produced by first chemically activating the remaining silver halide and then developing the activated grains with color developers. In the latter approach the silver halide grains are of the same negative-working type as those used in negative—positive materials. Negative-working silver halide grains are also used to produce the positive images provided by instant photography products (28). After exposure in instant photography systems, the film is passed between a pair of rollers which rupture a reagent-containing pod. The reagents include development-initiating chemicals that are released from the pod and uniformly spread within the film structure. Upon reaching the silver halide-containing layers, the reagents chemically promote the imagewise diffusion of silver ions in the case of black-and-white products or dyes in color products. The diffusing species are ultimately trapped in a receiving layer for subsequent viewing (see Color photography, instant).  [c.452]

Continued exposure to light can convert enough silver halide photolytically to metallic silver to produce a satisfactory image. This approach to photographic imaging is used for specialized applications (295). For certain emulsion grains, photolytic amplification under low irradiance produces a visual density, provided the photolytic amplification was preceded by a high irradiance exposure. In regions of the light-sensitive element that received high irradiance pre-exposures, photodevelopment produces a higher covering power (ratio of optical density to developed silver) compared with background areas. Electron micrograph examinations of the image and background areas reveal that the covering power effects result from a higher dispersity of developed specks per grain in the image areas. The discrimination between the image and the background of photodeveloped films may be a consequence of color differences resulting from the diverse shapes of the developed silver particles. The images produced by photodevelopment generally are not permanent; however, specially formulated bathing treatments have been devised to stabilize the photodeveloped images. Photodevelopment materials provide a rapid, solution-free approach for recording data from the output of oscilloscopes and other scientific instrumentation.  [c.456]

Operation. Operations are controlled from a central computerized office that maintains communication with compressor stations and flow stations along the route; reports are also received from weekly or biweekly aircraft flyovers to inspect for potentially threatening conditions, such as soil erosion, floods, approaching excavation, or any external evidence of leaks. Flow computers along the pipeline route monitor and quantify flow from producers entering the pipeline or flow leaving the line to customers. Pipeline flow, pressure, and other operating variables can be controlled by commands to compressor stations. In systems that have remote control of main-line valves, central control can isolate sections of pipelines with reported or threatened problems and dispatch personnel to the site. If patrols detect indications of significant increases in population density in specific areas, the safety factor in the design formula may be increased in accordance with industry code, and the maximum allowable operating pressure in those areas is reduced. Operations and maintenance of pipelines are also covered in ASME B31.8 and federal regulation 49 CFR 192.  [c.50]

Moreover, commercially available triblock copolymers designed to be thermoplastic elastomers, not compatibilizers, are often used in lieu of the more appealing diblock materials. Since the mid-1980s, the generation of block or graft copolymers in situ during blend preparation (158,168—176), called reactive compatibilization, has emerged as an alternative approach and has received considerable commercial attention.  [c.415]

Other blends such as polyhydroxyalkanoates (PHA) with cellulose acetate (208), PHA with polycaprolactone (209), poly(lactic acid) with poly(ethylene glycol) (210), chitosan and cellulose (211), poly(lactic acid) with inorganic fillers (212), and PHA and aliphatic polyesters with inorganics (213) are receiving attention. The different blending compositions seem to be limited only by the number of polymers available and the compatibility of the components. The latter blends, with all natural or biodegradable components, appear to afford the best approach for future research as a balance of properties and biodegradability is attempted. Starch and additives have been evaluated in detail from the perspective of structure and compatibility with starch (214).  [c.482]

The refrigerant-recirculating pump pressurizes the refrigerant liquid and moves it to one or more evaporators or heat exchangers that may be remote from the receiver. The low pressure refrigerant may be used as a single-phase heat-transfer fluid as in A of Figure 11, which eliminates the extra heat-exchange step and increased temperature difference encountered in a conventional system that uses a secondary refrigerant or brine. This approach may simplify the design of process heat exchangers where large specific volumes of evaporating refrigerant vapor would be troublesome. Alternatively, the pumped refrigerant in the flooded system may be routed through conventional evaporators as in B and C, or special heat exchangers as in D. The flooded refrigeration system is helpful when special heat exchangers are necessary for process reasons, or where multiple or remote exchangers are required.  [c.67]

Water. For a long time in the United States, the approach to water pollution control was through the establishment of water quality standards for receiving bodies of water, i.e., rivers, streams, or lakes, with most limits established on a state-by-state basis. There was no effective, national, legal authority to limit the discharge of pollutants. In the late 1960s, the U.S. government revived an old law, the Rivers and Harbors Act of 1899 (the Refuse Act) (1). The law prohibited the discharge of anything into navigable waters unless a permit was obtained from the Corps of Engineers, thus providing a first step toward control of industrial discharges. This was followed by additional legislation, culminating in the passage of the Federal Water Pollution Control Act Amendments (FWPCA) of 1972 and the Clean Water Act (CWA) of 1977 (2). The objective of the FWPCA was to restore and maintain the chemical, physical, and biological integrity of the nation's waters.  [c.76]

The practice of establishing empirical equations has provided useful information, but also exhibits some deficiencies. For example, a single spray parameter may not be the only parameter that characterizes the performance of a spray system. The effect of cross-correlations or interactions between variables has received scant attention. Using the approach of varying one parameter at a time to develop correlations cannot completely reveal the true physics of complicated spray phenomena. Hence, methods employing the statistical design of experiments must be utilized to investigate multiple factors simultaneously.  [c.333]

The modern approach to wastewater treatment, protection of the oxygen resources of the receiving waters, requires that all aspects of the problem be addressed, i.e., the systems approach. The Ohio River Sanitation Commission (ORSANCO) is an excellent example of basin-wide management dealing with situations that involve several political entities. This approach has been adopted in several other regions.  [c.286]

Although the Fischer-Tropsch reaction was the focus of innumerable studies prior to 1960, it received little further attention until the energy crisis of the 1970s spurred new activity in the area. A particularly significant technology resulting from this work was a process for converting synthesis gas or methanol into aromatic compounds (44,45). The Fischer-Tropsch synthesis is carried out commercially in South Africa, where an unusual economic environment allows a profitable operation. Fischer-Tropsch technology has not found application in the United States (46).  [c.52]

Another approach involves the formation of dye images from colorless oxichromic developers, leuco azomethines stabilized against premature oxidation by acylation and linked to developer moieties (30), as shown in Figure 6c. The transferred images are oxidized to the colored form either by aerial oxidation or by oxidants present in the receiving layer.  [c.490]

Analytical measurements of ozone concentrations must be made in the ozonized gas from the ozone generator, the contactor off-gases, and the residual ozone level in the ozonized water. Methods of ozone measurement commonly used are the simple "sniff" test, Draeger-type detector tube, wet chemistry potassium iodide method, amperometric-type instruments, gas-phase chemiluminescence, and ultraviolet radiation absorption. The use of control systems based on these measurements varies considerably. The key to successful operation is an accurate and reliable residual ozone analyzer. Continuous residual ozone monitoring equipment may be successfully applied to water that has already received a high level of treatment. However, a more cautious approach must be taken with the application of continuous residual ozone monitoring equipment for water that has only received chemical clarification, because the ozone demand has not yet been satisfied and the residual is not as stable. Ozone production must be closely controlled because excess ozone cannot be stored. Changes in process demand must be responded to rapidly. Ozone production is costly; underozonation may produce undesired effects, and overozonation may require additional costs where off-gas destruction is used.  [c.494]

CFR 1910.121, OSHA Accreditation of Training Programs for Hazardous Waste Operations (proposed), and the nonmandatory Appendix E to the HAZWOPER standard, Suggested Training Curriculum Guidelines, are recommended for site-specific implementation. These nonmandatory guidelines provide a common-sense approach to help management choose the appropriate programs. When considering an outside contractor, you should always include logistics. This part of the selection process is important. Very simply put: Can the outside training contractor provide my workers with instruction that is convenient for workers to attend and that will be completed before the work tasks begin? After the basic need for logistics has been met, the next and most important step should be considered. Will workers receive quality training and be provided the information they need in a format that they will retain and use? If these two basic needs are not met, more careful consideration and research need to be implemented. To assist in making this determination, the nonmandatory requirements already mentioned should prove helpful. The types of subjects that are discussed in these nonmandatory appendices include  [c.98]


See pages that mention the term Aaberg receiving : [c.166]    [c.339]    [c.238]    [c.286]    [c.275]    [c.1216]    [c.1605]    [c.263]    [c.768]   
Industrial ventilation design guidebook (2001) -- [ c.1448 , c.1485 ]