Objective quality


Disciplined approach to problem definition; specific technical objectives; quality function deployment or similar process; acquisition of background, literature, etc.; networked to many technology resources; scientific method.  [c.134]

Depending on your strategy, quality systems should enable you to achieve all your quality goals. Quality systems have a similar purpose to financial control systems, information technology systems, inventory control systems, and personnel management systems. They organize resources so as to achieve certain objectives through processes which, if implemented and maintained, will yield the desired results. Whether it is the management of costs, inventory, personnel, or quality, systems are needed to focus the thought and effort of people towards prescribed objectives. Quality systems focus on the quality of what the organization produces, the factors which will cause the organization to achieve its goals, the factors which might prevent it from satisfying customers, and the factors which might prevent it from being productive, innovative, and profitable. Quality systems should therefore cause conforming product and prevent nonconforming product.  [c.41]

The operations group will develop general operating and maintenance objectives for the facilities which will address product quality, costs, safety and environmental issues. At a more detailed level, the mode of operations and maintenance for a particular project will be specified in the field development plan. Both specifications will be discussed in this section, which will focus on the input of the production operations and maintenance departments to a field development plan. The management of the field during the producing period is discussed in Section 14.0.  [c.278]

One of the primary objectives of production operations is to deliver product at the required rate and quality. Therefore the product quality specification and any agreed contract terms will drive the activities of the production operations department, and will be a starting point for determining the preferred mode of operation. The specifications, such as delivery of stabilised crude with a BS&W of less than 0.5% and a salinity of 70 g/m³,  [c.279]

The mechanical performance of equipment is likely to deteriorate with use due to wear, corrosion, erosion, vibration, contamination and fracture, which may lead to failure. Since this would threaten a typical production objective of meeting quality and quantity specifications, maintenance engineering provides a service which helps to safely achieve the production objective.  [c.286]

The choice of contract type will depend upon the type of work, and the level of control which the oil company wishes to maintain. There is a current trend for the oil company to consider the contractor as a partner in the project (partnering arrangements), and to work closely with the contractor at all stages of the project development. The objective of this closer involvement of the contractor is to provide a common incentive for the contractor and the oil company to improve quality, efficiency, safety, and most importantly to reduce cost. This type of contract usually contains a significant element of sharing risk and reward of the project.  [c.301]

Classical computed tomography (CT), including medical CT, has already demonstrated its efficiency in many practical applications. At the same time, the requirement of an all-round survey of the object, which is usually unattainable, makes it important to find alternative approaches with less rigid restrictions on the number of projections and the accessible views for observation. Recently it has been understood that one effective way to cope with an extreme lack of data is to introduce a priori knowledge, based upon the classical inverse theory (including the Maximum Entropy Method (MEM)) of the solution of ill-posed problems [1-6]. As shown in [6] for objects with binary structure, the number of projections necessary to obtain an image-restoration quality comparable to that of CT using the multistep reconstruction (MSR) method did not exceed seven and could be reduced even further.  [c.113]

The penetration of microwaves into various materials gives active microwave imaging a large potential for subsurface radar, civil engineering, etc. Several inverse-scattering theories have been proposed in the scientific literature. Among them, the simplest Born-type approach, which does not take into account multiple reflections, is valid for weakly scattering objects [1-2]. To improve the quality of reconstruction, a method based on the successive application of the perturbative algorithm was developed [3]. However, the inherent approximations of this approach are not overcome in the iterative scheme. Another class of algorithms aims to obtain the spatial distribution of permittivity by using numerical solutions of exact equations [4-6]. Unfortunately, the rate of convergence of the solution to the global minimum of the cost function depends on the actual contrast values, measurement error, etc. That is why the importance of a priori knowledge about the object under investigation is usually emphasized. In general, existing inversion algorithms suffer from serious problems when discontinuous profiles of high contrast, which are often encountered in practical applications, are to be reconstructed. Moreover, the frequency-swept imaging methods usually utilize reflection-coefficient data measured in a very broad frequency band starting from zero frequency [1-2, 4-5]. Such methods are inappropriate from an application point of view.  [c.127]

Abstract. All imaging applications for radiography and computerised tomography (CT) that are based on conventional, poly-energetic X-rays use - with very few exceptions - non-optimal parameter settings for the imaging task at issue. This is due to the complex relationship between imaging parameter settings and final image quality, which makes it empirically tedious to find optimal parameter settings - particularly for industrial applications, because of the wide range of objects, geometries and materials of interest.  [c.208]

Within this work, mathematical models of the data collection process for radiography and CT have been developed. The objective has been to develop a functioning simulation environment for the physics of the image/data collection that considers the poly-energetic dependence in the imaging process, including full X-ray energy spectra and detector energy response. The simulation environment is used for fast and cost-effective studies of how parameter variations affect final image quality and, indirectly, the defect detectability. In this particular case, the simulations have been applied to a high-resolution CT application to determine the optimal operator parameter settings that maximise the detectability of different defect types in circular objects and to predict the minimum detectable size as a function of object diameter. The simulation environment has also been used to correct for beam-hardening artefacts in CT images.  [c.208]

Of all NDT methods for quality control of materials, products, and welded and soldered joints, the most informative and promising are the radioscopic ones, which make it possible to obtain a visual image of the inner structure of a tested object in real time under any projection.  [c.449]

Due to the better sensitivity of IP-ND and IP, the neutron exposure times could be reduced by a factor of about 125 in comparison with the Gd screen/radiographic SR film method. In the latter technique a neutron exposure time of more than 90 min in a thermal neutron flux of 4.5·10 n cm⁻² s⁻¹ is required, while for obtaining a neutron image of comparable quality using IP-ND the exposure time could be reduced to only 10-40 seconds. In Fig. 1, NR images of a Lucite step wedge, an Fe step wedge, and BPI and SI neutron standard objects obtained with an IP-ND at neutron fluences of about 1.13·10 cm⁻² (Fig. 1a) and 5.6·10 cm⁻² (Fig. 1b) are presented. The corresponding values of the recorded signal of the free neutron beam are 655 PSL/mm² and only 26.6 PSL/mm², respectively. Both images are underexposed by factors of about 1.5 and 38, respectively. However, the image in Fig. 1a clearly reveals all three holes in the first 6 steps of the Fe wedge. The thickness of steps 1-9 is 3 mm, 2 mm, 1.5 mm, 1.0 mm, 0.57 mm, 0.5 mm, 0.375 mm, 0.25 mm and 0.125 mm, respectively. In step 7 only the 2T and 4T holes (T = step thickness; corresponding hole diameters 0.7 mm and 1.5 mm, respectively) and in step 8 only the 4T hole are visible. All steps in the Lucite calibration wedge can be resolved. Here the film neutron radiograph, using single-coated fine-grained radiographic film and a Gd metal screen and obtained at a neutron fluence of about 2·10 cm⁻², also reveals the 2T and 4T holes in step 8. The NR image obtained with IP-ND at a neutron fluence of only 5.6·10 cm⁻² still reveals all three holes in the first 4 steps of the Fe wedge and the first 6 steps of the Lucite wedge. The images of the BPI and SI neutron standards can still be used for neutron beam quality evaluation. The neutron image at such a low fluence is rather noisy; however, it demonstrates the feasibility of direct NR even with extremely weak neutron sources.  [c.508]

In NDT the key question is the reliability of the methods, verified by validation. For visual methods, the reliability - that is, the probability of visual recognition - depends on the optical properties of the scene containing the objects. These properties were determined quantitatively with an image-processing system for fluorescent magnetic particle inspection. This system was used to determine the quality of detection media, following the type testing of the European standard prEN 9934-2. Another application was the determination of visibility as a function of inspection parameters such as magnetization, application of the detection medium, inclination of the test surface, surface conditions, and viewing conditions. For standardized testing procedures and realistic parameter variations, the probability of detection may be determined on the basis of physiological lightness as a function of the defect dimensions.  [c.669]

NDT, as an objective method which can be applied for 100% control if necessary, is used for quality assessment of critical components all over the world. But it is necessary to assure the quality of the implementation of NDT to ensure the detection of defects and to achieve reliable results.  [c.953]

The specific character of NDT related to the quality assessment of safety-critical products and objects requires constant analysis and continuous improvement of processes and their interconnection. Sometimes the interaction of processes is very complicated (Figure 3); therefore, the processes have to be systematized and simplified where possible to realize total quality management in NDT.  [c.954]

A comprehensive approach to quality assurance in NDT would combine both approaches. It creates and maintains conditions for the protection of people, the safe operation of objects and products, and the preservation of the environment.  [c.956]

The main task of non-destructive testing is to provide objective and reliable information about the object being examined when assessing its quality. Calibration of NDT equipment before testing, adjustment of  [c.958]

The quality may suffer from the presence of so-called outliers, i.e., compounds that have low similarity to the rest of the dataset. Another negative feature may be just the contrary: the dataset may contain too many highly similar objects.  [c.205]

Once the quality of the dataset is defined, the next task is to improve it. Again, one has to remove outliers, find and remove redundant objects (as they deliver no additional information), and finally select the optimal subset of descriptors.  [c.205]
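As an illustration of the outlier criterion described above (low similarity to the rest of the dataset), the following sketch flags objects whose mean pairwise similarity falls below a chosen threshold. It is a minimal, hypothetical example - the similarity matrix, threshold, and function names are not taken from the cited source.

```python
import numpy as np

def flag_outliers(similarity_matrix, min_mean_similarity=0.3):
    """Flag objects whose average similarity to all other objects
    falls below a threshold (one possible outlier criterion)."""
    n = similarity_matrix.shape[0]
    # exclude the self-similarity of 1.0 on the diagonal
    mean_sim = (similarity_matrix.sum(axis=1) - 1.0) / (n - 1)
    return np.where(mean_sim < min_mean_similarity)[0]

# Hypothetical pairwise similarity matrix; object 2 is dissimilar to the rest
S = np.array([[1.0, 0.8, 0.1, 0.7],
              [0.8, 1.0, 0.2, 0.6],
              [0.1, 0.2, 1.0, 0.1],
              [0.7, 0.6, 0.1, 1.0]])
print(flag_outliers(S))  # -> [2]
```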

The "feedback loop in the analytical approach is maintained by a quality assurance program (Figure 15.1), whose objective is to control systematic and random sources of error.The underlying assumption of a quality assurance program is that results obtained when an analytical system is in statistical control are free of bias and are characterized by well-defined confidence intervals. When used properly, a quality assurance program identifies the practices necessary to bring a system into statistical control, allows us to determine if the system remains in statistical control, and suggests a course of corrective action when the system has fallen out of statistical control.  [c.705]

Two additional aspects of a quality control program deserve mention. The first is the physical inspection of samples, measurements and results by the individuals responsible for collecting and analyzing the samples. For example, sediment samples might be screened during collection, and samples containing foreign objects, such as pieces of metal, be discarded without being analyzed. Samples that are discarded can then be replaced with additional samples. When a sudden change in the  [c.707]

Miniature Electrical Components. In the manufacture of miniature transformers and motor armatures, it is often necessary that the winding be electrically isolated from the bobbin. The insulating varnish of fine copper wire often does not survive the rigors of the winding operation, and, particularly in the case of the smaller devices, the bobbin requires insulation. However, a thicker insulation than is absolutely necessary takes up space that otherwise could be used for more turns of wire. Thus, thickness control in the coating of the bobbin means better performance of the finished device. Parylene is used in the manufacture of high quality miniature stepping motors, such as those used in wristwatches, and as a coating for the ferrite cores of pulse transformers, magnetic tape-recording heads, and miniature inductors, where the abrasiveness of the ferrite is particularly damaging. In the coating of complex, tiny objects such as these, the VDP process has an extra labor-saving advantage. It is possible to coat thousands of such articles simultaneously by tumbling them during the VDP operation (65).  [c.442]

Nearly every chemical manufacturing operation requires the use of separation processes to recover and purify the desired product. In most circumstances, the efficiency of the separation process has a significant impact on both the quality and the cost of the product (1). Liquid-phase adsorption has long been used for the removal of contaminants present at low concentrations in process streams. In most cases, the objective is to remove a specific feed component; alternatively, the contaminants are not well defined, and the objective is the improvement of feed quality as defined by color, taste, odor, and storage stability (2-5) (see Wastes, industrial; Water, industrial water treatment).  [c.291]

J. J. Powers and H. R. Moskowitz, J. Soc. Test. Mater. Spec. Tech. Publ. 594, 35 (1974); R. A. Scanlan, ed., Flavor Quality: Objective Measurement, ACS Symposium Series No. 51, Washington, D.C., 1977.  [c.7]

Weed Management Strategies. The paradigm that all noncrop plant populations in a field should be controlled, regardless of their actual impact on crop yield and quality, is not justifiable. The objective a priori determination of which plant populations require control and which do not directly reduces the economic, environmental, and social costs associated with weed control and can be considered an innovative approach to weed management. For example, some noncrop plant populations do not significantly hinder production. In some developing areas of the world, producers have found uses for noncrop plants that would otherwise be considered weeds (451), and many weeds are both edible and nutritious (452). In aquaculture systems, certain highly problematic algal and bacterial weeds are also essential to the overall stability and productivity of the production system (453).  [c.55]

Transfer function models are linear in nature, but chemical processes are known to exhibit nonlinear behavior. One could use the same type of optimization objective as given in Eq. (8-26) to determine parameters in nonlinear first-principle models, such as Eq. (8-3) presented earlier. Also, nonlinear empirical models, such as neural network models, have recently been proposed for process applications. The key to the use of these nonlinear empirical models is having high-quality process data, which allows the important nonlinearities to be identified.  [c.725]
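The paragraph above refers to fitting nonlinear empirical models to high-quality process data. A minimal sketch of that kind of least-squares parameter estimation is shown below; the model form (a simple saturating response), the data, and the parameter names are invented for illustration and are not Eq. (8-26) or Eq. (8-3) from the source.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical nonlinear empirical model: y = a * (1 - exp(-b * t))
def model(t, a, b):
    return a * (1.0 - np.exp(-b * t))

# Hypothetical process data (time, response)
t = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])
y = np.array([0.0, 0.9, 1.5, 2.2, 2.7, 2.9])

# Least-squares estimate of the model parameters
params, _ = curve_fit(model, t, y, p0=[1.0, 0.5])
print("estimated a, b:", params)
```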

Typical problems in chemical engineering process design or plant operation have many possible solutions. Optimization is concerned with selecting the best among the entire set by efficient quantitative methods. Computers and associated software make the computations involved in the selection feasible and cost-effective. Engineers work to improve the initial design of equipment and strive for enhancements in the operation of the equipment once it is installed in order to realize the most production, the greatest profit, the minimum cost, the least energy usage, and so on. In plant operations, benefits arise from improved plant performance, such as improved yields of valuable products (or reduced yields of contaminants), reduced energy consumption, higher processing rates, and longer times between shutdowns. Optimization can also lead to reduced maintenance costs, less equipment wear, and better staff utilization. It is helpful to systematically identify the objective, constraints, and degrees of freedom in a process or a plant if such benefits as improved quality of designs, faster and more reliable troubleshooting, and faster decision making are to be achieved.  [c.741]
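As a toy illustration of stating the objective, constraints, and degrees of freedom explicitly, the sketch below maximises a hypothetical profit function of two operating variables subject to an invented capacity constraint; none of the numbers or names come from the text.

```python
from scipy.optimize import minimize

# Degrees of freedom: two operating variables x[0], x[1]
# Objective: maximise profit (here, minimise its negative)
def neg_profit(x):
    return -(5.0 * x[0] + 3.0 * x[1] - 0.1 * x[0] ** 2 - 0.05 * x[1] ** 2)

# Constraint: total feed cannot exceed a hypothetical capacity of 40 units
constraints = [{"type": "ineq", "fun": lambda x: 40.0 - (x[0] + x[1])}]
bounds = [(0.0, 30.0), (0.0, 30.0)]  # physical limits on each variable

result = minimize(neg_profit, x0=[10.0, 10.0], bounds=bounds,
                  constraints=constraints)
print("optimal operating point:", result.x, "profit:", -result.fun)
```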

Regulatory Controls. The objective of this layer is to operate the process at or near the targets supplied by others, be it the process operator or a higher layer in the hierarchy. In order to achieve consistent process operations, a high degree of automatic control is required from the regulatory layer. The direct result is a reduction in variance in the key process variables. More uniform product quality is an obvious benefit. However, consistent process operation is a prerequisite for optimizing the process operations. To ensure success for the upper-level functions, the first objective of any automation effort must be to achieve a high degree of regulatory control.  [c.771]

Clean Air Act of 1970. The Clean Air Act of 1970 was founded on the concept of attaining National Ambient Air Quality Standards (NAAQS). Data were accumulated and analyzed to establish the quality of the air, identify sources of pollution, determine how pollutants disperse and interact in the ambient air, and define reductions and controls necessary to achieve air-quality objectives.  [c.2155]

The "feedback loop in the analytical approach is maintained by a quality assurance program (Figure 15.1), whose objective is to control systematic and random sources of error.The underlying assumption of a quality assurance program is that results obtained when an analytical system is in statistical control are free of bias and are characterized by well-defined confidence intervals. When used properly, a quality assurance program identifies the practices necessary to bring a system into statistical control, allows us to determine if the system remains in statistical control, and suggests a course of corrective action when the system has fallen out of statistical control.  [c.705]

The penetration of unleaded fuels will continue rapidly in Europe in the coming years. Figure 5.8 portrays a scenario predicting the complete disappearance of leaded motor fuels by 2000-2005. Regular gasolines will also be eliminated very soon. The remaining uncertainty for unleaded fuels is the distribution between Eurosuper and Superplus. From a strictly regulatory point of view, the predictions lean towards a predominance of Eurosuper, defined by the European Directive of 20 March 1985, still in force. Nevertheless, the automobile manufacturers desire to have available, abundantly if possible, a motor fuel of high quality so that they can more easily reach their objective of reducing fuel consumption in the early years following 2000.  [c.210]

The most modern model of this type is the tomograph Ultrafast CT SIC 311 by Siemens. This system for scanning 3D objects needs a minimum time of 50 ms for a slice at a thickness of 7 mm. The number of inspected slices can be set by the operator (standard: 40). It is also possible to choose the scanning mode: step volume scanning with 40 slices in 30 s or continuous scanning with 40 slices in 18.6 s. The inspection time per slice can also be changed. These parameters influence the quality and resolution of the reconstructed image [4].  [c.216]

Industrial radiography occupies an enviable position in the field of Non Destructive Testing (NDT). Its main assets are a unique image quality, providing the possibility to trace even the smallest defects, a high reliability, and permanently archivable results. Nonetheless, the growing environmental awareness, the more rigid ecological legislation and the increasing costs for waste disposal may threaten the enviable position of industrial radiography. The main challenge for the manufacturer of industrial radiography consists in meeting the new ecological demands without endangering its acknowledged qualities. Even more, we think that in the longer term radiography is best served when we manage to combine the ecological needs with an additional progress in terms of user-friendliness and speed. At Agfa, the combination of ecology with a further improvement of the assets of radiography is one of the main objectives driving the innovations we introduce in the market of industrial radiography. In the past, a great number of actions have been undertaken by Agfa to minimize the environmental burden. Apart from these actions, which have largely been implemented, a number of ecological improvements are planned. In this paper we want to focus on one of these: the strategy to tackle the problem of silver in the rinsing water.  [c.604]

The main peculiarity of NDT and TD means for safety development is their continuous intellectualization. Objective, high-level information about the condition of the inspected object, the state of the environment, and other events may be obtained with the help of complex systems. Such systems may be integrated on the basis of different methods corresponding to specific physical phenomena. Diagnostic systems may be combined in accordance with different physical investigation methods. Still, the main problem remains the establishment of a statistical database. This database must hold information about defects, accidents, and emergency situations, as well as the results of developing the scientific physical-mathematical basis of quality metering, that is, the quantitative estimation of the quality and condition of objects and media. At the same time, the task of processing multidimensional and multiparametric information must be solved - information received from various physical fields and phenomena through improved models of transducers, measuring channels, and algorithms.  [c.915]

Microscopes are imaging systems and, hence, the image quality is determined by lens errors, by structures in the image plane (e.g., picture elements of CCD cameras) and by diffraction. In addition, the visibility of objects with low contrast suffers from various noise sources such as noise in the illuminating system (shot noise), scattered light and non-uniformities in the recording media. Interest often focuses on the achievable resolution, and discussions on limits to microscopy are then restricted to those imposed by diffraction (the so-called Abbe limit), assuming implicitly that lenses are free of errors and that the visual system or the image sensors are ideal. However, even under these conditions the Abbe limit of resolution may not be reached if the contrast is insufficient and the noise is high.  [c.1656]

Under appropriate contrast and high light intensity, the resolution of planar object structures is diffraction limited. Noise in the microscopic system may also be important and may reduce resolution if light levels and/or contrasts are low. This implies that the illumination of the object has to be optimal and that the contrast of rather transparent or highly reflecting objects has to be enhanced. This can be achieved by an appropriate illumination system, phase- and interference-contrast methods and/or by data processing if electronic cameras (or light sensors) and processors are available. Last but not least, for low-light images, efforts can be made to reduce the noise either by averaging the data of a multitude of images or by subtracting the noise. Clearly, if the image is inspected by the eye, the number of photons, and hence the noise, are determined by the integration time of the eye of about 1/30 s; the signal/noise ratio can then only be improved, if at all possible, by increasing the light intensity. Hence, electronic data acquisition and processing can be used advantageously to improve image quality, since integration times can be significantly extended and noise suppressed.  [c.1659]
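Frame averaging, mentioned above as one way to reduce noise with electronic data acquisition, can be demonstrated with simulated data: the residual noise of the averaged image drops roughly as 1/sqrt(N) with the number N of frames. The scene and noise level below are purely synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical noise-free object: a bright square on a dark background
scene = np.zeros((64, 64))
scene[24:40, 24:40] = 1.0

def noisy_frame(scene, sigma=0.5):
    """One acquisition of the scene with additive Gaussian noise."""
    return scene + rng.normal(0.0, sigma, scene.shape)

for n in (1, 16, 256):
    averaged = np.mean([noisy_frame(scene) for _ in range(n)], axis=0)
    residual_noise = np.std(averaged - scene)
    print(f"{n:4d} frames -> residual noise {residual_noise:.3f}")
```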

An example of how redundancy in a dataset influences the quality of learning follows. The problem involved classification of objects in a dataset by their biological activities with the help of Kohonen Self-Organizing Maps. Three subgroups were detected. The first one contained highly active compounds, the second subgroup comprised compounds with low activity, and the rest, the intermediately active compounds, fell into the third subgroup. There were 91 highly active, 540 intermediately active, and 492 inactive compounds. As one can see, the dataset was not balanced, in the sense that the intermediately active and the inactive compounds outnumbered the active compounds. A first attempt at balancing the dataset by means of a mechanical (i.e., by chance) removal of the intermediately active and the inactive compounds decreased the quality of learning. To address this problem, an algorithm for finding and removing redundant compounds was elaborated.  [c.207]
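The cited passage does not spell out the redundancy-removal algorithm itself; a common greedy scheme, sketched here purely as an assumption, keeps an object only if it is not too similar (by descriptor correlation) to any object already kept.

```python
import numpy as np

def remove_redundant(descriptors, threshold=0.95):
    """Greedy redundancy filter: keep an object only if its correlation
    with every already-kept object is below the threshold."""
    kept = []
    for i, x in enumerate(descriptors):
        if all(abs(np.corrcoef(x, descriptors[j])[0, 1]) < threshold
               for j in kept):
            kept.append(i)
    return kept

# Hypothetical descriptor vectors; rows 0 and 1 are nearly identical
data = np.array([[1.0, 2.0, 3.0, 4.0],
                 [1.0, 2.1, 3.0, 4.1],
                 [4.0, 1.0, 0.5, 2.0]])
print(remove_redundant(data))  # row 1 is dropped as redundant -> [0, 2]
```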

Usually, the denominator, if present in a similarity measure, is just a normalizer; it is the numerator that is indicative of whether similarity or dissimilarity is being estimated, or both. The characteristics chosen for the description of the objects being compared are interchangeably called descriptors, properties, features, attributes, qualities, observations, measurements, calculations, etc. In the formulations above, the terms "matches" and "mismatches" refer to qualitative characteristics, e.g., binary ones (those which take one of two values: 1 (present) or 0 (absent)), while the terms "overlap" and "difference" refer to quantitative characteristics, e.g., those whose values can be arranged in order of magnitude along a one-dimensional axis.  [c.303]
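For binary (present/absent) characteristics of the kind described, the Tanimoto coefficient is a standard example of a similarity measure in which the numerator counts matches and the denominator acts as a normalizer. The fingerprints below are invented for illustration.

```python
def tanimoto(a, b):
    """Tanimoto similarity of two binary descriptor vectors:
    matches (common 1-bits) divided by the normalising denominator."""
    common = sum(x & y for x, y in zip(a, b))
    total = sum(a) + sum(b) - common
    return common / total if total else 1.0

# Hypothetical binary fingerprints of two objects
mol_a = [1, 0, 1, 1, 0, 1]
mol_b = [1, 1, 1, 0, 0, 1]
print(tanimoto(mol_a, mol_b))  # 3 common bits / 5 distinct bits = 0.6
```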

Water Quality Assessment. Assessments of the effects of effluents on receiving streams until the mid-1970s were more subjective than objective (32,33). Since then, changes in attitude within the aquatic life-science field and the regulatory system have necessitated a restructuring of the design and foundations for effluent-impact assessments in receiving waters. For example, government regulations have proceeded from a system of stream standards-based regulations to effluent-based regulations involving strict requirements on various pollution parameters. Pressure is being exerted to go back to the receiving water system as the ultimate test for new and more stringent discharge-control measures. The increasing knowledge of the interrelationships between the various biological, chemical, and physical components of aquatic systems has provided significant restructuring of field assessment programs which are designed to analyze effluent impact. A single organism as an indicator of stream quality has been replaced by community compositional and structural analysis. Thus, total effluent effects on a broad scale can be realized. Other measured parameters are algal assays, fish surveys, sediment mapping, plume mapping, sediment oxygen demand, and socioeconomic impacts.  [c.12]

The visual sharpness of a recorded image is yet another subjective measure of image quality. The impression of sharpness or crispness is achieved when the boundaries and edges of the objects composing the image are clear and well defined. When high resolution of fine detail is required in the image, then, in addition to contrast and granularity, sharpness becomes a particularly important image quality parameter. The presence of a light-scattering grain population, resulting from a given set of emulsion-making conditions, can have a pronounced effect on the photographic response and the quality of the photographic image. The light-scattering and light-absorption properties of the silver halide grains within a coated layer establish the vertical and lateral distribution of light within the layer (264,327-330), which significantly affects the sharpness of the photographic image. In addition to the effects of the lateral scattering of light by emulsion grains, the optics of camera lens systems and the optics of enlargers also influence the path of photons and the apparent sharpness of the final photographic image. The ability of a photographic material to record fine detail is a function of development effects as well as of optical effects. The lateral diffusion of development by-products can produce adjacency effects that enhance the apparent sharpness of a recorded image.  [c.460]

Sharpness is evaluated by a number of methods. It is often measured as the ability of a recorder to produce an image of very narrow and closely spaced lines. In such an analysis, the resolving power of a recording film is determined by photographing a test object composed of a series of alternating black-and-white lines of increasing narrowness set in geometric patterns. The last visually distinguishable set of lines is recorded, in lines per millimeter, as the resolving power of the recorder under the particular test conditions. The resolving power of a photographic material is determined by granularity and contrast as well as by effects of image spread. The modulation transfer function (MTF) is a more objective and quantitatively interpretable measure of the quality of sharpness. For MTF analyses, the recorder is exposed with a band of light that varies sinusoidally in intensity. The frequency of the sine wave continuously increases along the length of the exposure band. This spatial frequency is 2-500 cycles/mm. The density modulation of a microdensitometric trace along the resultant image can be expressed as the percentage of input modulation at each frequency (34). The MTF is independent of the processing conditions for a linear recorder and can be translated mathematically through Fourier analysis to the spread function. The spread function represents the spatial distribution of light within a coating that results from the isotropic scattering of an infinitesimal incident point of light. Analysis of MTF data can also be used to deduce the image density distribution produced by knife-edge exposures; it is in general more reliable than direct analyses of such edge exposures.  [c.460]
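A numerical sketch of the MTF idea - output modulation divided by input modulation at each spatial frequency - is given below. The blur is modelled, purely as an assumption, by a Gaussian spread function; the spread value and frequencies are arbitrary and not from the cited source.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

x = np.linspace(0.0, 1.0, 4000)   # position along the exposure band, in mm
sigma_mm = 0.01                   # hypothetical spread of the recorder
dx = x[1] - x[0]

for freq in (5, 20, 50, 100):     # spatial frequency, cycles/mm
    target = 0.5 + 0.5 * np.sin(2 * np.pi * freq * x)  # input modulation = 1
    image = gaussian_filter1d(target, sigma_mm / dx)   # blurred recording
    out_mod = (image.max() - image.min()) / (image.max() + image.min())
    print(f"{freq:3d} cycles/mm -> MTF ~ {out_mod:.2f}")
```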

Clinical chemistry, initiated in the 1940s, involves the biochemical testing of body fluids to provide objective information on which to base clinical diagnosis. Between 1946 and 1975 clinical testing had a period of rapid growth in the United States, when the volume of tests increased by about 15% a year. The ever-increasing demand for high quality, routine clinical testing stimulated the development of automated techniques, and as early as the 1960s automation in the clinical laboratory was the rule rather than the exception.  [c.391]

Regulatory Direction. EPA and states are directed by the Clean Water Act to develop programs to meet the Act's stated objective "to restore and maintain the chemical, physical, and biological integrity of the Nation's waters" [Sec. 101(a)]. Efforts to date have emphasized clean water quite literally, by focusing on the chemical makeup of discharges and their compliance with chemical water-quality standards established for surface water bodies. These programs have successfully addressed many water-pollution problems, but they are not sufficient to identify and address all of them. A large gap in the current regulatory scheme is the absence of a direct measure of the condition of the biological resources that we are intending to protect (the biological integrity of the water body). Without such a measure it will be difficult to determine whether our water-management approaches are successful in meeting the intent of the Act. Taken together, chemical, physical, and biological integrity are equivalent to the ecological integrity of a water body. It is highly likely the future will see interpretation of biological integrity and ecological integrity, and the management of ecological systems, assuming a much more prominent role in water-quality management.  [c.2162]

Nowadays, in the light of searching for solutions to ecological problems, with improvements in the standards for trace determination of metals in new materials, and with tightening food-product and water-quality standards, the detection limits for trace metals and the accuracy of the determination methods become important. Lead, cadmium and chromium are super toxicants, and their concentrations in waste, waters, air, soil and food products have to be controlled. One of the basic methods for determining these elements in objects of complex chemical composition is atomic absorption spectrometry. At present, conventional aqueous solutions are giving way to other media. Aqueous-organic and non-aqueous solutions have been used for a long time; currently the use of supercritical fluids, ionic liquids and so-called organized media such as aqueous micellar surfactant solutions is more common. Therefore, the objective of the presented work is to increase the sensitivity and selectivity of the atomic absorption determination of lead, cadmium and chromium, as well as to establish the possibility of a transition from the air-acetylene flame to a low-temperature flame (propane-butane-air). A systematic study of the influence of surfactant nature (cationic, anionic, non-ionic) and concentration on the analytical signal in the atomic absorption determination of lead, cadmium and chromium has been conducted on model solutions. The results have been compared, and the mixtures of modifiers that maximize the analytical signal have been selected. Addition of a surfactant-based modifier to the analyzed solutions reduces the viscosity, surface tension and drop size of the sprayed solution, increases the effectiveness of spraying, and changes the oxidation-reduction properties of the flame. A redistribution of the ions of the determined components takes place, which leads to their concentration primarily in small drops. The sensitivity of the atomic absorption determination of the metals increases by a factor of 2-3. The influence of mineral acids and other accompanying components on the analytical signal of lead, cadmium and chromium during their atomic absorption determination in a low-temperature flame has been studied. The addition of the modifier to the analyzed solutions improves the selectivity of the atomic absorption determination. A modifier-based methodology has been developed for the atomic absorption determination of cadmium and chromium in food products and of chromium in sewage waters. The detection limits have been evaluated.  [c.160]

Assessment of the quality of results used the criteria proposed in the international programs Global Geochemical Mapping and Proficiency Testing in Geoanalytical Sciences (GeoPT). In addition, we proposed a criterion that takes into account the heterogeneity of the distribution of microelements in natural study objects.  [c.169]


See pages that mention the term "Objective quality": [c.63], [c.278], [c.1655], [c.1658], [c.564]
Automotive quality systems handbook (2000) -- [c.103]