Big Chemical Encyclopedia


Uncertainty tolerance

Tolerance verification defines inspection planning and metrological procedures for functional requirements, functional specifications, and manufacturing specifications. It is very important to consider tolerance verification early in the design activities in order to assess uncertainties. Tolerance verification makes it possible to close the process loop, to check product conformity, and to verify assumptions made by the designer. [Pg.1232]

For general-purpose temperature measurements, the user will generally select a commercial platinum RTD (i.e., an IPRT) which conforms to existing codes and standards for the relationship between temperature and resistance. Several different standards govern the performance of IPRTs. By far the most widely used is the International Electrotechnical Commission (IEC) standard 60751, which is essentially equivalent to the Deutsches Institut für Normung (DIN) standard 43760 and the American Society for Testing and Materials (ASTM) standard E1137. The uncertainty tolerances on commercial RTD elements are commonly specified as Class A or... [Pg.2936]
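The class tolerances mentioned above can be sketched numerically. The coefficients below are the values commonly quoted for IEC 60751 Class A and Class B elements; treat them as illustrative and verify against the current edition of the standard before use:

```python
def iec60751_tolerance(t_celsius, cls="A"):
    """Tolerance (in deg C) on an industrial platinum RTD, using the
    commonly quoted IEC 60751 classes:
      Class A: +/-(0.15 + 0.002*|t|)
      Class B: +/-(0.30 + 0.005*|t|)
    """
    coeffs = {"A": (0.15, 0.002), "B": (0.30, 0.005)}
    base, slope = coeffs[cls]
    return base + slope * abs(t_celsius)

# At 100 deg C a Class A element is allowed roughly +/-0.35 deg C
tol_a_100 = iec60751_tolerance(100.0, "A")
tol_b_100 = iec60751_tolerance(100.0, "B")
```

The widening of the tolerance band with |t| is why the class designation alone does not fix the measurement uncertainty; the operating temperature matters too.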

In deciding on uncertainty tolerances, the fitness-for-purpose criterion becomes crucially important. For example, when developing and validating a bioanalytical method, the achievable accuracy and precision may lead the analyst to conclude that the current method is not suitable relative to the original assay requirements. In that case the method in its current form might be discarded altogether, used as the basis for further development of a method that does satisfy the specific criteria, and/or retained as a useful method that can be applied to studies conducted in a less demanding environment. [Pg.466]

Confirmation criteria should correspond to a specified uncertainty tolerance... [Pg.467]

The previous sections in this chapter addressed the process of validating the method and documenting the results in a final validation report. When reviewing the analytical data for the study samples, the validation report and associated tables and supporting documentation must be carefully reviewed for accuracy and scientific content, to ensure that the necessary experiments have been run and that the data support the original uncertainty tolerances (Section 9.3). In addition to the validation report, the final method must be reviewed and approved for sample analysis. Specific acceptance criteria that will be used during sample analysis should be documented in the approved method or equivalent SOP. [Pg.570]

Versluis, E., van Asselt, M.B.A., Fox, T. and Hommels, A. (2010) 'The EU Seveso regime in practice: From uncertainty blindness to uncertainty tolerance', Journal of Hazardous Materials 184: 627-631. [Pg.91]

The maximum temperature cross which can be tolerated is normally set by rules of thumb, e.g., FT >= 0.75. It is important to ensure that FT >= 0.75, since any violation of the simplifying assumptions used in the approach tends to have a particularly significant effect in areas of the FT chart where the slopes are steep. Any uncertainties or inaccuracies in design data also have a more significant effect where slopes are steep. Consequently, to be confident in a design, those parts of the FT chart where slopes are steep should be avoided, irrespective of whether FT >= 0.75. [Pg.223]
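The FT rule of thumb above can be checked numerically. The function below implements the correction-factor expression commonly given in heat-exchanger texts for a 1 shell pass, 2 tube pass unit (valid for R != 1); the stream temperatures in the example are invented for illustration:

```python
import math

def ft_1_2(T1, T2, t1, t2):
    """FT correction factor for a 1 shell pass, 2 tube pass exchanger.

    T1, T2: hot-stream inlet/outlet; t1, t2: cold-stream inlet/outlet.
    Uses the conventional parameters R = (T1-T2)/(t2-t1) and
    P = (t2-t1)/(T1-t1); the closed form below assumes R != 1.
    """
    R = (T1 - T2) / (t2 - t1)
    P = (t2 - t1) / (T1 - t1)
    s = math.sqrt(R * R + 1.0)
    num = s * math.log((1.0 - P) / (1.0 - R * P))
    den = (R - 1.0) * math.log((2.0 - P * (R + 1.0 - s)) /
                               (2.0 - P * (R + 1.0 + s)))
    return num / den

# Illustrative temperatures: hot 300 -> 200, cold 100 -> 150
ft = ft_1_2(300.0, 200.0, 100.0, 150.0)
acceptable = ft >= 0.75   # the rule-of-thumb screen from the text
```

Because FT falls off a cliff as the temperature cross grows, a design that barely clears 0.75 can still sit on a steep part of the chart, which is exactly the situation the excerpt warns against.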

Using the tolerance values for pipets and volumetric flasks given in Table 4.2, the overall uncertainties in Ma and Mb are... [Pg.70]

1.000 × 10^-2, 1.000 × 10^-3, 1.000 × 10^-4, and 1.000 × 10^-5 M from a 0.1000 M stock solution. Calculate the uncertainty for each solution using a propagation of uncertainty, and compare to the uncertainty if each solution was prepared by a single dilution of the stock solution. Tolerances for different types of volumetric glassware and digital pipets are found in Tables 4.2 and 4.4. Assume that the uncertainty in the molarity of the stock solution is ±0.0002. [Pg.131]
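The propagation-of-uncertainty calculation asked for above can be sketched as follows. Since Tables 4.2 and 4.4 are not reproduced here, the pipet and flask tolerances below are assumed illustrative Class A values, not the book's figures:

```python
import math

def rel_u(value, tolerance):
    # relative uncertainty contributed by an absolute tolerance
    return tolerance / value

def dilution_rel_u(steps, u_stock_rel):
    """Combine relative uncertainties for a chain of dilutions.

    steps: list of (pipet_volume, pipet_tol, flask_volume, flask_tol);
    each dilution adds its pipet and flask terms in quadrature.
    """
    total_sq = u_stock_rel ** 2
    for vp, up, vf, uf in steps:
        total_sq += rel_u(vp, up) ** 2 + rel_u(vf, uf) ** 2
    return math.sqrt(total_sq)

# Assumed tolerances: 10 mL pipet +/-0.02 mL, 100 mL flask +/-0.08 mL
step = (10.0, 0.02, 100.0, 0.08)
u_stock = 0.0002 / 0.1000             # stock: 0.1000 +/- 0.0002 M

# Three successive 10x dilutions (stock -> 1.000e-4 M) vs. the stock alone
serial_u = dilution_rel_u([step] * 3, u_stock)
```

Each serial step stacks two more squared terms under the square root, which is why the relative uncertainty of the most dilute solution depends on how the dilution chain is designed, not just on the stock.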

Economy of time and resources dictates using the smallest-sized facility possible to assure that projected larger-scale performance is within tolerable levels of risk and uncertainty. Minimum sizes of such laboratory and pilot units are often set by operability factors not directly involving internal reactor features. These include feed and product transfer line diameters, inventory control in feed and product separation systems, and preheat and temperature maintenance requirements. Most of these extraneous factors favor large units. Large industrial plants can be operated with high service factors for years, whereas it is not unusual for pilot units to operate at sustained conditions for only days or even hours. [Pg.519]

Virtually all design parameters such as tolerances, material properties and service loads exhibit some statistical variability and uncertainty that influence the adequacy of the design. A key requirement in the probabilistic approach is detailed knowledge... [Pg.249]

Classicists believe that probability has a precise value; the uncertainty lies in finding that value. Bayesians believe that probability is not precise but is distributed over a range of values, owing to heterogeneities in the database, past histories, construction tolerances, etc. This difference is subtle, but it changes the two approaches. [Pg.50]
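The contrast can be made concrete with a small numerical sketch (the failure counts below are invented for illustration). The classical view summarizes the data as one point estimate; the Bayesian view, here with a uniform prior, yields a whole posterior distribution over the parameter:

```python
# Illustrative data: 3 failures observed in 100 demands
failures, trials = 3, 100

# Classical view: the failure probability p has one true value,
# and the data provide a point estimate of it.
p_hat = failures / trials

# Bayesian view: starting from a uniform Beta(1, 1) prior, the
# posterior is Beta(1 + failures, 1 + trials - failures); p is
# described by a distribution, not a single number.
a = 1 + failures
b = 1 + trials - failures
post_mean = a / (a + b)                              # posterior mean
post_var = a * b / ((a + b) ** 2 * (a + b + 1))      # posterior variance
```

The posterior mean (about 0.039) differs slightly from the point estimate (0.030), and the posterior variance quantifies the residual spread that the classical point estimate leaves implicit.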

When setting the goals of a measurement project, it has to be asked: What exactly has to be determined? What are the final quantities required, and what is the inaccuracy that can be tolerated in these quantities? Only when these factors are known can an analysis be made in which the quantities to be measured and the measurement accuracy of each quantity are defined. This analysis is based on the measurement method selected and on the computation of measurement uncertainties. Usually the analysis of measurement uncertainties is made after monitoring; however, making it beforehand is part of good planning practice. This approach ensures that the correct information with the desired accuracy is achieved. [Pg.1120]

To properly use failure rate data, the engineer or risk analyst must understand failure rates, their origin, and their limitations. This chapter discusses the types and sources of failure rate data, the failure model used in computations, the confidence, tolerance, and uncertainties in the development of failure rates, and the taxonomies which can store the data and influence their derivation. [Pg.7]

Equipment failure rate data points carry varying degrees of uncertainty expressed by two measures, confidence and tolerance. Confidence, the statistical measurement of uncertainty, expresses how well the experimentally measured parameter represents the actual parameter. Confidence in the data increases as the sample size is increased. [Pg.11]
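The effect of sample size on confidence can be sketched numerically. The normal approximation to the Poisson count used below is a common simplification (an assumption here, not a method prescribed by the text), and the failure counts are invented for illustration:

```python
import math

def failure_rate_ci(n_failures, hours, z=1.96):
    """Point estimate and approximate 95% confidence interval for a
    constant failure rate, using a normal approximation to the
    Poisson failure count (reasonable when n_failures is not tiny)."""
    lam = n_failures / hours
    half = z * math.sqrt(n_failures) / hours
    return lam, max(0.0, lam - half), lam + half

# Same underlying rate, ten times the operating experience:
lam1, lo1, hi1 = failure_rate_ci(5, 1.0e5)     # 5 failures in 1e5 h
lam2, lo2, hi2 = failure_rate_ci(50, 1.0e6)    # 50 failures in 1e6 h
```

Both samples give the same point estimate (5e-5 per hour), but the larger sample's interval is markedly narrower, which is the sense in which confidence increases with sample size.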

Tolerance uncertainty arises from the physical and the environmental differences among members of differing equipment samples when failure rate data are aggregated to produce a final generic data set. Increasing the number of sources used to obtain the final data set will most likely increase the tolerance uncertainty. [Pg.11]

However, the data that are contributed to a generic failure rate database are rarely for identical equipment and may represent many different circumstances. Generic data must be chosen carefully, because aggregating generic and plant-specific data may not improve the statistical uncertainty associated with the final data point, owing to the change in tolerance. [Pg.12]
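The tolerance effect described in the last two excerpts can be illustrated with a toy calculation (all rates invented). Pooling rates from dissimilar plants leaves the mean roughly in place but widens the spread of the aggregated set:

```python
import statistics

# Failure rates (per 1e6 h) for the "same" valve type.
# One plant's homogeneous sample:
single_plant = [4.8, 5.1, 5.0, 4.9, 5.2]
# A generic set aggregated from several plants with different
# service conditions, maintenance practices, and environments:
many_sources = [4.8, 5.1, 5.0, 4.9, 5.2,
                1.2, 9.7, 3.4, 12.0, 0.8]

mean_single = statistics.mean(single_plant)
mean_pooled = statistics.mean(many_sources)
spread_single = statistics.stdev(single_plant)   # small: like-for-like data
spread_pooled = statistics.stdev(many_sources)   # large: tolerance widened
```

More data points reduce the purely statistical (confidence) uncertainty, yet the physical heterogeneity of the pooled sources inflates the tolerance, so the aggregated point is not automatically "better".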

Uncertainty A measure of doubt that considers confidence and tolerance. [Pg.288]

Two properties, in particular, make Feynman's approach superior to Benioff's: (1) it is time independent, and (2) interactions between all logical variables are strictly local. It is also interesting to note that in Feynman's approach, quantum uncertainty (in the computation) resides not in the correctness of the final answer but, effectively, in the time it takes for the computation to be completed. Peres [peres85] points out that quantum computers may be susceptible to a new kind of error since, in order to actually obtain the result of a computation, there must at some point be a macroscopic measurement of the quantum mechanical system to convert the data stored in the wave function into useful information; any imperfection in the measurement process would lead to an imperfect data readout. Peres overcomes this difficulty by constructing an error-correcting variant of Feynman's model. He also estimates the minimum amount of entropy that must be dissipated at a given noise level and tolerated error rate. [Pg.676]

The initial sealing force, item 4, should be roughly known from initial design studies, but it may be subject to considerable uncertainty and must be based on an assumption of the worst combination of seal-housing tolerance variables. The residual sealing force may be estimated if there is knowledge of the stress-relaxation rate of the elastomer, using... [Pg.629]
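The excerpt breaks off before giving its stress-relaxation expression, so the model below is purely an assumed illustration: a simple logarithmic relaxation law (force drops by a fixed fraction of the initial force per decade of time), which is one common first approximation for elastomer seals. It is not the equation from the text:

```python
import math

def residual_sealing_force(f0, rate_per_decade, t_hours, t0_hours=1.0):
    """Residual sealing force under an ASSUMED logarithmic relaxation
    model: the force falls by `rate_per_decade` * f0 for every decade
    of time beyond the reference time t0_hours.

    f0 should already reflect the worst-case combination of
    seal-housing tolerance variables mentioned in the text.
    """
    decades = math.log10(max(t_hours, t0_hours) / t0_hours)
    return f0 * max(0.0, 1.0 - rate_per_decade * decades)

# Worst-case initial force 100 N, 5% relaxation per decade, ~1.1 years
f_residual = residual_sealing_force(100.0, 0.05, 10_000.0)
```

With these invented numbers the seal retains about 80% of its initial force after 10,000 hours; the real rate must come from relaxation measurements on the specific elastomer.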

A quality control laboratory had a certain model of HPLC in operation. One of the products that was routinely run on the instrument contained two compounds, A and B, that were quantitated in one run at the same detector wavelength setting. At an injection volume of 20 μL, both compounds showed linear response. The relatively low absorption for compound B resulted in an uncertainty that was just tolerable, but an improvement was sought. [Pg.277]

The determination of an acceptable dose for humans involves the application of uncertainty factors to reflect the fact that, unlike the experimental animal, there is wide variability and susceptibility of response in the genetically diverse human population. Variations in gender, age, hormonal and disease status can affect the response to a chemical. In order to minimise any potential risks, uncertainty factors are applied to the NOAEL to arrive at a reduced exposure that is considered tolerable - namely the acceptable daily intake or ADI. These are usually tenfold for variations in susceptibility amongst the human population (the intra-species factor) and tenfold for the potential... [Pg.226]
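The calculation described above can be written out directly. The excerpt is cut off mid-sentence, so the second tenfold factor below (animal-to-human extrapolation) is the conventional completion, assumed here rather than quoted; the NOAEL value is invented for illustration:

```python
def acceptable_daily_intake(noael_mg_per_kg_day, intra_species=10.0,
                            inter_species=10.0, extra=1.0):
    """ADI from a NOAEL using multiplicative uncertainty factors.

    Defaults give the conventional 10 x 10 = 100-fold reduction:
    tenfold for human variability (intra-species) and tenfold for
    animal-to-human extrapolation (inter-species, assumed here to be
    the factor the truncated excerpt goes on to describe). `extra`
    allows additional factors, e.g. for an incomplete database.
    """
    return noael_mg_per_kg_day / (intra_species * inter_species * extra)

# Illustrative NOAEL of 50 mg/kg bw/day -> ADI of 0.5 mg/kg bw/day
adi = acceptable_daily_intake(50.0)
```

The structure makes the conservatism explicit: each source of uncertainty multiplies into the divisor, so the tolerable exposure shrinks as knowledge gaps accumulate.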

The objective of this test was to present and analyze suitable experimental results for verifying quantitatively the use of the above-mentioned three corrections with the W-3 correlation for predicting the DNB heat flux in a rod bundle. Uncertainties in the data due to instrument errors and heater rod fabrication tolerances... [Pg.439]

Table 5.11 Data uncertainties due to instrument errors and fabrication tolerances... [Pg.440]

The principles of quality assurance are commonly related to product and process control in manufacturing. Today the field of application has greatly expanded to include environmental protection and quality control within analytical chemistry itself, i.e., the quality assurance of analytical measurements. In any field, features of quality cannot be reproduced to any absolute degree of precision but only within certain limits of tolerance. These depend on the uncertainties of both the process under control and the test procedure, and additionally on the expense of testing and controlling that may be economically justifiable. [Pg.116]
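One common way to make a conformity decision respect both the tolerance limits and the test procedure's own uncertainty is guard banding: the acceptance interval is the tolerance interval shrunk at each end by the expanded measurement uncertainty. This is a sketch of that general decision rule, with invented limits, not a procedure quoted from the text:

```python
def conforms(measured, lower_tol, upper_tol, expanded_uncertainty):
    """Guard-banded conformity decision: accept only if the measured
    value lies inside the tolerance interval shrunk by the expanded
    measurement uncertainty, so a 'pass' verdict remains valid even
    at the edge of the test procedure's own uncertainty."""
    lo = lower_tol + expanded_uncertainty
    hi = upper_tol - expanded_uncertainty
    return lo <= measured <= hi

# Illustrative spec: 9.8 to 10.2 units, expanded uncertainty 0.1
assert conforms(10.05, 9.8, 10.2, 0.1)        # safely inside
assert not conforms(10.15, 9.8, 10.2, 0.1)    # inside spec, but not provably
```

Note the asymmetry this creates: a result can lie within the written tolerance yet still be rejected, because the test procedure cannot demonstrate conformity to the required confidence.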

