Engineering data, accuracy

Many forecasting techniques have been developed to handle a variety of problems. Each has its special advantage, and care is necessary in choosing techniques for cost estimating. Selection of a method depends on the context of the forecast, availability of historical data, accuracy desired, time period to be forecast, and value to the company. The engineer should adopt a technique that makes the best use of the data. He or she should initially use the simplest technique and not expect more from the advanced technique than is justified. [Pg.2310]
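
A minimal sketch of the "start with the simplest technique" advice, comparing a naive forecast with simple exponential smoothing on a short, hypothetical cost history; the function names, data, and smoothing constant are illustrative, not taken from the source.

```python
# Illustrative sketch (not from the source): comparing the simplest possible
# forecast with simple exponential smoothing. Data and `alpha` are hypothetical.

def naive_forecast(history):
    """Simplest technique: next value equals the last observed value."""
    return history[-1]

def exponential_smoothing_forecast(history, alpha=0.3):
    """Slightly more advanced: exponentially weighted average of the history."""
    level = history[0]
    for x in history[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

if __name__ == "__main__":
    quarterly_costs = [102.0, 108.0, 105.0, 111.0, 115.0]  # hypothetical cost index
    print("Naive forecast:    ", naive_forecast(quarterly_costs))
    print("Smoothed forecast: ", round(exponential_smoothing_forecast(quarterly_costs), 1))
```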

Data quality initiatives can help to ensure the accuracy of clinical/biomedical engineering data. The data needed to establish basic, accurate, maintainable automated records for medical equipment management include nomenclature, manufacturer, nameplate model, and serial number. [Pg.267]
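
A minimal sketch of an equipment record holding the basic fields the excerpt lists (nomenclature, manufacturer, nameplate model, serial number); the class and the completeness check are illustrative, not a standard schema.

```python
# Hypothetical medical-equipment record with the identifying fields named in
# the excerpt, plus a basic data-quality check.

from dataclasses import dataclass

@dataclass(frozen=True)
class EquipmentRecord:
    nomenclature: str      # e.g., a standardized device name
    manufacturer: str
    nameplate_model: str
    serial_number: str

    def is_complete(self) -> bool:
        """Basic data-quality check: every identifying field is populated."""
        return all(
            value.strip()
            for value in (self.nomenclature, self.manufacturer,
                          self.nameplate_model, self.serial_number)
        )

record = EquipmentRecord("Infusion pump", "Acme Medical", "IP-200", "SN123456")
assert record.is_complete()
```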

The GPS system provides new possibilities for the study of both wildlife and free-roaming cattle. Although the first GPS collars were heavy and had engineering problems, the present devices are reliable and can be adapted to any type of animal. For research purposes, it is highly recommended to conduct a thorough data accuracy assessment, as positional error can exceed the nominal 15 m (49 ft). The use of... [Pg.185]
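
A hypothetical sketch of the kind of positional-accuracy check the excerpt recommends: GPS fixes logged at a surveyed benchmark are compared against the known coordinates using a small-distance approximation. The benchmark, the fixes, and the error-summary choices are all assumptions for illustration.

```python
# Hypothetical GPS accuracy assessment against a surveyed benchmark,
# using an equirectangular small-distance approximation.

import math

EARTH_RADIUS_M = 6_371_000.0

def horizontal_error_m(lat, lon, ref_lat, ref_lon):
    """Horizontal distance (m) from a fix to the reference point."""
    dlat = math.radians(lat - ref_lat)
    dlon = math.radians(lon - ref_lon) * math.cos(math.radians(ref_lat))
    return EARTH_RADIUS_M * math.hypot(dlat, dlon)

# Hypothetical benchmark and logged fixes (decimal degrees).
ref = (52.10000, 5.20000)
fixes = [(52.10010, 5.20005), (52.09995, 5.19990), (52.10020, 5.20030)]

errors = sorted(horizontal_error_m(la, lo, *ref) for la, lo in fixes)
mean_err = sum(errors) / len(errors)
p95 = errors[max(0, math.ceil(0.95 * len(errors)) - 1)]
print(f"mean error {mean_err:.1f} m, 95th percentile {p95:.1f} m (nominal 15 m)")
```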

How many samples or measurements are required to ensure statistical accuracy is one of the most commonly asked questions in reverse engineering. This chapter will discuss this question by introducing the fundamental principles of statistics and their applications in data processing and analysis. Reliability theory is closely related to statistics but was developed independently. This chapter will also discuss the applications of reliability theory, which is critical to reverse engineering data processing and analysis in many cases. [Pg.209]
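
A minimal sketch, under textbook assumptions (approximately normal measurements with a known or estimated standard deviation), of one classic answer to the sample-size question: n = (z·σ/E)². This is a standard formula, not necessarily the chapter's specific derivation; the numbers below are hypothetical.

```python
# Sample size needed to estimate a mean to within a margin of error E at a
# given confidence level, assuming a known standard deviation sigma.

import math

# Two-sided z-values for common confidence levels.
Z_VALUES = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}

def required_sample_size(sigma: float, margin: float, confidence: float = 0.95) -> int:
    """Smallest n so the confidence-interval half-width is <= margin."""
    z = Z_VALUES[confidence]
    return math.ceil((z * sigma / margin) ** 2)

# Example: a measured dimension with sigma ~ 0.05 mm, mean wanted within 0.02 mm.
print(required_sample_size(sigma=0.05, margin=0.02, confidence=0.95))  # -> 25
```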

Since the accuracy of experimental data is frequently not high, and since experimental data are hardly ever plentiful, it is important to reduce the available data with care using a suitable statistical method and using a model for the excess Gibbs energy which contains only a minimum of binary parameters. Rarely are experimental data of sufficient quality and quantity to justify more than three binary parameters and, all too often, the data justify no more than two such parameters. When data sources (5) or (6) or (7) are used alone, it is not possible to use a three- (or more)-parameter model without making additional arbitrary assumptions. For typical engineering calculations, therefore, it is desirable to use a two-parameter model such as UNIQUAC. [Pg.43]
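
An illustrative sketch, not the authors' UNIQUAC procedure, of what fitting a two-parameter excess-Gibbs-energy model to binary activity-coefficient data can look like. The two-constant Margules equations are used here as the stand-in model because they are linear in the parameters and so reduce to ordinary least squares; the data are hypothetical.

```python
# Two-constant Margules fit by linear least squares (hypothetical data):
#   ln g1 = x2^2 [A12 + 2 (A21 - A12) x1]
#   ln g2 = x1^2 [A21 + 2 (A12 - A21) x2]

import numpy as np

# Hypothetical binary data: mole fraction of component 1 and measured gammas.
x1 = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
gamma1 = np.array([1.85, 1.49, 1.27, 1.12, 1.02])
gamma2 = np.array([1.01, 1.10, 1.25, 1.48, 1.81])
x2 = 1.0 - x1

# Both equations are linear in (A12, A21), so stack them into one system.
rows_g1 = np.column_stack([x2**2 * (1 - 2 * x1), 2 * x1 * x2**2])
rows_g2 = np.column_stack([2 * x2 * x1**2, x1**2 * (1 - 2 * x2)])
A = np.vstack([rows_g1, rows_g2])
b = np.concatenate([np.log(gamma1), np.log(gamma2)])

(A12, A21), residuals, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"A12 = {A12:.3f}, A21 = {A21:.3f}")
```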

The timely acquisition of static and dynamic reservoir data is critical for the optimisation of development options and production operations. Reservoir data enable the description and quantification of fluid and rock properties. The amount and accuracy of the data available will determine the range of uncertainty associated with estimates made by the subsurface engineer. [Pg.125]

The accuracy of absolute risk results depends on (1) whether all the significant contributors to risk have been analyzed, (2) the realism of the mathematical models used to predict failure characteristics and accident phenomena, and (3) the statistical uncertainty associated with the various input data. The achievable accuracy of absolute risk results is very dependent on the type of hazard being analyzed. In studies where the dominant risk contributors can be calibrated with ample historical data (e.g., the risk of an engine failure causing an airplane crash), the uncertainty can be reduced to a few percent. However, many authors of published studies and other expert practitioners have recognized that uncertainties can be greater than 1 to 2 orders of magnitude in studies whose major contributors are rare, catastrophic events. [Pg.47]
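
An illustrative Monte Carlo sketch (not from the study the excerpt describes) of the third point: when the inputs to even a very simple risk expression each carry wide lognormal uncertainty, the top-event frequency spreads over an order of magnitude or more. All frequencies, probabilities, and error factors are hypothetical.

```python
# Hypothetical uncertainty propagation: initiating-event frequency times two
# barrier failure probabilities, each with a lognormal uncertainty.

import random
import math

random.seed(0)

def lognormal(median, error_factor):
    """Sample a lognormal with the given median and 95th/50th-percentile error factor."""
    sigma = math.log(error_factor) / 1.645
    return median * math.exp(random.gauss(0.0, sigma))

samples = []
for _ in range(20_000):
    initiator = lognormal(1e-2, error_factor=3)   # events per year
    barrier_1 = lognormal(1e-2, error_factor=10)  # failure probability
    barrier_2 = lognormal(5e-2, error_factor=5)   # failure probability
    samples.append(initiator * barrier_1 * barrier_2)

samples.sort()
p05 = samples[int(0.05 * len(samples))]
p95 = samples[int(0.95 * len(samples))]
print(f"5th-95th percentile spread: {p95 / p05:.0f}x "
      f"({p05:.1e} to {p95:.1e} per year)")
```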

With the Industrial Revolution, life became more complex, but it was not until World War II that reliability engineering was needed to keep the complex airplanes, tanks, vehicles, and ships operating. Of particular concern was the reliability of radar. Prior to this time, equipment was known only qualitatively to be reliable or unreliable. Quantifying reliability requires collecting statistics on part failures in order to calculate the mean time to failure and the mean time to repair. Since then, NASA and the military have included reliability specifications in procurements, thereby sustaining the collection and evaluation of data to build statistical accuracy, although this adds to the cost. [Pg.151]
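
A minimal sketch, with hypothetical failure records, of the two statistics the excerpt names: mean time to failure (MTTF) and mean time to repair (MTTR), computed from observed operating and repair intervals.

```python
# MTTF and MTTR from hypothetical part-failure history, plus the standard
# steady-state availability they imply.

from statistics import mean

hours_to_failure = [1200.0, 950.0, 1430.0, 1100.0, 1010.0]  # operating hours before each failure
hours_to_repair = [6.0, 4.5, 8.0, 5.5, 6.5]                 # repair hours after each failure

mttf = mean(hours_to_failure)
mttr = mean(hours_to_repair)
availability = mttf / (mttf + mttr)

print(f"MTTF = {mttf:.0f} h, MTTR = {mttr:.1f} h, availability = {availability:.4f}")
```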

It is easily possible to introduce refinements into the dilated van Laar model which would further increase its accuracy for correlating activity coefficient data. However, such refinements unavoidably introduce additional adjustable parameters. Since typical experimental results of high-pressure vapor-liquid equilibria at any one temperature seldom justify more than two adjustable parameters (in addition to Henry's constant), it is probably not useful for engineering purposes to refine Chueh's model further, at least not for nonpolar or slightly polar systems. [Pg.178]

Miller claims that if the basic equipment estimate has an accuracy of ±10%, the most likely plant estimate should have an accuracy of ±14%. This is much better than the ratio or Lang estimates, and considerably more accurate than Nichols said was possible with this type of data [12]. Nichols claimed that there is a direct correlation between the cost of an estimate and its probable accuracy. Ever since he stated this in 1951, cost engineers have been trying to prove him wrong. [Pg.254]
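
This is not Miller's derivation, but one way a figure like ±14% can arise from a ±10% equipment estimate is if a second, independent contribution of similar relative size (here a hypothetical ±10% on the installation factor) is combined with it in quadrature: √(0.10² + 0.10²) ≈ 0.14. A minimal sketch under that assumption:

```python
# Root-sum-square combination of independent relative uncertainties
# (illustrative only; the +/-10% installation figure is an assumption).

import math

def combined_relative_error(*relative_errors):
    """Root-sum-square combination of independent relative uncertainties."""
    return math.sqrt(sum(e * e for e in relative_errors))

equipment = 0.10       # +/-10% on purchased equipment (from the excerpt)
installation = 0.10    # hypothetical +/-10% on the installation factor
print(f"combined: +/-{combined_relative_error(equipment, installation):.0%}")
```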

At this point, we digress slightly to make some observations about the accuracy and precision of experimental data. Since we, as engineers, continuously make use of data that represent measurements of various... [Pg.35]
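
A short sketch of the accuracy/precision distinction using hypothetical repeated measurements of a known reference value: the bias (mean error against the reference) reflects accuracy, while the standard deviation of the readings reflects precision.

```python
# Accuracy (bias) versus precision (scatter) from repeated measurements of a
# known standard; the readings are hypothetical.

from statistics import mean, stdev

reference = 100.0                                    # known value of the standard
readings = [101.2, 101.5, 101.1, 101.4, 101.3]       # hypothetical measurements

bias = mean(readings) - reference       # systematic error -> accuracy
spread = stdev(readings)                # random scatter   -> precision

print(f"bias = {bias:+.2f} (accuracy), std dev = {spread:.2f} (precision)")
```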

At present, there is no one computer fire code sufficiently comprehensive to compute this fire, including the people's response. In fact, no combination of present codes can solve this problem with the required engineering accuracy. To get an approximate, illustrative solution to this case, a number of different computer fire codes must be used in succession, with hand-fitted data transferred from one to the next. The computer programs used to make this (low-accuracy) prediction and some of their often severe limitations will be indicated. [Pg.68]

As in any scientific or engineering endeavor, the quantity and validity of input data determine the accuracy of prediction. Frequent gauging of fluid levels in monitoring wells, flow rates, and oil-water ratios, in conjunction with proper quality control, can lead to accurate estimates that support proper project performance. [Pg.342]
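
A hypothetical sketch of a simple quality-control step of the kind the excerpt implies: repeated fluid-level gaugings from a monitoring well are screened for outliers before averaging. The readings and the 3-MAD (median absolute deviation) rule are illustrative choices, not a procedure from the source.

```python
# Outlier screening and averaging of hypothetical fluid-level gaugings.

from statistics import median, mean

def screen_outliers(readings, k=3.0):
    """Return readings within k median-absolute-deviations of the median."""
    med = median(readings)
    mad = median(abs(r - med) for r in readings) or 1e-9
    return [r for r in readings if abs(r - med) <= k * mad]

depth_to_fluid_m = [12.41, 12.43, 12.40, 12.42, 13.90, 12.44]  # 13.90 is a bad reading
clean = screen_outliers(depth_to_fluid_m)
print(f"kept {len(clean)}/{len(depth_to_fluid_m)} readings, mean = {mean(clean):.2f} m")
```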

