Big Chemical Encyclopedia

Processing, data errors

Rosenberg, J., R. S. H. Mah, and C. Iordache, Evaluation of Schemes for Detecting and Identifying Gross Errors in Process Data, Industrial and Engineering Chemistry Research, 26(3), 1987, 555-564. (Simulation studies of various detection methods)... [Pg.2545]

Expertise required to operate One of the objectives of using microprocessor-based predictive maintenance systems is to reduce the expertise required to acquire error-free, useful vibration and process data from a large population of machinery and systems within a plant. The system should not require the user to set maximum amplitude, measurement bandwidths, or filter settings, nor should it allow free-form data input. All of these functions force the user to be a trained analyst and increase both the cost and the time required to routinely acquire data from plant equipment. Many of the microprocessor-based systems on the market provide easy, menu-driven measurement routes that lead the user through the process of acquiring accurate data. The ideal system should require a single key input to automatically acquire, analyze, alarm, and store all pertinent data from plant equipment. Such a system would enable an unskilled user to quickly and accurately acquire all of the data required for predictive maintenance. [Pg.806]

Y = Xβa. If the number of input variables is greater than the number of observations, there is an infinite number of exact solutions for the least-squares or linear regression coefficients, βa. If the numbers of variables and observations are equal, there is a unique solution for βa, provided that X has full rank. If the number of variables is less than the number of measurements, which is usually the case with process data, there is no exact solution for βa (Geladi and Kowalski, 1986), but βa can be estimated by minimizing the least-squares error between the actual and predicted outputs. The solution to the least-squares approximation problem is given by the pseudoinverse as... [Pg.35]
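As a brief sketch of the usual overdetermined case (more observations than variables), the pseudoinverse solution can be computed directly with NumPy; the matrix sizes, true coefficients, and noise level below are invented for illustration:

```python
import numpy as np

# Overdetermined system: more observations (rows) than input variables
# (columns), the typical situation for process data, so no exact solution.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                    # 50 observations, 3 variables
beta_true = np.array([1.5, -2.0, 0.5])          # invented "true" coefficients
Y = X @ beta_true + 0.01 * rng.normal(size=50)  # small measurement noise

# Least-squares estimate via the pseudoinverse: beta = pinv(X) @ Y
beta_hat = np.linalg.pinv(X) @ Y
```

With low noise and many observations, `beta_hat` recovers the generating coefficients closely; `np.linalg.lstsq` gives the same estimate more efficiently.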

There are numerous advantages in running a laboratory through a LIMS: the ability to capture, transfer, and process data for a variety of instruments and to report the results in any specified format; the ability to use or construct databases (data banks) from any of the information generated, stored, or accessible outside the laboratory; and the ability to log the progress and status of each sample that passes through the laboratory. The saving in time and effort for laboratory personnel and the relative freedom of data and results from human error are further major benefits. LIMS can... [Pg.527]

Estimation of Measurement Error Variances from Process Data... [Pg.13]

Chemical process data inherently contain some degree of error, and this error may be random or systematic. Thus, the application of data reconciliation techniques allows optimal adjustment of measurement values to satisfy material and energy constraints. It also makes possible the estimation of unmeasured variables. It should be emphasized that, in today's highly competitive world market, resolving even small errors can lead to significant improvements in plant performance and economy. This book attempts to provide a comprehensive statement, analysis, and discussion of the main issues that emerge in the treatment and reconciliation of plant data. [Pg.16]
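As a hedged sketch of what such an adjustment looks like, classical weighted least-squares reconciliation projects measurements onto the constraints; the splitter balance, measured flows, and variances below are all invented for illustration:

```python
import numpy as np

# Invented material balance A x = 0 for a splitter: F1 = F2 + F3
A = np.array([[1.0, -1.0, -1.0]])
y = np.array([101.2, 60.5, 39.8])   # raw measurements (imbalance of 0.9)
Sigma = np.diag([1.0, 0.5, 0.5])    # assumed measurement error variances

# Classical weighted least-squares reconciliation:
#   x_hat = y - Sigma A^T (A Sigma A^T)^-1 A y
x_hat = y - Sigma @ A.T @ np.linalg.solve(A @ Sigma @ A.T, A @ y)
```

The reconciled flows close the balance exactly, with the largest adjustment applied to the least precise (highest-variance) measurement.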

Parameter estimation is also an important activity in process design, evaluation, and control. Because data taken from chemical processes do not satisfy process constraints, error-in-variables methods provide both parameter estimates and reconciled data estimates that are consistent with respect to the model. These problems represent a special class of optimization problems because the structure of least squares can be exploited in the development of optimization methods. A review of this subject can be found in the work of Biegler et al. (1986). [Pg.25]

Most techniques for process data reconciliation start with the assumption that the measurement errors are random variables obeying a known statistical distribution, and that the covariance matrix of measurement errors is given. In Chapter 10 direct and indirect approaches for estimating the variances of measurement errors are discussed, as well as a robust strategy for dealing with the presence of outliers in the data set. [Pg.26]

Reliable process data are the key to the efficient operation of chemical plants. With the increasing use of on-line digital computers, numerous data are acquired and used for on-line optimization and control. Frequently these activities are based on small improvements in process performance, but it must be noted that errors in process data, or inaccurate and unreliable methods of resolving these errors, can easily exceed or mask actual changes in process performance. [Pg.94]

In the previous development it was assumed that only random, normally distributed measurement errors, with zero mean and known covariance, are present in the data. In practice, process data may also contain other types of errors, which are caused by nonrandom events. For instance, instruments may not be adequately compensated, measuring devices may malfunction, or process leaks may be present. These biases are usually referred to as gross errors. The presence of gross errors invalidates the statistical basis of data reconciliation procedures. It is also impossible, for example, to prepare an adequate process model on the basis of erroneous measurements or to assess production accounting correctly. In order to avoid these shortcomings, we need to check for the presence of gross systematic errors in the measurement data. [Pg.128]
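One common screen for such gross errors is a global chi-square test on the constraint residuals; as a minimal sketch (the balance, variances, and bias magnitude below are invented), a large test statistic flags a likely gross error:

```python
import numpy as np

# Invented splitter balance F1 = F2 + F3 and assumed error variances
A = np.array([[1.0, -1.0, -1.0]])
Sigma = np.diag([1.0, 0.5, 0.5])

def global_test(y):
    """Chi-square global test statistic on the constraint residual r = A y."""
    r = A @ y
    # Under random errors only, this statistic follows a chi-square
    # distribution with 1 degree of freedom here; the 95% critical
    # value for 1 dof is about 3.84.
    return float(r @ np.linalg.solve(A @ Sigma @ A.T, r))

y_clean = np.array([100.5, 60.2, 40.1])  # small imbalance of 0.2
y_gross = np.array([110.0, 60.2, 40.1])  # large bias on F1
```

The clean measurement set passes the test while the biased set fails it, prompting a search for the offending instrument.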

Iordache, C., Mah, R., and Tamhane, A. (1985). Performance studies of the measurement test for detection of gross errors in process data. AIChE J. 31, 1187-1201. [Pg.150]

ESTIMATION OF MEASUREMENT ERROR VARIANCES FROM PROCESS DATA... [Pg.202]

Only a few publications in the literature have dealt with this problem. Almasy and Mah (1984) presented a method for estimating the covariance matrix of measured errors by using the constraint residuals calculated from available process data. Darouach et al. (1989) and Keller et al. (1992) have extended this approach to deal with correlated measurements. Chen et al. (1997) extended the procedure further, developing a robust strategy for covariance estimation, which is insensitive to the presence of outliers in the data set. [Pg.203]

This procedure (based on sample variance and covariance) is referred to as the direct method of estimation of the covariance matrix of the measurement errors. As it stands, it makes no use of the inherent information content of the constraint equations, which has proved to be very useful in process data reconciliation. One shortcoming of this approach is that these r samples should be taken under steady-state operation in order to meet the independent-sampling condition; otherwise, the direct method could give incorrect estimates. [Pg.203]
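As a hedged sketch of the direct method (the means, covariance, and sample count below are invented, with the steady-state samples simulated as independent draws), the estimate is simply the sample covariance of the raw measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
true_cov = np.array([[1.0, 0.2],
                     [0.2, 0.5]])   # invented measurement-error covariance

# r independent samples of two measured variables at steady state
samples = rng.multivariate_normal(mean=[50.0, 20.0], cov=true_cov, size=5000)

# Direct method: sample covariance of the measurements (rows = samples)
S = np.cov(samples, rowvar=False)
```

With enough independent steady-state samples, `S` converges to the true error covariance; drifting (non-steady) operation would inflate the estimate, which is the shortcoming noted above.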

Physical characterization of polymers is a common activity that research and development technologists at the Dow Chemical Company perform. A material property evaluation that is critical for most polymer systems is a tensile test. Many instruments, such as an Instron test frame, can perform a tensile test and, by using specialized software, can acquire and process data. Use of an extensometer eliminates calibration errors and allows the console to display strain and deformation in engineering units. Some common results from a tensile test are modulus, percent elongation, stress at break, and strain at yield. These data are then used to better understand the capabilities of the polymer system and the end-use applications for which it may be suited. [Pg.453]

Even once a method is standardized, erroneous results can still be generated. As a result, it is critical to have robust quality control procedures in place. Here, careful attention should be paid to identifying opportunities for in-process control measures such as internal standards, calibration, control plates, replicates, and so on, as opposed to post-processing data review steps. Inline QC approaches allow sources of error to be identified and remedied much more rapidly and help limit costly re-tests, or the possibility of erroneous data leaving the laboratory. [Pg.22]

Control charts similar to the hand-drawn ones used earlier to illustrate the evaluation of processing data are also easily prepared using readily available software. Figure 16 is an x chart of tablet assay for active ingredient 2. Note that minimum/maximum specification limits have been included. Figure 17 depicts a traditional x control chart for dissolution to which error bars have been added to denote individual tablet assays for each batch. [Pg.110]
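As a minimal, hedged sketch of the computation behind such a chart (the batch assay means below are invented), the center line and 3-sigma control limits follow directly from the plotted statistics:

```python
import numpy as np

# Invented batch means of tablet assay (percent of label claim)
batch_means = np.array([99.8, 100.3, 99.5, 100.1, 100.6, 99.9, 100.2, 99.7])

center = batch_means.mean()           # center line of the chart
sigma = batch_means.std(ddof=1)       # sample standard deviation
ucl = center + 3 * sigma              # upper control limit
lcl = center - 3 * sigma              # lower control limit
```

Points falling outside `lcl`/`ucl` signal a process shift; note these control limits are statistical and distinct from the minimum/maximum specification limits mentioned above.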

To process data obtained by application of orthogonal second-order designs, the significance of the regression coefficients is checked by expression (2.144), together with prior calculation of the variance, or error, in determining the regression coefficients... [Pg.376]
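Expression (2.144) is not reproduced here; as a generic, hedged sketch of such a significance check, the textbook t-statistic divides each estimated coefficient by its standard error (the design matrix, coefficients, and noise below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 30, 3
X = rng.normal(size=(n, p))
beta = np.array([2.0, 0.0, -1.5])     # middle coefficient is truly zero
y = X @ beta + rng.normal(scale=0.5, size=n)

b = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ b
s2 = resid @ resid / (n - p)                         # error variance estimate
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))   # std. error of each b_j
t = b / se   # compare |t_j| with the tabulated t critical value
```

Coefficients whose |t| exceeds the tabulated critical value (roughly 2 at the 5% level for these degrees of freedom) are retained as significant; the rest are dropped from the regression model.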

Quality control experts, such as Shewhart and Deming, decades ago called attention to the importance of minimizing measurement errors in achieving quality control in manufacturing and the importance of statistical monitoring of process data. But tying these concepts to equipment calibration in a statistically rigorous way did not occur until the second half of the twentieth century. [Pg.103]

