Big Chemical Encyclopedia


Outlier processing

Autocorrelation function of a power signal, definition, 103
Automatic processing of standard data, outlier processing, 38-43... [Pg.276]

During measurements, the PCDMIS software collects the coordinates of the center of the ball stylus. The data are later transferred to a MATLAB file for processing: virtual circles are constructed, and the circle center, diameter, and distance values are calculated. Each virtual circle is measured 15 times for each gauge position (at least 10 repetitions remain after outlier processing), so a minimum of 80 measurements are taken for each virtual circle. The ambient temperature is controlled within 20 ± 1 °C. [Pg.72]
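The circle construction described above can be sketched as an algebraic least-squares fit: given the stylus-center coordinates, solve for the circle that best passes through them. This is a minimal illustration in Python (the source used MATLAB); the Kasa-style fit shown here is an assumption, not necessarily the method the laboratory used.

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit.

    points: (n, 2) array of x, y stylus-center coordinates.
    Solves x^2 + y^2 + a*x + b*y + c = 0 in a least-squares sense
    and returns (center_x, center_y, diameter).
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    r = np.sqrt(cx**2 + cy**2 - c)
    return cx, cy, 2.0 * r

# Illustrative data: 15 noiseless points on a circle of radius 5 at (1, 2)
theta = np.linspace(0.0, 2.0 * np.pi, 15, endpoint=False)
pts = np.column_stack([1 + 5 * np.cos(theta), 2 + 5 * np.sin(theta)])
cx, cy, d = fit_circle(pts)
print(round(cx, 6), round(cy, 6), round(d, 6))  # → 1.0 2.0 10.0
```

With real measurements the 15 repetitions would each yield a fitted center, from which the repeatability statistics and center-to-center distances follow.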

The same setup is followed for the complementary test, the distance between virtual circles. The operator of Laboratory 1 measures both sets of virtual circles, 15 times per virtual circle and position; after outlier processing, at least 10 repetitions remain. A minimum of 80 distances are taken for each pair of virtual circles. [Pg.72]

Note on GMPs: The assays are conducted on individual dosage units (here, tablets) and not on composite samples. The CU test serves to limit the variability from one dosage unit to the next (the Dissolution Rate test is the other test commonly used for this purpose). Under this premise, outlier tests would be scientific nonsense, because precisely these outliers carry the information on the width of the distribution that one is looking for. The U.S. vs. Barr Laboratories decision makes it illegal to apply outlier tests in connection with CU and DR tests. This does not mean that the distribution, and seemingly or truly atypical results, should not be carefully investigated in order to improve the production process. [Pg.238]

Identification of outliers is not a straightforward process. Even when observations have been diagnosed as outlying, one should not automatically discard these, certainly not when the evidence is not overwhelming. Ideally, one should... [Pg.374]

The development of a calibration model is a time-consuming process. Not only do the samples have to be prepared and measured, but the modelling itself, including data pre-processing, outlier detection, estimation and validation, is not an automated procedure. Once the model is there, changes may occur in the instrumentation or other conditions (temperature, humidity) that require recalibration. Another situation is where a model has been set up for one instrument in a central location and one would like to distribute this model to other instruments within the organization without having to repeat the entire calibration process for all these individual instruments. One wonders whether it is possible to translate the model from one instrument (old, parent, or master, A) to the others (new, children, or slaves, B). [Pg.376]
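One common family of techniques for this master-to-slave translation is standardization: estimate a transform that maps slave spectra into the master's space, so the master's calibration model can be reused. The sketch below is a minimal direct-standardization-style example on synthetic data; the gain/offset distortion and all variable names are illustrative assumptions, not the method of any particular instrument vendor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical spectra: instrument B (slave) sees the same samples as
# instrument A (master), but with a gain and offset distortion (assumed).
n_samples, n_channels = 20, 50
X_master = rng.random((n_samples, n_channels))
X_slave = 1.1 * X_master + 0.05

# Direct-standardization-style transform: find F such that
# [X_slave, 1] @ F ≈ X_master, i.e. map slave readings to master space.
X_aug = np.column_stack([X_slave, np.ones(n_samples)])  # offset term
F, *_ = np.linalg.lstsq(X_aug, X_master, rcond=None)
X_corrected = X_aug @ F

# The master's calibration model can now be applied to X_corrected.
print(np.allclose(X_corrected, X_master, atol=1e-6))
```

In practice the transfer samples are few and carefully chosen, and windowed (piecewise) variants of this idea are often preferred to a single full-matrix transform.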

Barnett et al. (1994) define outliers as observations in a sample which appear to be inconsistent with the remainder of the sample. In engineering applications, an outlier is often the result of a gross measurement error, such as a mistake in the calculations, in data coding, or even a copying error. An outlier could also be the result of the inherent variability of the process, although chances are it is not ... [Pg.133]
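A practical way to flag such "inconsistent with the remainder" observations is a robust rule based on the median and the median absolute deviation (MAD), which, unlike the mean and standard deviation, are barely affected by the outliers themselves. A minimal sketch (the 3.5 threshold on the modified z-score is a common convention, not a value from the source):

```python
import numpy as np

def mad_outliers(x, threshold=3.5):
    """Flag outliers via the modified z-score (median/MAD based).

    Returns a boolean mask, True where the observation is flagged.
    """
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    if mad == 0.0:
        return np.zeros(x.shape, dtype=bool)
    modified_z = 0.6745 * (x - med) / mad  # 0.6745: consistency with Gaussian sigma
    return np.abs(modified_z) > threshold

# Illustrative data: five consistent readings plus one copying error
readings = np.array([10.1, 9.9, 10.0, 10.2, 9.8, 100.0])
print(mad_outliers(readings))  # → [False False False False False  True]
```

Whether a flagged point is a gross error or genuine process variability still requires the investigation the text calls for; the rule only identifies candidates.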

To construct the reference model, the interpretation system required routine process data collected over a period of several months. Cross-validation was applied to detect and remove outliers. Only data corresponding to normal process operations (that is, when top-grade product is made) were used in the model development. As stated earlier, the system ultimately involved two analysis approaches, both reduced-order models that capture dominant directions of variability in the data. A PLS analysis using two loadings explained about 60% of the variance in the measurements. A subsequent PCA on the residuals showed that five principal components explain 90% of the residual variability. [Pg.85]
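The two-stage structure (a 2-component reduced-order model, then a PCA on what that model leaves behind) can be illustrated with plain linear algebra. In this sketch a truncated SVD stands in for the 2-loading PLS model of the text (a real PLS fit would also use the quality variable y); the synthetic data, dimensions, and variance fractions are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "routine process data": rank-2 structure plus noise (assumed)
n, p = 200, 12
scores = rng.normal(size=(n, 2))
loadings = rng.normal(size=(2, p))
X = scores @ loadings + 0.5 * rng.normal(size=(n, p))
X = X - X.mean(axis=0)  # mean-center, as is standard before PLS/PCA

# Stage 1: reduced-order model with 2 components (stand-in for PLS)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_hat = U[:, :2] * s[:2] @ Vt[:2]
frac_explained = (s[:2] ** 2).sum() / (s ** 2).sum()

# Stage 2: PCA on the residuals -- how much do 5 PCs capture?
R = X - X_hat
_, sr, _ = np.linalg.svd(R - R.mean(axis=0), full_matrices=False)
frac_residual = (sr[:5] ** 2).sum() / (sr ** 2).sum()

print(round(frac_explained, 3), round(frac_residual, 3))
```

Monitoring then proceeds in both subspaces: departures within the model plane and unusually large residuals each signal abnormal operation.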

Process data for the same polymer recipe were analyzed for 50 nonconsecutive, sequential batches. As before, the data were preprocessed to remove outliers and sorted to reflect normal operation. The data are sampled at 1-minute intervals during production of each batch. Filtering and normalization of the data are done prior to analysis. The final polymer quality... [Pg.87]

After inspecting the tabular and graphic data, the operator is allowed to remove runs which appear to be outliers. Any run can be deleted or restored in any order, and the comparative statistics are recalculated with each operation. By comparing the standard deviation before and after deleting a run, the effect of that run can be determined. The editing process can continue indefinitely until the operator is satisfied with the validity of the results. [Pg.126]
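The delete/restore/recompute cycle described above amounts to maintaining a set of excluded runs and recomputing statistics over the active ones. A minimal sketch (the run IDs and values are invented for illustration; the original system's data model is not described):

```python
import statistics

runs = {1: 4.98, 2: 5.02, 3: 5.01, 4: 6.40, 5: 4.99}  # run 4 looks atypical
deleted = set()

def active_values():
    """Values of all runs not currently marked as deleted."""
    return [v for k, v in runs.items() if k not in deleted]

def delete_run(run_id):
    """Delete a run; return the recalculated standard deviation."""
    deleted.add(run_id)
    return statistics.stdev(active_values())

def restore_run(run_id):
    """Restore a run; return the recalculated standard deviation."""
    deleted.discard(run_id)
    return statistics.stdev(active_values())

before = statistics.stdev(active_values())
after = delete_run(4)       # std dev drops sharply: run 4 had a large effect
restored = restore_run(4)   # any run can be restored; statistics recomputed
print(before > after, abs(restored - before) < 1e-12)  # → True True
```

Comparing `before` and `after` is exactly the operator's test for how much a single run inflates the spread.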

The resulting [Ca/Fe] versus [Fe/H] plot is shown in Fig. 1, where, except for a few outliers that will have to be manually inspected, a clear trend appears: [Ca/Fe] slowly rises with [Fe/H] until it reaches a maximum and then declines again for the most metal-rich stars (RGB-a according to [4]). This nicely confirms a previous finding by [8] and [9]. If the metal-rich stars have evolved within the cluster in a process of self-enrichment, the only way to lower their a-enhancement would be SNe type Ia intervention. No simple explanation is provided for the rise of [Ca/Fe] at low [Fe/H], although a series of star formation bursts seems the likely cause. [Pg.108]

Outlier detection, in chemometrics, 6 56-57
Outokumpu flash smelting, 16 146
Outokumpu lead smelting process, 14 745
Outokumpu Oy process, selenium recovery via, 22 83... [Pg.659]

Most techniques for process data reconciliation start with the assumption that the measurement errors are random variables obeying a known statistical distribution, and that the covariance matrix of measurement errors is given. In Chapter 10 direct and indirect approaches for estimating the variances of measurement errors are discussed, as well as a robust strategy for dealing with the presence of outliers in the data set. [Pg.26]
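Under those assumptions (random errors with known covariance), classical reconciliation is a weighted least-squares projection of the measurements onto the constraint set. The sketch below shows the standard closed form for linear constraints A x = 0; the three-stream mass balance and the numerical values are invented for illustration.

```python
import numpy as np

# Illustrative mass balance: stream 1 splits into streams 2 and 3,
# so the constraint is x1 - x2 - x3 = 0, i.e. A x = 0.
A = np.array([[1.0, -1.0, -1.0]])
true_x = np.array([10.0, 6.0, 4.0])
Sigma = np.diag([0.04, 0.02, 0.02])  # known measurement-error covariance

rng = np.random.default_rng(2)
y = true_x + rng.multivariate_normal(np.zeros(3), Sigma)  # noisy measurements

# Classical WLS reconciliation:
#   x_hat = y - Sigma A^T (A Sigma A^T)^{-1} A y
correction = Sigma @ A.T @ np.linalg.solve(A @ Sigma @ A.T, A @ y)
x_hat = y - correction

print(np.allclose(A @ x_hat, 0.0))  # reconciled values satisfy the balance
```

Note that this closed form inherits the Gaussian assumption; as the later excerpts show, a single gross error (outlier) in y invalidates it, which motivates the robust strategies discussed in Chapter 10.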

Only a few publications in the literature have dealt with this problem. Almasy and Mah (1984) presented a method for estimating the covariance matrix of measured errors by using the constraint residuals calculated from available process data. Darouach et al. (1989) and Keller et al. (1992) have extended this approach to deal with correlated measurements. Chen et al. (1997) extended the procedure further, developing a robust strategy for covariance estimation, which is insensitive to the presence of outliers in the data set. [Pg.203]

A multivariate normal distribution data set was generated by the Monte Carlo method using the values of variances and true flowrates in order to simulate the process sampling data. The data, of sample size 1000, were used to investigate the performance of the robust approach in the two cases, with and without outliers. [Pg.212]
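A Monte Carlo data set of this kind is straightforward to generate: draw multivariate-normal samples around the true flowrates, then contaminate a copy to build the "with outliers" case. The true values, variances, and 5% contamination level below are illustrative assumptions, since the source does not reproduce them here.

```python
import numpy as np

rng = np.random.default_rng(3)

true_flows = np.array([100.0, 60.0, 40.0])   # illustrative true flowrates
Sigma = np.diag([1.0, 0.6, 0.4]) ** 2        # illustrative variances

# Case 1 (clean): 1000 multivariate-normal samples around the true values
clean = rng.multivariate_normal(true_flows, Sigma, size=1000)

# Case 2 (contaminated): replace 5% of the samples with gross errors
# (10-sigma shifts), to probe the robust approach's breakdown behavior
contaminated = clean.copy()
idx = rng.choice(1000, size=50, replace=False)
contaminated[idx] += 10.0 * np.sqrt(np.diag(Sigma))

print(clean.shape, contaminated.shape)  # → (1000, 3) (1000, 3)
```

Running the reconciliation on both cases then exposes how far the conventional and robust estimators drift from `true_flows` as contamination is introduced.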

Let us consider the same chemical reactor as in Example 11.1 (Chen et al., 1998). Monte Carlo data for y were generated in order to simulate process sampling data. A window size of 25 was used here, and to demonstrate the performance of the robust approach two cases were considered, with and without outliers. [Pg.232]

Xr from the robust approach, as expected, still gives the correct answer; the conventional approach, however, fails to provide a good estimate of the process variables. Although the main part of the data distribution is Gaussian, the conventional approach fails in the task because of the presence of just one outlier. In a strict sense, the presence of this outlier invalidates the statistical basis of data reconciliation... [Pg.232]
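The single-outlier failure mode can be reproduced in miniature. Below, a window of 25 samples (matching the window size quoted above) contains 24 Gaussian readings and one gross error; the least-squares estimate (the mean) is pulled far off, while a robust estimate is not. The median is used here purely as the simplest robust stand-in: the cited work uses an M-estimator-based strategy, and all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Window of 25 measurements of a true value of 50: 24 Gaussian + 1 gross error
window = np.concatenate([50.0 + 0.5 * rng.normal(size=24), [500.0]])

x_conventional = window.mean()   # least-squares estimate: dragged by the outlier
x_robust = np.median(window)     # simple robust estimate (sketch only)

print(round(x_conventional, 1), round(x_robust, 1))
```

Even though 96% of the window is well-behaved Gaussian data, the conventional estimate is off by roughly 18 units, which is the behavior the excerpt describes.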

Chen, J., Bandoni, A., and Romagnoli, J. A. (1998). Outlier detection in process plant data. Comput. Chem. Eng. 22, 641-646. [Pg.244]

There are four main strategies concerning the processing of outliers. Figures 1b to 1e give a graphical interpretation of these strategies. [Pg.37]

There are several different but interdependent factors which significantly influence the whole process of handling outliers. One must consider the distinctions ...

Figures 2 to 4 describe the recommended procedure for processing outliers. These flowcharts could also be used to create a computer program. An explanation of some of the terms used in these charts follows ...

See other pages where Outlier processing is mentioned: [Pg.320]    [Pg.244]    [Pg.276]    [Pg.661]    [Pg.117]    [Pg.374]    [Pg.134]    [Pg.52]    [Pg.52]    [Pg.124]    [Pg.170]    [Pg.67]    [Pg.250]    [Pg.193]    [Pg.480]    [Pg.496]    [Pg.575]    [Pg.78]    [Pg.431]    [Pg.150]    [Pg.474]    [Pg.37]    [Pg.39]    [Pg.41]    [Pg.45]    [Pg.48]   

