Big Chemical Encyclopedia


Validation reconciliation

To determine if a process unit is at steady state, a program monitors key plant measurements (e.g., compositions, product rates, feed rates, and so on) and determines if the plant is steady enough to start the sequence. Only when all of the key measurements are within the allowable tolerances is the plant considered steady and the optimization sequence started. Tolerances for each measurement can be tuned separately. Measured data are then collected by the optimization computer. The optimization system runs a program to screen the measurements for unreasonable data (gross error detection). This validity checking automatically modifies the model updating calculation to reflect any bad data or equipment taken out of service. Data validation and reconciliation (on-line or off-line) is a critical part of any optimization system. [Pg.742]
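The steady-state screen described above can be sketched as follows. This is a minimal illustration, not the program from the text: the tag names, tolerance values, and the choice of relative deviation from a short moving window as the steadiness statistic are all assumptions.

```python
# Hypothetical steady-state screen: each key measurement must stay within
# its own tolerance band (maximum relative deviation from the window mean)
# before the optimization sequence is allowed to start.

def is_steady(history, tolerances):
    """history: dict tag -> list of recent readings;
    tolerances: dict tag -> allowed max relative deviation from the mean."""
    for tag, readings in history.items():
        mean = sum(readings) / len(readings)
        if mean == 0:
            continue  # avoid dividing by zero for dead signals
        spread = max(abs(r - mean) / abs(mean) for r in readings)
        if spread > tolerances[tag]:
            return False  # this measurement is still moving too much
    return True

history = {
    "feed_rate":    [100.2, 100.5, 99.8, 100.1],
    "product_rate": [48.0, 47.9, 48.2, 48.1],
}
tolerances = {"feed_rate": 0.01, "product_rate": 0.01}  # tuned per measurement
print(is_steady(history, tolerances))  # True: all readings within 1% of their means
```

Because each tag has its own entry in `tolerances`, the tolerance for each measurement can be tuned separately, as the text notes.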

Rectification accounts for systematic measurement error. During rectification, measurements that are systematically in error are identified and discarded. Rectification can be done either cyclically or simultaneously with reconciliation, and either intuitively or algorithmically. Simple methods such as data validation and complicated methods using various statistical tests can be used to identify the presence of large systematic (gross) errors in the measurements. Coupled with successive elimination and addition, the measurements with the errors can be identified and discarded. No method is completely reliable. Plant-performance analysts must recognize that rectification is approximate, at best. Frequently, systematic errors go unnoticed, and some bias is likely in the adjusted measurements. [Pg.2549]
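One common algorithmic route, the global (chi-square) test on the process constraints combined with successive elimination, can be sketched as below. The flowsheet, stream values, and unit variances are illustrative assumptions, not data from the text.

```python
import numpy as np

# Global chi-square test for gross errors, followed by successive
# elimination: drop one measurement at a time (by projecting it out of
# the constraints) and see whose removal clears the test.

A = np.array([[1.0, -1.0,  0.0,  0.0],   # node 1: F1 = F2
              [0.0,  1.0, -1.0, -1.0]])  # node 2: F2 = F3 + F4
y = np.array([100.0, 99.5, 60.0, 30.0])  # F4 reads roughly 10 low
V = np.eye(4)                            # unit measurement variances

def global_test(A, y, V):
    """Constraint-residual test statistic; ~chi-square with dof = rows of A."""
    r = A @ y
    S = A @ V @ A.T
    return float(r @ np.linalg.solve(S, r))

def eliminate_and_test(A, y, V, j):
    """Treat stream j as unmeasured: project the constraints onto the null
    space of its column, then redo the test on the remaining measurements."""
    a = A[:, j]
    P = np.eye(A.shape[0]) - np.outer(a, a) / np.dot(a, a)
    keep = [k for k in range(A.shape[1]) if k != j]
    Ar = P @ A[:, keep]
    r = Ar @ y[keep]
    S = Ar @ V[np.ix_(keep, keep)] @ Ar.T
    return float(r @ np.linalg.pinv(S) @ r)  # S loses rank, so pseudo-inverse

print(global_test(A, y, V))  # well above the chi-square(2) critical value 5.99
stats = [eliminate_and_test(A, y, V, j) for j in range(4)]
print(stats)  # only removing F3 or F4 clears the test
```

Note that F3 and F4 appear only in node 2, so elimination cannot distinguish between them; both come out as equally suspect. This echoes the text's caution that no method is completely reliable and rectification is approximate at best.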

Validation versus Rectification The goal of both rectification and validation is the detection and identification of measurements that contain systematic error. Rectification is typically done simultaneously with reconciliation, using the reconciliation results to identify measurements that potentially contain systematic error. Validation typically relies only on other measurements and operating information. Consequently, validation is preferred when measurements and their supporting information are limited. Further, prior screening of measurements limits the possibility that the systematic errors will go undetected in the rectification step and subsequently be incorporated into any conclusions drawn during the interpretation step. [Pg.2566]

Spreadsheet Analysis Once validation is complete, prescreening the measurements using the process constraints as the comparison statistic is particularly useful. This is the first step in the global test discussed in the rectification section. Also, an initial adjustment in component flows will provide the initial point for reconciliation. Therefore, the goals of this prescreening are to ... [Pg.2566]
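A spreadsheet-style prescreen of this kind can be sketched as below: for each node, compare the flow imbalance against a tolerance derived from the measurement uncertainties. The node names, flows, and standard deviations are illustrative assumptions.

```python
import math

# Minimal constraint-based prescreen: flag any node whose measured
# imbalance (sum of inlets minus sum of outlets) exceeds a tolerance
# built from the combined measurement standard deviations.

nodes = {
    # node: (inlet flows, outlet flows, std devs of all its measurements)
    "mixer":    ([100.2, 50.1], [151.0], [1.0, 0.5, 1.5]),
    "splitter": ([151.0], [100.0, 30.0], [1.5, 1.0, 1.0]),  # ~21 unaccounted
}

def prescreen(nodes, z_crit=1.96):
    flags = {}
    for name, (ins, outs, sigmas) in nodes.items():
        imbalance = sum(ins) - sum(outs)
        tol = z_crit * math.sqrt(sum(s * s for s in sigmas))
        flags[name] = abs(imbalance) > tol
    return flags

print(prescreen(nodes))  # {'mixer': False, 'splitter': True}
```

The flagged node points at where an initial adjustment in component flows is needed before reconciliation proper begins.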

The program must require the vendors to measure a number of reference samples and/or duplicates submitted in a planned sequence. It should require prompt measurement and reporting of these data and should maintain the results in a control chart format. Prompt feedback and follow-up of any apparent data discrepancies and reconciliation of the results with control charts maintained by the vendors are required to minimize the length of uncertain performance. The quality assurance plan should include random sampling of the vendors data for their validity and conformance with quality assurance requirements. If quality assurance is properly practiced at all levels, an inspection of 5 percent of the total data output should be adequate. [Pg.106]
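The control-chart screening and 5 percent random inspection described above might be sketched as follows. The limits, reference values, and sampling fraction are illustrative assumptions, not the program's actual parameters.

```python
import math
import random

# Hypothetical control-chart screen for vendor reference-sample results:
# flag any result outside the k-sigma control limits established for the
# reference material, so discrepancies get prompt follow-up.

def control_chart_flags(results, center, sigma, k=3.0):
    lo, hi = center - k * sigma, center + k * sigma
    return [i for i, x in enumerate(results) if not (lo <= x <= hi)]

reference_results = [10.1, 9.9, 10.2, 11.8, 10.0]  # e.g. ppm of an analyte
print(control_chart_flags(reference_results, center=10.0, sigma=0.5))  # [3]

# Random 5 percent inspection of the vendor's reported records
def inspection_sample(record_ids, fraction=0.05, seed=None):
    rng = random.Random(seed)
    n = max(1, math.ceil(fraction * len(record_ids)))
    return rng.sample(record_ids, n)

print(len(inspection_sample(list(range(1000)))))  # 50 records to inspect
```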

Data processing conditioning, reconciliation, and validation of the data evolving from the plant. [Pg.517]

We find the answers to the four questions in the course of the data quality assessment, which is the scientific and statistical evaluation of data to determine if data obtained from environmental data operations are of the right type, quality, and quantity to support their intended use (EPA, 1997a). Part of DQA is data evaluation that enables us to find out whether the collected data are valid. Another part of the DQA, the reconciliation of the collected data with the DQOs, allows us to establish data relevancy. Thus, the application of the entire DQA process to collected data enables us to determine the effect of total error on data usability. [Pg.8]

D1 Data review, verification, and validation D2 Validation and verification methods D3 Reconciliation with user requirements... [Pg.79]

The reconciliation of data with the DQOs is necessary because data validity does not assure data relevancy and usability. Regardless of how excellent the quality of a data set may be, if the data are not relevant, they cannot be used for project decisions. [Pg.289]

The performance of an RTO system depends on the ability of the model to accurately reflect plant behavior, which is observed via process measurements that can be unreliable and subject to disturbances. Therefore, plant measurements are processed and validated in the data validation subsystem before they are used for model updating. The data validation process consists of three steps: 1) steady-state detection, 2) measurement filtering, and 3) data reconciliation and gross error detection. [Pg.2590]
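The reconciliation step in such a subsystem is commonly posed as weighted least squares: adjust the measurements as little as possible, weighted by their variances, so the balances close exactly. A minimal sketch with an assumed single-node flowsheet and illustrative numbers:

```python
import numpy as np

# Weighted least-squares reconciliation for linear constraints A @ y = 0:
# minimize (y - y_hat)' V^-1 (y - y_hat) subject to A @ y_hat = 0, which
# has the closed form  y_hat = y - V A' (A V A')^-1 A y.

A = np.array([[1.0, -1.0, -1.0]])   # one node: F1 - F2 - F3 = 0
y = np.array([100.0, 60.0, 35.0])   # raw measurements (imbalance = 5)
V = np.diag([4.0, 1.0, 1.0])        # measurement variances (sigma^2)

correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ y)
y_hat = y - correction

print(y_hat)      # the least-certain stream (F1) absorbs most of the adjustment
print(A @ y_hat)  # balance residual is now ~0
```

Note how the variance weighting pushes the largest adjustment onto the least trusted measurement; gross error detection (the third step) would run on these same residuals before the reconciled values are passed to model updating.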

Component reconciliation, in warehouse and production, requires knowledge that labelled box quantities have been QC validated. [Pg.80]

The FDA recently confirmed that market feedback showed that mislabelling recalls were not associated with packaging processes with on-line bar code readers. This has led to the relaxation of the need for full component reconciliation, but indirectly upgraded the technical demands on validating bar code readers. [Pg.101]

To put this into context, when bar code readers were just one part of an integrated quality system (e.g. supplier count verification, QC count checks on incoming deliveries, line receipt checks, line cleandown checks and reconciliation), only the function of the bar code readers needed to be the focus of attention. As a stand-alone system, full validation/qualification becomes a necessity and takes the issue away from the line and onto technical support (i.e. QA). Just as with the earlier vision inspection equipment, qualification is not just of the bar code reader and the detection performance, but also of the component quality, e.g. [Pg.101]









© 2024 chempedia.info