
Data collection and analysis

Data Collection and Analysis Pertinent to the EPA's Development of Guidelines for Procurement of Highway Construction Products Containing Recovered Materials, EPA Contract 68-01-6014, Draft, Vol. 1, Issues and Technical Summary, Franklin Associates, Ltd., and Valley Forge Laboratory, Inc., July 6, 1981. [Pg.21]

Bioprocess Control An industrial fermenter is a fairly sophisticated device with control of temperature, aeration rate, and perhaps pH, concentration of dissolved oxygen, or some nutrient concentration. There has been a strong trend to automated data collection and analysis. Analog control is still very common, but when a computer is available for on-line data collection, it makes sense to use it for control as well. More elaborate measurements are performed with research bioreactors, but each new electrode or assay adds more work, additional costs, and potential headaches. Most of the functional relationships in biotechnology are nonlinear, but this may not hinder control when bioprocesses operate over a narrow range of conditions. Furthermore, process control is far advanced beyond the days when the main tools for designing control systems were intended for linear systems. [Pg.2148]
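The on-line control loop described above can be illustrated with a minimal discrete PID sketch. The `make_pid` helper, the gains, and the 37 °C setpoint below are hypothetical, not from the text; a real fermenter loop would be tuned empirically:

```python
# Minimal discrete PID controller for a fermenter temperature loop.
# Gains and setpoint are illustrative placeholders.
def make_pid(kp, ki, kd, setpoint, dt):
    state = {"integral": 0.0, "prev_error": None}

    def step(measurement):
        error = setpoint - measurement
        state["integral"] += error * dt
        deriv = 0.0 if state["prev_error"] is None else (error - state["prev_error"]) / dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * deriv  # heater demand

    return step

controller = make_pid(kp=2.0, ki=0.1, kd=0.5, setpoint=37.0, dt=1.0)
u_cold = controller(34.0)  # below setpoint: positive heating demand
u_hot = controller(40.0)   # above setpoint: demand goes negative
```

The same loop structure applies whether the controlled variable is temperature, dissolved oxygen, or a nutrient feed rate; only the measurement and the actuator change.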

CCPS G-56. 1998. Guidelines for Improving Plant Reliability through Data Collection and Analysis. American Institute of Chemical Engineers, Center for Chemical Process Safety, New York. [Pg.147]

Guidelines for Improving Plant Reliability through Equipment Data Collection and Analysis (1998)... [Pg.553]

This requirement is similar to that in clause 4.14.3 under Preventive action since the data collected for preventive action serves a similar purpose. In one case an analysis of company-level data serves to identify overall trends and predict potential failures that will affect achievement of the goals. In the preventive action case, the data serves to identify local and overall trends and predict potential failures that will affect achievement of specified requirements for the product, process, and quality system. It would be sensible to develop a data collection and analysis system that serves all levels in the organization, with criteria at each level for reporting data upwards as necessary. You should not treat this requirement separately from that for preventive action since the same data should be used. However, the explanation given in clause 4.1.5 of Operational performance does include some factors that may not be addressed in your preventive action procedures. [Pg.144]

If your company has an internal EHS audit group or function, consider enlisting its support, either to help you devise a monitoring protocol, and/or to assist with data collection and analysis. [Pg.178]

Libberton, G. P., A. Bendell, L. A. Walls, and A. G. Cannon. Some Problems Associated with the Collection of Data for Automatic Fire Detection Systems on a Large Industrial Site. Proc. I Mech. E. Conference on Data Collection and Analysis for Reliability Assessment. London, England, 1986. [Pg.236]

SOURCE S. M. Berman, et al. (1976). "Electrical Energy Consumption in California: Data Collection and Analysis." Lawrence Berkeley Laboratory, UGID 3847 (for 1947-1975 data). Association of Home Appliance Manufacturers (for 1972 and 1978-1995 data). [Pg.77]

A high pressure gel permeation chromatograph (GPC) has been used to monitor the performance of the reactor. A novel aspect of the GPC is that, it too, has been put on-line to the process control computer and both data collection and analysis have been made automatic while giving the operator full interactive facilities. [Pg.253]

It is therefore easy to see why this current drug safety paradigm, with its lack of standards in data collection and analysis, hinders the analysis of adverse events. Without data standards in place, it is difficult to build practical, reusable tools for systematic safety analysis. With no standard tools, truly standardized analyses cannot occur. Reviewers may forget their initial analytical processes if they are not using standardized data and tools. Comprehensive reproducibility and auditability, therefore, become nearly impossible. In practice, the same data sets and analytical processes cannot be easily reused, even by the same reviewers who produced the original data sets and analyses. Not using standardized tools slows the real-time systematic analysis... [Pg.652]

We need to transition from quasi-computerized methods, in which the different elements of the analytical process are treated as discrete, paper report tasks, to a comprehensive informatics approach, in which the entire data collection and analysis is considered as a single reusable, extensible, auditable, and reproducible system. Informatics can be defined as the science of storing, manipulating, analyzing, and visualizing information using computer systems. [3]... [Pg.653]

Criteria 1) Relevance to human health endpoints. 2) Sensitivity to change in loadings. 3) Overall historical data quality. 4) Data collection infrastructure. 5) Feasibility of data collection and analysis. 6) Ability to adjust for confounding factors. 7) Understanding of linkages with rest of ecosystem. 8) Broad geographic distribution. 9) Well-known life history (for fauna). 10) Nonintrusive sampling. [Pg.198]

A draft RI report should be produced for review by the support agency and submitted to the Agency for Toxic Substances and Disease Registry (ATSDR) for its use in preparing a health assessment and also to serve as documentation of data collection and analysis in support of the FS. The draft RI report can be prepared any time between the completion of the baseline risk assessment and the completion of the draft FS. Therefore, the draft RI report should not delay the initiation or execution of the FS. [Pg.602]

It is interesting to trace the development of instrument automation over the relatively brief period of the past ten to fifteen years. Early in this period, a truly automated instrument was a rare and expensive item built around a costly dedicated minicomputer. Automated data collection and analysis from any instrument which was not automated at the factory was usually accomplished by digitizing the data and storing it on a transportable medium such as paper tape. These data were then delivered and fed to a timeshare system of some sort on which the data reduction program ran and which printed a report and sometimes a plot of the data. Often a considerable time delay occurred between the generation and the analysis of the data. The scientist was at the mercy of the computer elite who could implement his data logger and provide the necessary computer resources to analyze his data. The process was expensive, both in time and in money. [Pg.3]

There we have it: if data collection and analysis cannot be done now, it is usually because someone doesn't want it to be done. Where then are the new horizons in laboratory automation? We return to the concept of task automation. Task automation involves determining what it is we should be doing, and using automation to accomplish it efficiently. This is a restatement of the now familiar efficiency and effectiveness concept. [Pg.4]

Software for Data Collection and Analysis from a Size-Exclusion Liquid Chromatograph... [Pg.130]

Data collection and analysis. The first data collected on the... [Pg.162]

This efficient statistical test requires the minimum data collection and analysis for the comparison of two methods. The experimental design for data collection has been shown graphically in Chapter 35 (Figure 35-2), with the numerical data for this test given in Table 38-1. Two methods are used to analyze two different samples, with approximately five replicate measurements per sample as shown graphically in the previously mentioned figure. [Pg.187]
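A comparison of this kind can be sketched with a pooled two-sample t statistic computed from five replicates per method. The replicate values below are illustrative placeholders, not the Table 38-1 data:

```python
import statistics as st

def pooled_t(a, b):
    """Two-sample t statistic with pooled variance (equal-variance form)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * st.variance(a) + (nb - 1) * st.variance(b)) / (na + nb - 2)
    return (st.mean(a) - st.mean(b)) / (sp2 * (1.0 / na + 1.0 / nb)) ** 0.5

# Hypothetical replicate results for one sample analyzed by two methods.
method_a = [10.2, 10.4, 10.1, 10.3, 10.2]
method_b = [10.5, 10.7, 10.6, 10.4, 10.6]

t = pooled_t(method_a, method_b)
# |t| is compared with the tabulated critical value for
# na + nb - 2 = 8 degrees of freedom (2.306 at the 95% level).
```

For these made-up data |t| is roughly 4.4, which exceeds 2.306, so the two methods would be judged to differ significantly; the second sample would be tested the same way.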

The 5950A ESCA spectrometer is interfaced to a desktop computer for data collection and analysis. Six hundred watt monochromatic Al Kα X-rays are used to excite the photoelectrons, and an electron gun set at 2 eV and 0.3 mA is used to reduce sample charging. Peak areas are numerically integrated and then divided by the theoretical photoionization cross-sections (11) to obtain relative atomic compositions. For the supported catalyst samples, all binding energies (BE) are referenced to the Al 2p peak at 75.0 eV, the Si 2p peak at 103.0 eV, or the Ti 2p3/2 peak at 458.5 eV. [Pg.45]
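The area/cross-section normalization described above amounts to a few lines of arithmetic. The peak areas and cross-section values below are invented placeholders, not the paper's data:

```python
# Relative atomic composition from XPS peak areas: divide each
# integrated area by its photoionization cross-section, then normalize.
def relative_composition(areas, cross_sections):
    ratios = {el: areas[el] / cross_sections[el] for el in areas}
    total = sum(ratios.values())
    return {el: r / total for el, r in ratios.items()}

composition = relative_composition(
    areas={"Al": 1200.0, "Si": 800.0, "Ti": 300.0},      # integrated peak areas
    cross_sections={"Al": 0.54, "Si": 0.82, "Ti": 7.9},  # hypothetical values
)
```

The resulting fractions sum to one; elements with large cross-sections (here Ti) contribute less to the composition per unit of raw peak area.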

The final section, on analytical chemistry, is a combination of structure-elucidation techniques and instrumental optimizations. Instrumental analysis can be broken into several steps method development, instrumental optimization, data collection, and data analysis. The trend today in analytical instrumentation is computerization. Data collection and analysis are the main reasons for this. The chapters in this section cover all aspects of the process except data collection. Organic structure elucidation is really an extension of data analysis. These packages use spectroscopic data to determine what structural fragments are present and then try to determine... [Pg.403]

A Waters Model 150C ALC/GPC was interfaced to a minicomputer system by means of a microcomputer for automated data collection and analysis. Programs were developed for conventional molecular weight distribution analysis of the data and for liquid chromatographic quantitative composition analysis of oligomeric materials. Capability has been provided to utilize non-standard detectors such as a continuous viscometer detector and spectroscopic detectors for compositional analysis. The automation of the instrument has resulted in greater manpower efficiency and improved record keeping. [Pg.57]
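The conventional molecular weight distribution analysis mentioned above can be sketched as follows; the slice heights and calibration molecular weights are invented for illustration, not taken from the instrument described:

```python
# Number- and weight-average molecular weights from GPC slice data:
# each chromatogram slice has a height h_i (proportional to the weight
# fraction eluting there) and a molecular weight M_i from calibration.
def mn_mw(heights, mol_weights):
    wsum = sum(heights)
    mn = wsum / sum(h / m for h, m in zip(heights, mol_weights))
    mw = sum(h * m for h, m in zip(heights, mol_weights)) / wsum
    return mn, mw

heights = [1.0, 4.0, 6.0, 4.0, 1.0]      # detector response per slice
mol_weights = [5e3, 1e4, 2e4, 4e4, 8e4]  # calibration curve values
mn, mw = mn_mw(heights, mol_weights)
dispersity = mw / mn  # >= 1 for any real distribution
```

Since Mw weights each slice by molecular weight while Mn weights by number of chains, Mw ≥ Mn always, and their ratio (the dispersity) summarizes the breadth of the distribution.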

The management shall motivate its members to undertake quality improvement projects or activities in a consistent and disciplined series of steps based on data collection and analysis. [Pg.123]

The research conducted in support of this chapter was financially supported by ONR/DARPA (N00014-98-1-0776) and NSF (IBN-0321444). Special thanks to S. Rahman and L.P. Dasi for data collection and analysis and to M.J. Weissburg and C.B. Woodson for helpful discussion. [Pg.127]

Now to the crux of the matter. With all the thousands of volumes on statistical methods, what rule of thumb do we use to decide what method fits a specific case? A thought to keep in mind is that precise experimentation merits precise methodology; decisions on which a large dollar sign is hung warrant comparable expenditures for data collection and analysis. Conversely, the simplest available methods should be used ... [Pg.66]

Known samples with stated values of the elements of interest should be circulated, along with the unknowns, for calibrating the methods used in the various laboratories. Standards of known composition should also be included with the unknowns. It is possible that standard archaeological bronzes should be synthesized for this program and for future circulation. Also, the statistical basis of the program should be reassessed and a new statistical design drawn up so that data collection and analysis can be simplified. Since many archaeologists and museums use the services of commercial laboratories, samples should be circulated to these laboratories for paid analysis. [Pg.191]



