Big Chemical Encyclopedia


Data analysis, bottleneck

In Chapter 43 the incorporation of expertise and experience into data analysis by means of expert systems is described. The knowledge acquisition bottleneck and the brittleness of domain expertise are, however, the major drawbacks in the development of expert systems. This has stimulated research on alternative techniques. Artificial neural networks (ANNs) were first developed as models of the structure of the human brain. The computerized version turned out to be well suited to tasks that are difficult to solve with classical techniques. [Pg.649]

A rather crude, but nevertheless efficient and successful, approach is the bond fluctuation model with potentials constructed from atomistic input (Sect. 5). Despite the lattice structure, it has been demonstrated that a rather reasonable description of many static and dynamic properties of dense polymer melts (polyethylene, polycarbonate) can be obtained. If the effective potentials are known, the implementation of the simulation method is rather straightforward, and also the simulation data analysis presents no particular problems. Indeed, a wealth of results has already been obtained, as briefly reviewed in this section. However, even this conceptually rather simple approach of coarse-graining (which historically was also the first to be tried out among the methods described in this article) suffers from severe bottlenecks - the construction of the effective potential is neither unique nor easy, and still suffers from the important defect that it lacks an intermolecular part, thus allowing only simulations at a given constant density. [Pg.153]

In 1994, when the bottleneck of scattering data analysis was still the poor performance of detectors, Rudolph Landes was already pointing to the bottleneck of our days ... [Pg.47]

As vitally important as the capabilities for experimental planning, screening, and data analysis are the procedures for preparation of inorganic catalysts. In contrast to the procedures usually applied in conventional catalyst synthesis, the synthetic techniques have to be adapted to the number of catalysts required in the screening process. Catalyst production can become a bottleneck and it is therefore necessary to ensure that HTE- and CombiChem-capable synthesis technologies are applied to ensure a seamless workflow. [Pg.385]

Nowadays, MS is often no longer the analytical bottleneck, but rather what precedes it (sample preparation) and follows it (data handling, searching). Direct mass-spectrometric methods have to compete with the separation techniques such as GC, HPLC and SFC that are commonly used for quantitative analysis of polymer additives. Extract analysis has the general advantage that higher-molecular-weight (less-volatile) additives can be detected more readily than by direct analysis of the polymer compound. [Pg.350]

As shown in Figure 7.2, most assays involve a common series of steps that must be completed in order to report results. These steps include sample receipt, method development, sample preparation, analysis, data processing, and data reporting. While most researchers focus on speeding the analysis step, any of these steps can become bottlenecks. Thus it is important to optimize the whole process. [Pg.207]
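The point above can be made concrete with a minimal sketch. The step names follow the text; the timings are purely hypothetical, chosen only to show that total turnaround is the sum of all steps, so speeding up analysis alone leaves the slowest step as the bottleneck:

```python
# Illustrative step times (minutes); values are assumptions, not from the source.
step_minutes = {
    "sample receipt": 5,
    "method development": 0,   # amortized across a batch once developed
    "sample preparation": 30,
    "analysis": 10,
    "data processing": 15,
    "data reporting": 5,
}

# Turnaround per sample is the sum of every step, not just the analysis step.
turnaround = sum(step_minutes.values())

# The rate-limiting (bottleneck) step is simply the slowest one.
bottleneck = max(step_minutes, key=step_minutes.get)

print(f"turnaround per sample: {turnaround} min")
print(f"bottleneck step: {bottleneck}")
```

With these assumed numbers, halving the 10-minute analysis step would shave only a few percent off the 65-minute turnaround, which is why the text stresses optimizing the whole process.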

High-throughput laboratories have turned to assay automation, N-in-one (sample pooling) analysis strategies, and elaborate set-ups for parallel chromatography [30-33] to increase capacity and decrease turn-around time. Despite the relatively fast speed of HPLC/MS, this step still creates a bottleneck in the ADME work flow. Xu et al. [32] reported a fast method for microsomal sample analysis that yields 231 data points per hour using a complex eight-column HPLC/MS set-up. [Pg.237]

The bottleneck in utilizing Raman spectroscopy shifted rapidly from data acquisition to data interpretation. Visual differentiation works well when polymorph spectra are dramatically different or when reference samples are available for comparison, but it is poorly suited for automation, for spectrally similar polymorphs, or when the form was previously unknown [231]. Spectral match techniques, such as those used in spectral libraries, help with automation but can have trouble when the reference library is too small. Easily automated clustering techniques, such as hierarchical cluster analysis (HCA) or PCA, group similar spectra and provide information on the degree of similarity within each group [223,230]. These techniques operate best on large data sets. As an alternative, researchers at Pfizer tested several different analysis of variance (ANOVA) techniques, along with descriptive statistics, to identify different polymorphs from measurements of Raman... [Pg.225]
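The kind of easily automated grouping described above can be sketched in a few lines. This is a pure-Python, single-linkage version of hierarchical clustering; the three "spectra" are hypothetical intensity vectors standing in for real Raman data, and the names are invented for illustration:

```python
# Single-linkage hierarchical clustering (HCA) sketch on toy "spectra".
# The intensity vectors and sample names are assumptions, not real data.
from math import dist

spectra = {
    "form_A_run1": [1.0, 5.0, 1.0, 0.2],
    "form_A_run2": [1.1, 4.9, 1.0, 0.3],
    "form_B_run1": [4.0, 0.5, 3.0, 2.0],
}

def single_linkage(clusters):
    """Merge the two clusters with the smallest pairwise spectrum distance."""
    names = list(clusters)
    best = min(
        ((a, b) for i, a in enumerate(names) for b in names[i + 1:]),
        key=lambda ab: min(dist(x, y)
                           for x in clusters[ab[0]]
                           for y in clusters[ab[1]]),
    )
    a, b = best
    merged = {n: v for n, v in clusters.items() if n not in best}
    merged[f"({a}+{b})"] = clusters[a] + clusters[b]
    return merged

# Start with each spectrum in its own cluster, merge until two groups remain
# (i.e. two candidate polymorph families).
clusters = {name: [vec] for name, vec in spectra.items()}
while len(clusters) > 2:
    clusters = single_linkage(clusters)

print(sorted(clusters))
```

The two replicate "form A" spectra end up grouped together while the dissimilar spectrum stays apart, which mirrors how HCA groups similar polymorph spectra and reports the degree of similarity within each group.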

The rapid growth in LC/MS productivity resulted in the production of massive amounts of data. Thus, with the increased productivity experienced with modern analysis systems, the bottleneck quickly shifted to data interpretation and management. Approaches that feature the visualization of data help to provide meaningful information for decision making. [Pg.58]

There are several other factors that are important in the selection of equipment for a measurement process. These parameters are items 7 to 13 in Table 1.2. They may be more relevant in sample preparation than in analysis. As mentioned before, the bottleneck is very often the sample preparation rather than the analysis. The former tends to be slower; consequently, both measurement speed and sample throughput are determined by the discrete steps within the sample preparation. Modern analytical instruments tend to have a high degree of automation in terms of autoinjectors, autosamplers, and automated control/data acquisition. On the other hand, many sample preparation methods continue to be labor-intensive, requiring manual intervention. This prolongs analysis time and introduces random/systematic errors. [Pg.15]

To avoid bottlenecks in the analysis, each step should be automated. Automation of data acquisition is achieved by automated HPLC/UV/MS/ELSD instruments, equipped with autosamplers that allow compounds to be batch analyzed in unattended fashion. Recently, fast generic HPLC methods have been implemented for analysis of combinatorial libraries [33-53]. [Pg.252]

The major bottleneck created by these high-throughput NMR techniques is in the analysis of the vast amount of data that is generated. A number of commercial packages are now available that use chemical shift/structure databases to aid in the interpretation of the spectra. However, fully automated spectral analysis systems are still under development. [Pg.124]
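The database-aided interpretation mentioned above can be illustrated with a minimal sketch: match observed chemical shifts (in ppm) against a small shift/structure table by checking which tabulated ranges each shift falls into. The fragment names and shift ranges here are a tiny hand-picked illustration, not the contents of any commercial package's database:

```python
# Hypothetical 1H chemical shift/structure table: fragment -> (low, high) ppm.
shift_table = {
    "aldehyde CHO": (9.0, 10.0),
    "aromatic CH": (6.5, 8.5),
    "alkyl CH3": (0.8, 1.2),
}

def assign(observed_ppm):
    """Return the candidate fragment(s) whose range covers each observed shift."""
    return {
        ppm: [frag for frag, (lo, hi) in shift_table.items() if lo <= ppm <= hi]
        for ppm in observed_ppm
    }

hits = assign([9.7, 7.2, 1.0])
print(hits)  # each observed shift mapped to matching fragment(s)
```

Real systems consult far larger databases and must resolve overlapping ranges and coupling patterns, which is why, as the text notes, fully automated spectral analysis remains an open problem.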

Those data formed the basis for subsequent analysis of hazard sources, bottlenecks, and problem issues typical of the current situation at radiation-hazardous facilities in North-Western Russia. [Pg.19]



© 2024 chempedia.info