Data manipulations

During the filtration process the cake progressively grows upwards from the filter medium surface, liquid is expelled from the press, and the piston moves downwards in the direction of the medium. When the particles become sufficiently networked (which can be from the very start of the test), or enough cake has formed for the piston to touch the top of the cake, further deliquoring is achieved by consolidation, where liquid is expelled as the porosity of the cake decreases. Consolidation continues until an equilibrium state is reached such that the particle structure/cake is sufficiently strong to withstand any tendency to compress under the applied piston loading. [Pg.180]

Determination of the transition between filtration and consolidation is critical to the successful analysis of data; it is associated with a unique thickness of the solid/liquid mixture, L, which can be calculated from [Pg.180]

The value of L can be easily measured during a normal expression test. [Pg.180]

The volume of the solid/liquid mixture in the press at the start of the test (V₀) and ω₀ are given by [Pg.181]

Whichever data sequence is recorded, the average solids volume fraction, the ratio of the mass of the mixture to the mass of dry solids, the average porosity (ε), and the average moisture content in the press at any time are given by [Pg.182]
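These averaged quantities follow from a simple mass balance over the press contents once the piston position (and hence the mixture height) is logged. The sketch below is a minimal illustration of that bookkeeping, assuming a cylindrical press of known area, a known dry solids charge, and known solid and liquid densities; the symbol names, default densities, and the wet-basis moisture definition are this sketch's own choices, not necessarily those of the source text.

```python
def press_averages(height_m, area_m2, solids_mass_kg,
                   rho_solid=2650.0, rho_liquid=1000.0):
    """Average cake/suspension properties in a piston press.

    height_m       : measured mixture height between medium and piston (m)
    area_m2        : press cross-sectional area (m^2)
    solids_mass_kg : mass of dry solids charged to the press (kg)
    rho_solid, rho_liquid : assumed densities (kg/m^3); illustrative defaults
    """
    v_total = area_m2 * height_m                 # mixture volume in the press
    v_solid = solids_mass_kg / rho_solid         # volume occupied by the solids
    v_liquid = v_total - v_solid                 # remaining volume is liquid
    liquid_mass = rho_liquid * v_liquid

    solids_fraction = v_solid / v_total          # average solids volume fraction
    porosity = 1.0 - solids_fraction             # average porosity
    mass_ratio = (solids_mass_kg + liquid_mass) / solids_mass_kg  # mixture/dry solids
    moisture_wet_basis = liquid_mass / (solids_mass_kg + liquid_mass)

    return solids_fraction, porosity, mass_ratio, moisture_wet_basis


# Example: 0.05 m of mixture in a 0.01 m^2 press holding 0.4 kg of dry solids
print(press_averages(0.05, 0.01, 0.4))
```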


O. Vailhen, E. Fleuet, B. Nouailhas, A. Schumm, "TRAPPIST: a standard open environment for multi-modal NDT data manipulation," Proceedings of the 6th ECNDT, Nice, October 1994, pp. 45-49. [Pg.928]

Apart from the actual acquisition of the mass spectrum and its subsequent display or printout, the raw mass spectral data can be processed in other ways, many of which have been touched on in other chapters of this book. Some of the more important aspects of this sort of data manipulation are explained in greater detail below. [Pg.322]

Continued advances in analytical instrumentation have resulted in improvements in the characterization and quantification of chemical species. Many of these advances have resulted from the incorporation of computer technology to provide increased capabilities in data manipulation and to allow more sophisticated control of instrumental components and experimentation. The development of miniaturized electronic components built from nondestructible materials has also played a role, as has the advent of new detection devices such as sensors (qv). Analytical instrumentation capabilities, especially within complex mixtures, are expected to continue to grow into the twenty-first century. [Pg.396]

Finally, it cannot be overemphasized that, despite instrumental measurements and data manipulations, it is the perception of the eye that is still the final arbiter as to whether, or to what degree, two colors match. Instrumental methods do serve well for the typical industrial task of maintaining consistency under sufficiently well-standardized conditions; however, a specific technique may not serve in extreme or unusual conditions for which it was not designed. [Pg.416]

The graphics capabilities of the CAD/CAM environment offer a number of opportunities for data manipulation, pattern recognition, and image creation. The direct application of computer graphics to the automation of graphical solution techniques, such as the McCabe-Thiele binary distillation method, or to the preparation of data plots are obvious examples. Graphic simulation has been applied to the optimization of chemical process systems as a technique for energy analysis (84). [Pg.64]
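As an illustration of automating such a graphical construction, the sketch below steps off theoretical stages for a McCabe-Thiele analysis numerically rather than drawing the diagram. It assumes constant relative volatility, constant molar overflow, and a saturated-liquid feed; all symbols and numerical values are illustrative and are not taken from the source.

```python
def mccabe_thiele_stages(xD, xB, zF, R, alpha):
    """Count theoretical stages by stepping between the equilibrium curve
    and the operating lines (constant relative volatility, saturated-liquid feed)."""
    x_eq = lambda y: y / (alpha - (alpha - 1.0) * y)   # inverse of y = a*x/(1+(a-1)*x)

    # Rectifying operating line: y = R/(R+1) x + xD/(R+1).
    # Saturated-liquid feed => vertical q-line at x = zF, so the stripping line
    # runs from (xB, xB) to the rectifying line evaluated at x = zF.
    y_feed = R / (R + 1.0) * zF + xD / (R + 1.0)
    slope_strip = (y_feed - xB) / (zF - xB)

    def operating_line(x):
        if x >= zF:                                     # rectifying section
            return R / (R + 1.0) * x + xD / (R + 1.0)
        return xB + slope_strip * (x - xB)              # stripping section

    stages, x, y = 0, xD, xD
    while x > xB and stages < 100:                      # guard against a pinch
        x = x_eq(y)              # horizontal step to the equilibrium curve
        y = operating_line(x)    # vertical step down to the operating line
        stages += 1
    return stages


# Example: 95 mol% distillate, 5 mol% bottoms, 50 mol% feed, R = 2, alpha = 2.5
print(mccabe_thiele_stages(0.95, 0.05, 0.50, 2.0, 2.5))
```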

Has the capacity to store and manage MSDS and chemical inventory form data. Data manipulation includes cross-indexing lists to identify all facilities using a particular chemical. [Pg.278]
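Cross-indexing of this kind amounts to inverting a facility-to-chemicals inventory into a chemical-to-facilities lookup. A minimal sketch, with invented facility and chemical names used purely for illustration:

```python
from collections import defaultdict

# Hypothetical inventory data: facility -> chemicals on its MSDS/inventory forms
inventory = {
    "Plant A": ["toluene", "acetone", "sulfuric acid"],
    "Plant B": ["acetone", "methanol"],
    "Warehouse 3": ["toluene", "methanol"],
}

# Invert the mapping: chemical -> all facilities that hold it
chemical_index = defaultdict(set)
for facility, chemicals in inventory.items():
    for chemical in chemicals:
        chemical_index[chemical].add(facility)

print(sorted(chemical_index["toluene"]))   # ['Plant A', 'Warehouse 3']
```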

Across the top there is a menu bar with the usual Windows-type pull-down menus arranged from left to right in the order Files, Data Selection, Data Manipulation, Extras/Options, Output, or similar. Those options that are allowed or make sense in a given context are activated. Requests for numerical input make use of the standard Windows-type gray box with the question that is to be answered, the white area into which the data is written, and the appropriate confirmatory Yes/No/Cancel buttons. [Pg.362]

It should be stressed here that feature selection is not only a data manipulation operation, but may have economic consequences. For instance, one could decide on the basis of the results described above to reduce the number of different tests for a EU/HYPO discrimination problem to only two. A less straightforward problem with which the decision maker is confronted is deciding how many tests to carry out for a EU/HYPER discrimination. One loses some 3% in selectivity by eliminating one test. The decision maker must then compare the economic benefit of carrying out one test less with the loss entailed by a somewhat smaller diagnostic success; in effect, he carries out a cost-benefit analysis. This is only one of the many instances where an analytical (or clinical) chemist may be confronted with such a situation. [Pg.237]
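The trade-off described here can be written down directly: compare the saving from dropping one test against the expected cost of the extra misclassifications. In the sketch below every number (test cost, selectivity loss, cost per misdiagnosis, patient volume) is a purely hypothetical placeholder chosen for illustration.

```python
# Hypothetical figures for illustration only
cost_per_test = 4.0            # currency units saved per patient by dropping one test
selectivity_loss = 0.03        # ~3% more misclassifications without that test
cost_per_misdiagnosis = 250.0  # assumed cost of one diagnostic failure
patients_per_year = 10_000

saving = cost_per_test * patients_per_year
expected_loss = selectivity_loss * cost_per_misdiagnosis * patients_per_year

print(f"annual saving: {saving:.0f}, expected extra loss: {expected_loss:.0f}")
print("drop the test" if saving > expected_loss else "keep the test")
```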

Demonstration of the integrity and performance of a complete analytical system, from initial sampling to data manipulation... [Pg.237]

Despite the complexity of the experiments and the enormous data manipulation necessary, complex biological pathways, as well as new drug targets, are being identified by this method. Examples include screens for compounds that arrest cells in mitosis, block cell migration, or block the secretory pathway [50]; assays of primary T cells from PLP TCR transgenic mice for inhibitory activity on the proliferation and secretion of proinflammatory cytokines in PLP-reactive T cells [51]; and the identification of small-molecule inhibitors of histone acetyltransferase activity [52]. [Pg.49]

Categorical Data and Why Zero and Missing Results Differ Greatly 102
Performing Many-to-Many Comparisons/Joins 106
Using Medical Dictionaries 108
Other Tricks and Traps in Data Manipulation 112
Common Analysis Data Sets 118
Critical Variables Data Set 118
Change-from-Baseline Data Set 118
Time-to-Event Data Set 121... [Pg.83]

For medium and large networks, the occurrence matrix, which has the same structure as (i.e., is isomorphic to) the coefficient matrix of the governing equations, is usually quite sparse. For example, Stoner (S5) showed a 155-vertex network with a density of 3.2% for the occurrence matrix (i.e., 775 nonzeros out of a total of 155² entries) using formulation C. Still lower densities have been observed on larger networks. In these applications it is of paramount importance that the data structure and data manipulations take full advantage of the sparsity of the governing equations. Sparse computation techniques are also needed in order to capture the full benefit of cycle selection and row and column reordering. [Pg.166]

We offer these two examples of data storage schemes in order to illustrate the interrelationship between data structure, storage requirement, and the types of operations to be performed. The specific data structure and data manipulation techniques to be used should always be tailored to the structure of the matrix and the requirements of the application. In point of fact, both schemes I and II can be modified to overcome some of the stated deficiencies. Gustavson (G9) discussed modifications of scheme I to permit both row- and column-oriented operations and to accommodate fill-ins ... [Pg.167]
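The excerpt does not reproduce schemes I and II themselves, but the modern equivalents of such row- and column-oriented sparse storage are the CSR and CSC layouts. The sketch below builds both for a small matrix and reports its density, purely as an illustration of the kind of bookkeeping involved, not of the original schemes.

```python
import numpy as np
from scipy import sparse

# A small, mostly-zero coefficient matrix standing in for an occurrence matrix
A = np.array([
    [4.0, 0.0, 0.0, 1.0],
    [0.0, 3.0, 0.0, 0.0],
    [0.0, 0.0, 5.0, 2.0],
    [1.0, 0.0, 0.0, 6.0],
])

csr = sparse.csr_matrix(A)   # row-oriented: fast row slicing / row operations
csc = sparse.csc_matrix(A)   # column-oriented: fast column slicing / reordering

density = csr.nnz / (A.shape[0] * A.shape[1])
print(f"nonzeros: {csr.nnz}, density: {density:.1%}")
print("CSR data:", csr.data, "indices:", csr.indices, "indptr:", csr.indptr)
print("CSC data:", csc.data, "indices:", csc.indices, "indptr:", csc.indptr)
```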

Fourier transformation without data manipulation leads to the multiplet at the bottom (a), which shows more fine structure when a negative LB value is used (b). The spectrum in the middle (c) results from use of the SSB function, and now all eight lines are clearly visible as the linewidth is much smaller. The price we pay is that the lineshape is completely changed, the positive central... [Pg.8]
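The data manipulation referred to here is apodization of the FID before Fourier transformation: an exponential multiplication controlled by the LB parameter, or a shifted sine-bell (SSB-type) window. The sketch below applies both to a synthetic FID with NumPy; the spectral width, decay constants, and window parameters are illustrative choices, not the processing values used for the figure in the source.

```python
import numpy as np

sw = 2000.0                      # spectral width, Hz (illustrative)
n = 4096
t = np.arange(n) / sw            # acquisition time axis

# Synthetic FID: two closely spaced lines plus a little noise
fid = (np.exp(2j * np.pi * 100.0 * t) + np.exp(2j * np.pi * 103.0 * t)) \
      * np.exp(-t / 0.5) + 0.02 * np.random.randn(n)

# Exponential multiplication: LB > 0 broadens (smooths), LB < 0 sharpens but adds noise
LB = -1.0                                        # Hz
em = np.exp(-np.pi * LB * t)

# Shifted sine-bell window (resolution enhancement, at the cost of the lineshape)
shift = np.pi / 4
sine_bell = np.sin(shift + (np.pi - shift) * np.arange(n) / n)

spec_plain = np.fft.fftshift(np.fft.fft(fid))        # no data manipulation
spec_em    = np.fft.fftshift(np.fft.fft(fid * em))   # negative LB
spec_ssb   = np.fft.fftshift(np.fft.fft(fid * sine_bell))

print(abs(spec_plain).max(), abs(spec_em).max(), abs(spec_ssb).max())
```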

In the data manipulation module, input data sourced from several databases and from the hydrology module are transformed into appropriate geographical information system (GIS) formats [61]. Before that, the hydrology module combines several hydrological databases with a hydrological model, providing to the... [Pg.56]

Modeling offers a spreadsheet-like capability, which permits the integration of several different spreadsheets and the inclusion of general data manipulation commands in cells. [Pg.25]

In the next section we briefly compare recursion augmented schemes with program schemes augmented by other data manipulation mechanisms - pushdown stores, labels and arrays. [Pg.268]

These initial experiments show that results can be obtained from this system that are comparable to those from the continuous flow reactor. The analytical system satisfies the requirements for accurate and rapid repetitive analysis. Scanning of 12 masses is possible at rates of approximately 100 ms/scan with good results. Further data manipulations are expected to yield additional results from this type of experiment. [Pg.252]

The different dialects of XML (XHTML, KML) are constrained by XML schemas (W3C, 2004). These schemas are critical to the success of XML. They are used to ensure that an XML file adheres to a well-defined structure. Schemas are themselves XML files, which must conform to the XSD specification. Schema designers are free to develop constraints to varying degrees. Forcing an XML file to be compatible with a tightly-constrained schema frees developers from having to write their own data validation procedures. This leads to a great simplification of data manipulation software. [Pg.391]
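A minimal sketch of that validation step, using the lxml library (this excerpt does not name a particular toolkit) and hypothetical file names:

```python
from lxml import etree

# Hypothetical file names; any XSD/XML pair would do
schema = etree.XMLSchema(etree.parse("records.xsd"))
doc = etree.parse("records.xml")

if schema.validate(doc):
    print("document conforms to the schema")
else:
    # The schema engine reports what is wrong, so no hand-written checks are needed
    for error in schema.error_log:
        print(error.line, error.message)
```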

Through the use of an RS 232C interface, both instruments may be connected to Perkin-Elmer computers for instrument control and external data manipulation. [Pg.29]

Network (including communication links). Legacy system hardware and software have very limited security capabilities, and the vulnerabilities of contemporary systems (based on modern information technology) are well publicized. Wireless and shared links are susceptible to eavesdropping and data manipulation. [Pg.123]

Spector, P. Data Manipulation with R. Springer, New York, 2008. [Pg.326]

A principal components multivariate statistical approach (SIMCA) was evaluated and applied to the interpretation of isomer-specific analysis of polychlorinated biphenyls (PCBs) using both a microcomputer and a mainframe computer. Capillary column gas chromatography was employed for separation and detection of 69 individual PCB isomers. Computer programs were written in ANSI MUMPS to provide a laboratory data base for data manipulation. This data base greatly assisted the analysts in calculating isomer concentrations and in data management. Applications of SIMCA for quality control, classification, and estimation of the composition of multi-Aroclor mixtures are described for the characterization and study of complex environmental residues. [Pg.195]
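SIMCA builds a separate principal components model for each class and assigns new samples according to how well each class model reconstructs them. The sketch below is a much-simplified version of that idea using scikit-learn: classification by smallest reconstruction residual only, without the statistically defined class boundaries of full SIMCA, and on randomly generated data rather than PCB congener profiles.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic "congener profile" data: two classes with different structure
class_a = rng.normal(0.0, 1.0, (40, 10)) @ rng.normal(size=(10, 10))
class_b = rng.normal(1.0, 1.0, (40, 10)) @ rng.normal(size=(10, 10))
training = {"A": class_a, "B": class_b}

# One PCA model per class (disjoint class modelling)
models = {label: PCA(n_components=3).fit(data) for label, data in training.items()}

def classify(sample):
    """Assign the sample to the class whose PCA model reconstructs it best."""
    residuals = {}
    for label, model in models.items():
        scores = model.transform(sample.reshape(1, -1))
        reconstruction = model.inverse_transform(scores)
        residuals[label] = float(np.sum((sample - reconstruction) ** 2))
    return min(residuals, key=residuals.get), residuals

print(classify(class_a[0]))
print(classify(class_b[0]))
```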


See other pages where data manipulations are mentioned: [Pg.696]    [Pg.394]    [Pg.396]    [Pg.1826]    [Pg.522]    [Pg.160]    [Pg.245]    [Pg.469]    [Pg.4]    [Pg.63]    [Pg.1075]    [Pg.293]    [Pg.337]    [Pg.112]    [Pg.292]    [Pg.106]    [Pg.166]    [Pg.62]    [Pg.253]    [Pg.7]    [Pg.339]    [Pg.35]    [Pg.375]





Data Manipulation After the Fourier Transform

Data Manipulation Before the Fourier Transform

Data management and manipulation

Data manipulation, molecular

Detectors data manipulation

Expression data manipulation

Field data, manipulation

Manipulation of data

Manipulations and NONMEM Data Base Creation

Optical data manipulations

Other Tricks and Traps in Data Manipulation

Scan data, manipulation

Spectral Data Manipulation
