Big Chemical Encyclopedia


Performance data post processing

The optimum time step in a FEM-CVA simulation is the one that fills exactly one new control volume. Once the fill factors are updated, the simulation solves for a new pressure and flow field, and this cycle is repeated until all fill factors are 1. Although the FEM-CVA scheme does not know exactly where the flow front lies, flow front information can be recovered quite accurately in post-processing. One very common technique is for the simulation program to record the time at which each node becomes half full, f = 0.5. This is done while the nodal fill factors are updated: if a node has fk < 0.5 and fk+1 > 0.5, then the time at which the fill factor was 0.5 is found by interpolating between tk and tk+1. These half-times are then treated as nodal data, and the flow front or filling pattern at any time is drawn as a contour of the corresponding half-times, or isochronous curves. [Pg.495]
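The interpolation step above can be sketched in a few lines. This is an illustrative sketch, not code from the source; the function name and arguments are hypothetical, and it assumes simple linear interpolation between the two stored time steps that bracket the half-full state.

```python
# Sketch (hypothetical names): recover the nodal "half-time" f = 0.5 by
# linear interpolation between consecutive time steps, as described above.

def half_time(t_k, t_k1, f_k, f_k1, target=0.5):
    """Time at which the fill factor crossed `target`, assuming the
    fill factor varies linearly between t_k and t_k1."""
    if not (f_k < target <= f_k1):
        raise ValueError("fill factor did not cross the target this step")
    frac = (target - f_k) / (f_k1 - f_k)  # linear interpolation weight
    return t_k + frac * (t_k1 - t_k)

# Example: a nodal fill factor rises from 0.2 to 0.8 between t = 10 s and t = 12 s
print(half_time(10.0, 12.0, 0.2, 0.8))  # -> 11.0
```

Collecting these half-times over all nodes gives the nodal field whose contours are the isochronous flow-front curves.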

An important way to improve network performance is through the use of prior knowledge, which refers to information that one has about the desired form of the solution and which is additional to the information provided by the training data. Prior knowledge can be incorporated into the pre-processing and post-processing stages (Chapter 7), or into the network structure itself. [Pg.89]

With this in mind, a reasonable compromise is to consider an FEM implementation of the modelling of stress-assisted diffusion built on top of the previously (or simultaneously) performed stress analysis. The nodal values of the stresses, obtained with a post-processing technique, serve as the entry data for the diffusion problem; that is, a finite-element approximation of the stress field is constructed with the same finite-element shape functions used in the mechanical analysis to approximate the displacement fields. [Pg.135]
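As a minimal sketch of that idea (not code from the source), the snippet below interpolates nodal stress values with the standard linear shape functions of a one-dimensional two-node element; the element type and the nodal values are assumptions for illustration only.

```python
# Hypothetical sketch: evaluate a stress field from nodal values using the
# same linear shape functions an FEM code would use for displacements.
import numpy as np

def shape_functions(xi):
    """Linear shape functions of a two-node element, xi in [-1, 1]."""
    return np.array([0.5 * (1.0 - xi), 0.5 * (1.0 + xi)])

def interpolate(nodal_values, xi):
    """Field value at local coordinate xi from the nodal values."""
    return shape_functions(xi) @ np.asarray(nodal_values, dtype=float)

# Nodal stresses recovered in post-processing become the entry data for
# the diffusion analysis, evaluated anywhere inside the element:
sigma_nodes = [120.0, 180.0]          # illustrative nodal stresses (MPa)
print(interpolate(sigma_nodes, 0.0))  # element midpoint -> 150.0
```

Using the same shape functions for stress as for displacement keeps the two analyses consistent on a shared mesh, which is the point of the compromise described above.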

Dedicated data systems perform four functions: (1) control of all operational processes of both the mass spectrometer and integrated peripheral instruments, such as GC or LC systems; (2) acquisition and processing of all data; (3) local interpretation of acquired data; and (4) post-processing of data, including interaction with databases (almost always via the Internet) (Figure 2.44). Connection to the Internet also enables the remote control of multiple systems as well as the off-site diagnosis of failures by instrument manufacturers. [Pg.108]

Autocorrelation (see Chapter 2) can be performed on raw time series data collected by MCS cards such as those described earlier, although there are a number of inefficiencies associated with post-processing autocorrelations (see Chapter 2), not least that long data acquisitions have to be performed and the analysis has to be carried out before one can tell whether the experiment has been at all successful. Hardware digital correlators [46] (e.g. the ALV-5000 series, ALV GmbH, Germany) can take the digital output from APD modules, directly perform an auto- or cross-correlation, and display the result in real time. The method of operation of one particular hardware correlator was covered in detail in Chapter 2, Section 2.4.2. [Pg.141]
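A post-processing autocorrelation of a stored trace can be sketched as below. This is an illustrative NumPy sketch, not the algorithm of any particular correlator; the normalization (the intensity correlation divided by the squared mean) and all names are assumptions for illustration.

```python
# Sketch: normalized autocorrelation g(tau) = <x(t) x(t+tau)> / <x>^2
# computed in post-processing from a recorded count trace.
import numpy as np

def autocorrelation(x, max_lag):
    """Normalized autocorrelation of trace x for lags 0..max_lag-1."""
    x = np.asarray(x, dtype=float)
    mean_sq = x.mean() ** 2
    return np.array([np.mean(x[:len(x) - lag] * x[lag:]) / mean_sq
                     for lag in range(max_lag)])

rng = np.random.default_rng(0)
trace = rng.poisson(5.0, size=10_000)   # synthetic photon-count trace
g = autocorrelation(trace, 16)
print(g[0])  # zero-lag term exceeds 1 for any non-constant trace
```

The inefficiency noted above is visible here: the whole trace must exist before g can be computed, whereas a hardware correlator updates the same quantity incrementally and displays it in real time.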

Because of the limits of industrial equipment and cost constraints, curing is done at a constant temperature for a period of time, either to initially cure the material or to post-cure it. (The kinetic models discussed in the next section also require data collected under isothermal conditions.) This is also how rubber samples are cross-linked, how initiated reactions are run, and how bulk polymerizations are performed. Industrially, continuous processes, as opposed to batch, often require an isothermal approach. UV light and other forms of nonthermal initiation also use isothermal studies for examining the cure at a constant temperature. [Pg.2307]

In hardware, make sure that there is no voltage on the bunching parameter (perform the imaging measurements in unbunched mode for optimal image quality). Before the start of the experiment, select save as raw file under acquisition setup/advanced settings, so that the raw data can be post-processed after the measurement is completed. [Pg.204]

Thus, the process of model testing and validation (considered synonymous) should ideally include the steps of calibration (if necessary), verification, and post-audit analyses. I say "ideally" because in many applications the existing data will not support performance of all steps. In chemical fate modeling, chemical data for verification are often lacking, and post-audit analyses are (unfortunately) rare for any type of modeling exercise. [Pg.154]

With triple quadrupole MS, automatically acquired, sensitive full-scan CID spectra can serve as a powerful tool in metabolite identification because of the flexibility they allow in post-acquisition data processing. [Pg.327]

