Big Chemical Encyclopedia


Obtaining data fundamental concepts

As far as quantitative GC analysis with chemical derivatization is concerned, the work of Gehrke and his collaborators deserves special mention. They specified the fundamental concepts of quantitative GC analysis combined with chemical derivatization of sample compounds and applied them to the accurate determination of the twenty natural protein amino acids and of non-protein amino acids as their N-TFA n-butyl esters [5], to the urinary excretion level of methylated nucleic acid bases as their TMS derivatives [6], to TMS nucleosides [7], and to other investigations. Further examples include a computer program for processing the quantitative GC data obtained for seventeen triglyceride fatty acids after their transesterification with 2 N KOH in n-butanol [8], a study of the kinetics of the transesterification of dimethyl terephthalate with ethylene glycol [9], and the GC-MS determination of chlorophenols in spent bleach liquors after isolation of the chlorophenols by multi-step extraction, purification of the final extract by HPLC, and derivatization with diazoethane [10]. [Pg.26]
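Quantitative derivatization work of this kind ultimately rests on calibrated peak-area ratios. A minimal sketch of the generic internal-standard scheme follows; all peak areas and amounts are invented for illustration and are not data from the cited studies:

```python
# Illustrative sketch: internal-standard quantitation as commonly used
# in quantitative GC. All peak areas and amounts are hypothetical.

def response_factor(area_analyte, amount_analyte, area_is, amount_is):
    """Relative response factor (RRF) from a calibration run."""
    return (area_analyte / amount_analyte) / (area_is / amount_is)

def quantify(area_analyte, area_is, amount_is, rrf):
    """Amount of analyte in a sample run, given the RRF."""
    return (area_analyte / area_is) * amount_is / rrf

# Calibration run: 10 ug analyte and 10 ug internal standard injected
rrf = response_factor(area_analyte=5200.0, amount_analyte=10.0,
                      area_is=4800.0, amount_is=10.0)

# Sample run with the same amount of internal standard
amount = quantify(area_analyte=2600.0, area_is=4800.0,
                  amount_is=10.0, rrf=rrf)
print(round(amount, 2))  # prints 5.0
```

The internal standard cancels injection-to-injection variability, which is why derivatization-based GC methods almost always report amounts relative to a co-injected reference compound.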

The counterflow configuration has been extensively utilized to provide benchmark experimental data for the study of stretched flame phenomena and the modeling of turbulent flames through the concept of laminar flamelets. Global flame properties of a fuel/oxidizer mixture obtained using this configuration, such as laminar flame speed and extinction stretch rate, have also been widely used as target responses for the development, validation, and optimization of a detailed reaction mechanism. In particular, extinction stretch rate represents a kinetics-affected phenomenon and characterizes the interaction between a characteristic flame time and a characteristic flow time. Furthermore, the study of extinction phenomena is of fundamental and practical importance in the field of combustion, and is closely related to the areas of safety, fire suppression, and control of combustion processes. [Pg.118]
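The competition between a characteristic flame time and a characteristic flow time is commonly expressed as a Karlovitz number. A minimal sketch follows; the property values are assumed for a generic hydrocarbon/air flame and are not taken from the text:

```python
# Illustrative sketch: the flame-time / flow-time competition expressed
# as a Karlovitz number, Ka = t_flame / t_flow. All values are assumed
# for a generic hydrocarbon/air flame, not taken from the text.

def karlovitz(flame_thickness, laminar_flame_speed, stretch_rate):
    """Ka = (delta_f / S_L) * kappa, where t_flame = delta_f / S_L and
    t_flow = 1 / kappa (kappa is the imposed stretch rate, 1/s)."""
    t_flame = flame_thickness / laminar_flame_speed
    t_flow = 1.0 / stretch_rate
    return t_flame / t_flow

# Assumed values: delta_f = 0.5 mm, S_L = 0.4 m/s, kappa = 400 1/s
ka = karlovitz(flame_thickness=5e-4, laminar_flame_speed=0.4,
               stretch_rate=400.0)
print(ka)  # dimensionless; Ka approaching O(1) signals nearing extinction
```

As the imposed stretch rate grows toward the extinction stretch rate, the flow time shrinks toward the chemical time scale, which is why extinction is a kinetics-sensitive target for mechanism validation.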

There are two (count them, two) more very critical developments that come from this partitioning of the sums of squares. First, the correlation coefficient is not just an arbitrarily chosen computation (or even concept), but, as we have seen, bears a close and fundamental relationship to the whole ANOVA concept, which is itself a very fundamental statistical operation to which data are subjected. As we have seen here, all these quantities - the standard deviation, the correlation coefficient, and the whole process of decomposing a set of data into its component parts - are very closely related to each other, because they all represent outcomes obtained from the fundamental process of partitioning the sums of squares. [Pg.479]
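The partition of the sums of squares, and the way the correlation coefficient falls out of it, can be checked numerically in a few lines; the data points below are made up for the demonstration:

```python
# Illustrative sketch of partitioning the sums of squares for a simple
# linear least-squares fit: SS_total = SS_regression + SS_residual,
# and r^2 = SS_regression / SS_total. Data are made up for the demo.

def mean(xs):
    return sum(xs) / len(xs)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

# Least-squares slope and intercept
sxx = sum((xi - mean(x)) ** 2 for xi in x)
sxy = sum((xi - mean(x)) * (yi - mean(y)) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = mean(y) - slope * mean(x)
fitted = [intercept + slope * xi for xi in x]

ss_total = sum((yi - mean(y)) ** 2 for yi in y)
ss_regression = sum((fi - mean(y)) ** 2 for fi in fitted)
ss_residual = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))

# The partition holds exactly, and r^2 falls out of it directly
assert abs(ss_total - (ss_regression + ss_residual)) < 1e-9
r_squared = ss_regression / ss_total
print(round(r_squared, 4))  # prints 0.9973
```

The assertion is the whole point: the total variation splits cleanly into a part explained by the fit and a residual part, and the squared correlation coefficient is nothing more than the explained fraction.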

Based on the fundamental dipole moment concepts of mesomeric moment and interaction moment, models have been devised to explain the enhanced optical nonlinearities of polarized conjugated molecules. The equivalent internal field (EIF) model of Oudar and Chemla relates the β of a molecule to an equivalent electric field E_R due to substituent R which biases the hyperpolarizabilities (28). In the case of donor-acceptor systems, anomalously large nonlinearities result as a consequence of contributions from intramolecular charge-transfer interaction (related to μ_int), and expressions to quantify this contribution have been obtained (29). Related treatments of this problem have appeared: one due to Levine and Bethea bearing directly on the EIF model (30), another due to Levine using spectroscopically derived substituent perturbations rather than dipole-moment-based data (31), and yet another, more empirical, treatment by Dulcic and Sauteret involving reinforcement of substituent effects (32). [Pg.64]

In recent years, two different approaches, deterministic [9,19] and stochastic [10,20], have been used with good success to model the radiation chemistry of water. Each approach gives reasonable agreement between calculated results and experimental data over a wide range of LET, from room temperature up to ca. 300°C [9,10]. There are, however, fundamental differences between the two models. The deterministic model is based on the concept of an average spur [8,9,19,23] at the end of the physicochemical stage (ca. 10⁻¹² sec), which contains the products of processes (I), (II), (III), (IV), and (V) in certain yields and spatial distributions, in thermal equilibrium with the liquid. For low LET... [Pg.335]

The waste classification system should be developed in recognition of the types of information that are available and likely to be obtainable, and it should be specified to maximize compatibility with available information consistent with maintaining the fundamental integrity of the system. Establishment of a risk-based waste classification system must begin with the existing classification systems and associated databases (e.g., toxicity of hazardous substances). These would be expanded and refined as needed. However, if the foundations of a risk-based waste classification system or its implementation involve radically new concepts or call for data that cannot feasibly be obtained, the effort will be for naught. A realistic waste classification system must use the existing base of concepts and data to achieve the desired result. [Pg.254]

Figure 3. This kinetic model for zinc in humans was based on averaged data obtained following oral and i.v. administration of Zn to 17 patients with abnormalities of taste and smell. The compartmental model used all kinetic data from Zn activity in plasma, red blood cells, urine, liver, and thigh as well as stable zinc parameters, including dietary intake, serum, and urinary concentration. The SAAM27 computer program was used to obtain the simplest set of mathematical relationships that would satisfy the data characteristics for each measurement time in the study and remain consistent with accepted concepts of zinc metabolism. Although the short physical half-life of Zn limited the data collection period, this model allowed for analysis of the rapid phases of zinc metabolism (about 10% of total body zinc) and derivation of a number of fundamental steady state...
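The kind of compartmental kinetics fitted by SAAM27 can be illustrated with a toy two-compartment tracer model; the structure and rate constants below are invented for the sketch and are not the published zinc model:

```python
# Minimal two-compartment tracer model integrated by Euler's method:
# plasma <-> tissue exchange with first-order urinary loss from plasma.
# The rate constants (1/h) and the model structure are illustrative
# assumptions, not the published zinc model fitted with SAAM27.

def simulate(k12=0.5, k21=0.1, k_urine=0.05, dt=0.01, hours=24.0):
    plasma, tissue, urine = 1.0, 0.0, 0.0  # unit tracer dose in plasma
    for _ in range(int(hours / dt)):
        d_plasma = -(k12 + k_urine) * plasma + k21 * tissue
        d_tissue = k12 * plasma - k21 * tissue
        d_urine = k_urine * plasma
        plasma += d_plasma * dt
        tissue += d_tissue * dt
        urine += d_urine * dt
    return plasma, tissue, urine

p, t, u = simulate()
# Mass balance: the tracer is only redistributed or excreted
assert abs((p + t + u) - 1.0) < 1e-6
print(round(p, 3), round(t, 3), round(u, 3))
```

A fitting program such as SAAM27 does the inverse of this sketch: it adjusts the rate constants until the simulated compartment contents match the measured activity curves in plasma, red cells, urine, liver, and thigh.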
SIM1.dat Section 1.4 Five data sets of 200 points each, generated by SIM-GAUSS: the deterministic time series (sine wave, saw tooth, base line, GC peak, and step function) have stochastic (normally distributed) noise superimposed. Use with SMOOTH to test different filter functions (filter type, window). A comparison between the (residual) standard deviations obtained using SMOOTH and HISTO (or MSD), respectively, demonstrates that the straight application of the mean/SD concept to a fundamentally unstable signal gives the wrong impression. [Pg.392]
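The warning about applying the mean/SD concept to an unstable signal is easy to reproduce. The sine-wave signal, noise level, and moving-average window below are illustrative stand-ins for the SIM-GAUSS/SMOOTH/HISTO programs, not the programs themselves:

```python
# Illustrative sketch: a deterministic sine wave with superimposed
# Gaussian noise. The SD of the raw trace is dominated by the signal,
# while the residual SD after moving-average smoothing approaches the
# true noise level. Signal, noise, and window are assumed values.
import math
import random
import statistics

random.seed(1)
n, noise_sd = 200, 0.1
clean = [math.sin(2 * math.pi * i / n) for i in range(n)]
noisy = [c + random.gauss(0.0, noise_sd) for c in clean]

def moving_average(xs, window=9):
    half = window // 2
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - half), min(len(xs), i + half + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

smooth = moving_average(noisy)
raw_sd = statistics.stdev(noisy)  # ~0.7: reflects the sine, not the noise
residual_sd = statistics.stdev([yi - si for yi, si in zip(noisy, smooth)])

print(round(raw_sd, 3), round(residual_sd, 3))
```

The raw SD says nothing about measurement precision here; only the residual SD after removing the deterministic trend comes close to the noise actually superimposed on the signal.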

In order to hold the book to a reasonable length, many of the quantitative derivations are not given in full. In all cases the reader is referred to the primary sources, where the full development is available. It is taken for granted that the reader is familiar with the basics of the various fundamental disciplines involved or knows where they are explained. References are cited for descriptive details of apparatus or procedures not given in the text, and also as sources of concepts, theories, and experimental results. Much of the data in the literature of tribology was obtained when the sophisticated equipment used in present-day experimentation was not available. Many of the concepts at the heart of the modern view of tribology are not newcomers to traditional physics and chemistry. The authors often found the older data better suited to the... [Pg.644]

Another difficulty is caused by the application of the Gibbs thermodynamic concept to non-equilibrium conditions of an adsorption layer, as is done in many theoretical models (cf. Chapter 4). From non-equilibrium thermodynamics we learn (cf. Eq. (2C.8), derived by Defay et al. (1966), Appendix 2C) that diffusional transport adds an extra term to the Gibbs equation. However, this term seems to be negligible in many experiments. The experimental data discussed in Chapter 5, obtained from measurements in very different time windows, support the validity of Gibbs' fundamental equation (2.33) also under non-equilibrium conditions. [Pg.52]
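For reference, the equilibrium relation being extended is the Gibbs adsorption equation. The form below is the standard textbook one; identifying it with the equation numbered (2.33) in the cited text is an assumption:

```latex
% Standard equilibrium Gibbs adsorption equation (assumed to be the
% relation the text numbers as Eq. (2.33)):
d\gamma = -\sum_i \Gamma_i \, d\mu_i
% For a single non-ionic surfactant at constant temperature, with
% d\mu = RT \, d\ln c, this reduces to
\Gamma = -\frac{1}{RT}\,\frac{d\gamma}{d\ln c}
% The non-equilibrium correction of Defay et al. adds a diffusional
% transport term to the first relation, which the text argues is
% negligible in many experiments.
```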




© 2024 chempedia.info