Big Chemical Encyclopedia

Chemical substances, components, reactions, process design ...


Distributed data technique

Assuming that the limitations imposed by the replicated-data approach on the maximum size calculation have been removed by the adoption of suitable distributed-data techniques, it is clear that a major bottleneck of direct... [Pg.251]

A procedure for propellants is presented by J.W. French (Ref 27), who used both optical microscopy (OM) and electron microscopy (EM) to study plastisol NC curing. He found that the cure time of plastisol NC is a logarithmic function of temperature, and a direct function of chemical composition, total available surface area, and particle size distribution. It should be noted that extensive use of statistics is required as a time-saving means of interpreting particle size distribution data. The current state of the art utilizes computer techniques to perform this function and, in addition, to obtain crystal morphology data (Ref 62)... [Pg.144]

A general method has been developed for the estimation of model parameters from experimental observations when the model relating the parameters and input variables to the output responses is a Monte Carlo simulation. The method provides point estimates as well as joint probability regions of the parameters. In comparison to methods based on analytical models, this approach can prove to be more flexible and gives the investigator a more quantitative insight into the effects of parameter values on the model. The parameter estimation technique has been applied to three examples in polymer science, all of which concern sequence distributions in polymer chains. The first is the estimation of binary reactivity ratios for the terminal or Mayo-Lewis copolymerization model from both composition and sequence distribution data. Next a procedure for discriminating between the penultimate and the terminal copolymerization models on the basis of sequence distribution data is described. Finally, the estimation of a parameter required to model the epimerization of isotactic polystyrene is discussed. [Pg.282]
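
As a rough sketch of the estimation step just described, a Monte Carlo simulation of the terminal (Mayo-Lewis) model can generate dyad sequence fractions for candidate reactivity ratios, which are then matched against measured sequence distribution data. This is an illustrative reconstruction, not the authors' code: the function names are invented here, and a simple grid search stands in for their point-estimation and joint-probability-region machinery.

```python
import random

def terminal_model_dyads(r1, r2, f1, n_adds=20000, seed=0):
    """Monte Carlo chain growth under the terminal (Mayo-Lewis) model:
    the probability of adding monomer 1 depends only on the terminal unit."""
    rng = random.Random(seed)
    f2 = 1.0 - f1
    last = 1 if rng.random() < f1 else 2
    dyads = {"11": 0, "12": 0, "21": 0, "22": 0}
    for _ in range(n_adds):
        if last == 1:
            p_add1 = r1 * f1 / (r1 * f1 + f2)   # P(next = 1 | terminal 1)
        else:
            p_add1 = f1 / (f1 + r2 * f2)        # P(next = 1 | terminal 2)
        nxt = 1 if rng.random() < p_add1 else 2
        dyads[f"{last}{nxt}"] += 1
        last = nxt
    total = sum(dyads.values())
    return {k: v / total for k, v in dyads.items()}

def estimate_ratios(observed, grid, f1=0.5):
    """Point estimate: pick the (r1, r2) pair whose simulated dyad
    fractions minimise the squared error against the observed ones."""
    def loss(r1, r2):
        sim = terminal_model_dyads(r1, r2, f1)
        return sum((sim[k] - observed[k]) ** 2 for k in observed)
    return min(((r1, r2) for r1 in grid for r2 in grid),
               key=lambda p: loss(*p))
```

Because the model output is itself a stochastic simulation, a fixed seed is used so that repeated evaluations of the loss at the same (r1, r2) are comparable.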

In this section three applications of the parameter estimation technique to problems in polymer science involving sequence distribution data are described. These problems are of varying degrees of difficulty and each serves to point out different aspects of the method. [Pg.283]

We have presented applications of a parameter estimation technique based on Monte Carlo simulation to problems in polymer science involving sequence distribution data. In comparison to approaches involving analytic functions, Monte Carlo simulation often leads to a simpler solution of a model, particularly when the process being modelled involves a prominent stochastic component. [Pg.293]

A. Thielemans, P.J. Lewi and D.L. Massart, Similarities and differences among multivariate display techniques by Belgian Cancer Mortality Distribution data. Chemom. Intell. Lab. Syst., 3 (1988) 277-300. [Pg.206]

An autoregressive time series model (16) seems to be less suitable for cumulative distribution data. This technique is primarily designed for finding trends and/or cycles for data recorded in a time sequence, under the null-hypothesis that the sequence has no effect. [Pg.275]

Tavare and Garside ( ) developed a method that uses the time evolution of the CSD in a seeded isothermal batch crystallizer to estimate both growth and nucleation kinetics. In this method, a distinction is made between the seed (S) crystals and those which have nucleated (N crystals). The moment transformation of the population balance model is used to represent the N crystals. A supersaturation balance is written in terms of both the N and S crystals. Experimental size distribution data are used along with a parameter estimation technique to obtain the kinetic constants. The parameter estimation involves a Laplace transform of the experimentally determined size distribution data followed by a linear least-squares analysis. Depending on the form of the nucleation equation employed, four, six, or eight parameters will be estimated. A nonlinear method of parameter estimation employing desupersaturation curve data has been developed by Witkowski et al. (S5). [Pg.10]
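
Two ingredients named above can be sketched in a few lines: the moment transformation of an experimental CSD, and a linear least-squares fit of log-linearized kinetics. This is a hedged illustration, not Tavare and Garside's actual Laplace-transform procedure; the power-law nucleation form B = k_b·s^b and all function names are assumptions made here for the example.

```python
import numpy as np

def csd_moments(L, n, jmax=3):
    """Moments mu_j = integral of L^j * n(L) dL of a number-density CSD,
    evaluated by the trapezoidal rule from experimental (L, n) data."""
    moments = []
    for j in range(jmax + 1):
        y = n * L**j
        moments.append(float(np.sum((y[1:] + y[:-1]) * np.diff(L)) / 2.0))
    return moments

def fit_nucleation(s, B):
    """Linear least-squares fit of assumed power-law kinetics B = k_b * s^b:
    taking logs gives ln B = ln k_b + b * ln s, a straight line."""
    b, ln_kb = np.polyfit(np.log(s), np.log(B), 1)
    return float(np.exp(ln_kb)), float(b)
```

The low-order moments carry direct physical meaning (mu_0 is the total number of crystals per unit volume, mu_3 is proportional to the suspended mass), which is why the moment transformation is a natural bridge between measured CSDs and the population balance.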

Nonparametric technique A statistical technique that does not depend for its validity upon the assumption that the data were drawn from a specific distribution, such as the normal or lognormal. A distribution-free technique. [Pg.181]
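
A concrete example of such a distribution-free statistic (chosen here for illustration; it is not mentioned in the entry above) is the Mann-Whitney U, which depends only on the ordering of the observations:

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic: counts the pairs in which an x-value
    exceeds a y-value (ties count 1/2). Its validity depends only on
    the ordering of the data, not on any assumed parent distribution
    (normal, lognormal, ...)."""
    u = 0.0
    for a in x:
        for b in y:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u
```

Because U depends only on ranks, it is unchanged by any monotone transform of the data (e.g. taking logs), which is exactly why such techniques sidestep the normal-versus-lognormal question.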

Optical Methods. Optical methods, based on the scattering of light by dispersed droplets, provide a relatively simple and rapid measure of particle size. However, optical techniques give data concerning the average drop size or the predominant size only, and size-distribution data cannot be obtained. Optical methods are more suited to the size analysis of aerosols and extremely fine mists than to the analysis of typical fuel sprays. [Pg.160]

The fundamental assumption of the WA technique is that the weighted average of a taxon represents the conditions for which this taxon is most abundant (Figure 8 shows typical distribution data). This optimum condition (see Figure 5) for each taxon can be calculated as the average of mean values for the environmental characteristics (e.g., water chemistry) at the sites in which it is found, weighted by the abundance of the taxon at the sites (Figure 9), namely... [Pg.23]
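
The weighted-average optimum described above reduces to a one-line computation; this is a minimal sketch with invented names, assuming abundances and environmental values are given site by site:

```python
def wa_optimum(abundances, env_values):
    """Weighted-average (WA) optimum of a taxon: the mean of an
    environmental variable (e.g. water chemistry) over the sites where
    the taxon occurs, weighted by its abundance at each site."""
    num = sum(y * x for y, x in zip(abundances, env_values))
    den = sum(abundances)
    return num / den
```

When all abundances are equal, the optimum collapses to the plain mean of the site values, consistent with the interpretation of WA as an abundance-weighted average.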

Fig. 10.34 The illustration of the technique to change the strain distribution data to the equivalent temperature distribution data.

Typical pore size distribution data are shown in Fig. 5, where the integral penetration of mercury into the pores is plotted as a function of applied pressure. The calculated pore diameters in angstroms are shown across the top. The integral curve clearly shows a bimodal pore distribution with mean pore diameters at 20,000 Å and 50 Å. The latter is at the lower limit of the technique. A nitrogen desorption isotherm is required to obtain an accurate measure in the region below 100 Å. [Pg.108]
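
The pressure-to-diameter conversion behind such plots is conventionally the Washburn equation; a minimal sketch follows, assuming typical mercury values (surface tension 0.485 N/m, contact angle 140°), which are common defaults rather than values taken from the excerpt above:

```python
import math

def washburn_diameter(P_pa, gamma=0.485, theta_deg=140.0):
    """Washburn equation: d = -4 * gamma * cos(theta) / P.
    For mercury (non-wetting, theta > 90 deg) cos(theta) is negative,
    so the diameter comes out positive. P in Pa, d in metres."""
    return -4.0 * gamma * math.cos(math.radians(theta_deg)) / P_pa

# With these constants, intruding 50 Angstrom (5e-9 m) pores requires
# roughly 3e8 Pa (~43,000 psi), which is why such sizes sit at the
# lower limit of mercury porosimetry.
```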

Measurement time varied from under 1 minute for narrow distributions to several minutes for broad distributions. Data analysis times varied from seconds for narrow distributions to a few minutes for broad distributions. All measurements were analyzed using the technique of cumulants (8) and, for broad distributions, a modified version of exponential sampling theory (EST) (9). [Pg.49]
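
The cumulants analysis mentioned above is, in its standard second-order form, a quadratic least-squares fit to the logarithm of the field correlation function; the sketch below assumes that standard form and does not reproduce the exact formulation of Ref 8.

```python
import numpy as np

def cumulant_fit(tau, g1):
    """Second-order cumulant analysis of a DLS field correlation function:
    ln g1(tau) ~ a0 - Gamma * tau + (mu2 / 2) * tau**2.
    Returns the mean decay rate Gamma and the polydispersity index
    mu2 / Gamma**2 (near zero for a narrow distribution)."""
    c2, c1, _ = np.polyfit(np.asarray(tau), np.log(np.asarray(g1)), 2)
    gamma = -c1
    mu2 = 2.0 * c2
    return gamma, mu2 / gamma**2
```

For a narrow (single-exponential) distribution the quadratic term vanishes and the fit reduces to a straight line, consistent with the fast analysis times quoted for narrow distributions.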

The mass versus particle size distribution of several polymer latices with diameters in the range of 30 nm to 1500 nm was determined in less than 20 minutes using an integrated hydrodynamic chromatograph. The distributions obtained were compared with those found by other particle sizing techniques, such as electron microscopy, to verify the validity of the technique. The instrument employed was able to analyze latices reproducibly with different optical properties, even though some of the injected particles may have been trapped within the column. Latex properties were correlated with particle size distribution data to illustrate the benefit of this particle sizing technique. [Pg.256]

RBDOPT is coded in an object-oriented environment (C++) and uses distributed computing techniques to speed up the calculations. The thermophysical properties are estimated using the Physical Property Data Service (PPDS). PPDS supports over 900 components and 36 physical property routes, including NRTL, SRK, UNIFAC and UNIQUAC. The general structure of RBDOPT is shown in Figure 9.13. RBDOPT defines a batch distillation column as an object. Procedures... [Pg.289]

Experimental techniques commonly used to measure pore size distribution, such as mercury porosimetry or BET analysis (Gregg and Sing, 1982), yield pore size distribution data that are not uniquely related to the pore space morphology. They are generated by interpreting mercury intrusion-extrusion or sorption hysteresis curves on the basis of an equivalent cylindrical pore assumption. To make direct comparison with digitally reconstructed porous media possible, morphology characterization methods based on simulated mercury porosimetry or simulated capillary condensation (Stepanek et al., 1999) should be used. [Pg.145]

For now, the bimodal distribution may be an artifact. The two lifetimes can be considered as lower and upper bounds of the pore size distribution. This technique is the only one available that can provide non-destructive depth profiles without sample preparation other than mounting the samples in the vacuum system. Depth-profiled lifetime data are currently being collected. This is practical due to the high data acquisition rate of 10³ to 10⁴ lifetime events per second, depending on the implantation depth. [Pg.201]

The molecular size distributions and the size-distribution profiles for the nickel-, vanadium-, and sulfur-containing molecules in the asphaltenes and maltenes from six petroleum residua were determined using analytical and preparative scale gel permeation chromatography (GPC). The size distribution data were useful in understanding several aspects of residuum processing. A comparison of the molecular size distributions to the pore-size distribution of a small-pore desulfurization catalyst showed the importance of the catalyst pore size in efficient residuum desulfurization. In addition, differences between size distributions of the sulfur- and metal-containing molecules for the residua examined helped to explain reported variations in demetallation and desulfurization selectivities. Finally, the GPC technique also was used to monitor effects of both thermal and catalytic processing on the asphaltene size distributions. [Pg.139]

An alternative is the use of an optical method to measure particulate concentrations and size distributions. This technique has the obvious advantage of having a negligible effect on the particulates, since the equipment is external to the exhaust system. An optical method also has the potential to be much simpler to use, since it eliminates the need for elaborate and cumbersome systems containing probes, stack samplers, flow development tunnels, filters, and heat exchangers. In addition, final data from an optical system can be obtained immediately in electronic form, as opposed to weighing the various filters of a particle impactor by hand; as such, the optical analyzer is a real-time instrument capable of following exhaust gas fluctuations and other nonsteady effects. [Pg.200]



© 2024 chempedia.info