Big Chemical Encyclopedia


Parameter analysis normalization

Although standard enthalpies of formation provide information about the net stability of molecules and their transformations, they do not always indicate stability of individual bonds. This analysis normally involves parameters, loosely called bond energies, that reflect the amount of energy required to cleave chemical bonds. [Pg.58]

The multivariate techniques that reveal underlying factors, such as principal component factor analysis (PCA), soft independent modeling of class analogy (SIMCA), partial least squares (PLS), and cluster analysis, work optimally if each measurement or parameter is normally distributed in the measurement space. Frequency histograms should be calculated to check the normality of the data to be analyzed. Skewed distributions are often observed in atmospheric studies due to the mixing of plumes with ambient air. [Pg.36]
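A normality check of this kind can be sketched with the sample skewness: strongly right-skewed data (as produced by plume mixing) often become near-normal after a log transform. The data below are simulated, not from the study cited:

```python
import math
import random

def skewness(xs):
    """Sample skewness: the third standardized moment.
    Near zero for normally distributed data; large and positive
    for right-skewed data."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    return sum(((x - mean) / sd) ** 3 for x in xs) / n

random.seed(0)
# Simulated pollutant concentrations: mixing of plumes with ambient
# air tends to produce a right-skewed, roughly lognormal distribution.
raw = [math.exp(random.gauss(0.0, 1.0)) for _ in range(5000)]
logged = [math.log(x) for x in raw]

print(f"skewness raw:    {skewness(raw):.2f}")     # strongly right-skewed
print(f"skewness logged: {skewness(logged):.2f}")  # near zero
```

A frequency histogram of `raw` versus `logged` would show the same contrast visually; the skewness statistic is simply a compact numerical summary of it.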

Current methods for supervised pattern recognition are numerous. Typical linear methods are linear discriminant analysis (LDA), based on distance calculations; soft independent modeling of class analogy (SIMCA), which emphasizes similarities within a class; and PLS discriminant analysis (PLS-DA), which performs regression between spectra and class memberships. More advanced methods are based on nonlinear techniques, such as neural networks. A further distinction is between parametric and nonparametric computations. In parametric techniques such as LDA, statistical parameters of the normal sample distribution are used in the decision rules. Such restrictions do not apply to nonparametric methods such as SIMCA, which perform more efficiently on NIR data collections. [Pg.398]
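The parametric idea behind LDA can be illustrated in one dimension: assign a sample to the class whose mean is closer in units of the pooled standard deviation. This is only a sketch with invented NIR-like values (real LDA operates on multivariate spectra with a pooled covariance matrix):

```python
import math

def lda_1d(train_a, train_b, x):
    """Minimal one-dimensional linear discriminant: assign x to the
    class whose mean is nearer in pooled-standard-deviation units.
    Assumes equal priors and equal class variances -- the parametric
    normality assumptions that LDA relies on."""
    ma = sum(train_a) / len(train_a)
    mb = sum(train_b) / len(train_b)
    def var(xs, m):
        return sum((v - m) ** 2 for v in xs) / (len(xs) - 1)
    # Pooled variance across both training classes
    na, nb = len(train_a), len(train_b)
    sp = math.sqrt(((na - 1) * var(train_a, ma) +
                    (nb - 1) * var(train_b, mb)) / (na + nb - 2))
    return "A" if abs(x - ma) / sp < abs(x - mb) / sp else "B"

# Hypothetical absorbance values for two sample classes
class_a = [0.10, 0.12, 0.11, 0.13]
class_b = [0.30, 0.28, 0.31, 0.29]
print(lda_1d(class_a, class_b, 0.14))  # -> A
print(lda_1d(class_a, class_b, 0.27))  # -> B
```

A nonparametric method such as SIMCA would instead model each class by its own principal components and classify by residual distance, without assuming a shared normal distribution.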

Based on their chemical structure, the organic chemicals were divided into a number of categories: alkanes, alkenes, amines, aromatic hydrocarbons, benzenes, carboxylic acids, halides, phenols, and sulfonic acids. Linear regression analysis was applied using the method of least squares. Each correlation required at least three data points, and the parameters were chosen to ensure comparable experimental conditions. The most important parameters for normalizing oxidation rate constants for QSAR analysis are the overall liquid volume used in the treatment system, the UV light source, the reactor type, and specific data on substrate concentration, temperature, and solution pH during the experiment. [Pg.270]
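The least-squares fit mentioned above has a simple closed form. The descriptor and rate-constant values below are invented for illustration (four points, satisfying the three-point minimum the text requires):

```python
def least_squares(xs, ys):
    """Ordinary least-squares fit of y = a + b*x (closed form)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a = (sy - b * sx) / n                          # intercept
    return a, b

# Hypothetical QSAR data: log of a normalized oxidation rate constant
# against a molecular descriptor.
x = [1.0, 2.0, 3.0, 4.0]
y = [0.9, 2.1, 2.9, 4.1]
a, b = least_squares(x, y)
print(f"intercept={a:.2f} slope={b:.2f}")  # intercept=-0.10 slope=1.04
```

Normalizing all rate constants to the same liquid volume, light source, reactor type, and solution conditions before the fit is what makes the slope and intercept comparable across studies.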

To illustrate more clearly the effect of these variables on analysis time, reduced parameters can be used for the plate height and velocity. Reduced parameters effectively normalize the plate height and velocity for the particle diameter and the diffusion coefficient to produce dimensionless parameters that allow comparison of different columns and separation conditions. The reduced plate height and reduced velocity are expressed, respectively, as... [Pg.772]
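The standard definitions are the reduced plate height h = H/d_p and the reduced velocity ν = u·d_p/D_m, which is how the normalization described above is computed. The numerical values below are illustrative, not from the text:

```python
def reduced_parameters(H, u, dp, Dm):
    """Reduced (dimensionless) chromatographic parameters:
    reduced plate height h = H/dp and reduced velocity nu = u*dp/Dm,
    normalizing for particle diameter dp and diffusion coefficient Dm."""
    return H / dp, u * dp / Dm

# Illustrative values (assumed): 10 um plate height, 2 mm/s linear
# velocity, 5 um particles, 1e-9 m^2/s diffusion coefficient.
h, nu = reduced_parameters(H=10e-6, u=2e-3, dp=5e-6, Dm=1e-9)
print(f"h = {h:.1f}, nu = {nu:.1f}")  # h = 2.0, nu = 10.0
```

Because h and ν are dimensionless, a well-packed 5 µm column and a well-packed 2 µm column can be compared on the same h-versus-ν curve.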

The analysis is carried out using the Drude equations; this leads to a combination of the ellipsometric thickness and the refractive index increment. These characteristics of the adsorbate cannot be unambiguously separated. Conversion of the refractive index increment into the composition of the adsorbate layer is usually done by assuming dn/dx to be the same as in a fluid of composition x; for coverages that are not too high this is usually allowed, but problems may arise when the adsorbate differs substantially from the solution, for instance because of alignment of adsorbed chain molecules. The result obtained is not unique, in the sense that different profiles may lead to the same pair of ellipsometric parameters. Therefore, normally only total adsorbed amounts are presented. For accurate measurements a good optical contrast between adsorbate and solution is mandatory. [Pg.203]

The intensity of the emission is directly proportional to the analyte concentration, provided that all instrumental parameters remain constant throughout the analysis. Normally, however, several factors may change slightly, reducing the accuracy of the measurement, such as variations in the sample matrix or sample viscosity... [Pg.426]
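A common remedy for such drift (not described in the snippet itself, but standard practice in emission spectrometry) is internal standardization: ratio the analyte signal to the signal of an internal-standard element, so that instrumental fluctuations affecting both lines cancel. All intensities and concentrations below are hypothetical:

```python
def corrected_concentration(I_analyte, I_std, c_cal, I_analyte_cal, I_std_cal):
    """Analyte concentration from the analyte/internal-standard
    intensity ratio, referenced to a single calibration point.
    Assumes emission intensity proportional to concentration and
    drift affecting both lines equally."""
    ratio_sample = I_analyte / I_std
    ratio_cal = I_analyte_cal / I_std_cal
    return c_cal * ratio_sample / ratio_cal

# Calibration: a 10 ppm standard gave an intensity ratio of 2.0.
# The sample gives a ratio of 1.0, so it contains half as much analyte.
c = corrected_concentration(I_analyte=4000, I_std=4000,
                            c_cal=10.0, I_analyte_cal=8000, I_std_cal=4000)
print(f"{c:.1f} ppm")  # -> 5.0 ppm
```

Even if nebulizer efficiency drops and both raw intensities fall by, say, 20 %, the ratio, and hence the reported concentration, is unchanged.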

On the basis of the aforementioned inaccuracy of the input data for the probabilistic analysis of loss of integrity of reinforced concrete containment structures, the mean values and standard deviations of the variable parameters were determined for normal and lognormal distributions. Using the RSM simulation method, the vector of the deformation parameters r is defined for simulation in the form... [Pg.1310]

This paper presents a new approach based on a combination of traditional predictive modelling and event/fault tree analysis techniques, which makes it possible to represent simultaneously the evolution of hazards and the normal and abnormal (i.e., failure) performance of safety measures (e.g., variations of process parameters, analyses, and inspections) through the food chain, for a better estimate of the real impact of such deviations/failures on consumer health. [Pg.1746]

Global kinetics, however, allowed calculation of some parameters in normal volunteers, i.e., fractional catabolic rate, rate of synthesis, and mean residence time, determined by mathematical analysis of both plasma decay curves and urinary excretion rates. Such studies demonstrated different metabolisms for Apo C-I, C-II, and C-III. They may supply essential information on the perturbations observed in pathology. [Pg.46]
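As a simplified sketch of how such parameters follow from a decay curve (the cited study used full multi-compartment analysis; a single-exponential model and the numbers below are assumptions for illustration): for plasma decay C(t) = C0·exp(−k·t), the fractional catabolic rate equals k and the mean residence time is 1/k.

```python
import math

def fcr_and_mrt(t1, c1, t2, c2):
    """Fractional catabolic rate k and mean residence time 1/k from
    two points on a mono-exponential plasma decay curve
    C(t) = C0 * exp(-k*t). A deliberate simplification of the
    multi-exponential analysis used in real turnover studies."""
    k = math.log(c1 / c2) / (t2 - t1)
    return k, 1.0 / k

# Hypothetical tracer data: concentration falls from 100 to ~36.8
# (i.e., by a factor of e) over 10 hours.
k, mrt = fcr_and_mrt(t1=0.0, c1=100.0, t2=10.0, c2=36.79)
print(f"FCR = {k:.3f} /h, MRT = {mrt:.1f} h")  # FCR = 0.100 /h, MRT = 10.0 h
```

At steady state the synthesis rate then follows as pool size times the fractional catabolic rate.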

Parameter-Driven Analysis Normal Conditions, Contingencies, Controls... [Pg.712]

Attrition normally cannot be investigated directly in a large-scale process. It is, for example, impossible to analyze the entire bulk of material, and it is nearly impossible, or at least very expensive, to perform a parameter analysis in a running industrial process. For this reason, attrition has to be investigated in small-scale experiments. The results of these experiments require a model, or at least an idea of the governing attrition mechanisms, to be applied to the large-scale process. In principle, there are two different philosophies of attrition modeling ... [Pg.220]

The big step forward in this method is the realization that the flaw propagation analysis can be made to include the residual strength term discussed in Section 5.4 and, in so doing, the microcrack size is replaced by the easily determined parameter, the normal load on the indenter. [Pg.264]

Physical methods of analysis normally involve a measurement of a physical parameter other than mass or volume. For example, a water sample suspected of being polluted with hexavalent chromium can be injected into an inductively coupled plasma atomic emission spectrometer (ICP/AES), and the intensity of light given off by the very hot chromium atoms emitted by the sample is measured to give the chromium concentration. Or fluoride in a water sample can be determined by measuring the potential, versus a reference electrode, of a fluoride ion-selective electrode immersed in the sample and comparing that value with the potential measured in a standard F− solution to give the value of [F−]. [Pg.512]
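The fluoride-electrode comparison follows the Nernst equation for a monovalent anion, E = const − S·log₁₀[F⁻] with S = 2.303·RT/F (about 59.2 mV per decade at 25 °C). The potentials and standard concentration below are invented for illustration:

```python
import math

def fluoride_conc(E_sample, E_std, c_std, T=298.15):
    """Fluoride concentration (M) from the potential difference (V)
    between sample and standard readings of a fluoride ion-selective
    electrode. For an anion the potential *decreases* as concentration
    increases, hence the (E_std - E_sample) sign."""
    R, F = 8.314, 96485.0
    S = 2.303 * R * T / F  # Nernst slope, ~0.0592 V/decade at 25 C
    return c_std * 10 ** ((E_std - E_sample) / S)

# Hypothetical readings: the sample reads 59.2 mV more positive than
# a 1.0e-4 M standard, i.e., one decade less fluoride.
c = fluoride_conc(E_sample=0.0592, E_std=0.0, c_std=1.0e-4)
print(f"[F-] = {c:.1e} M")
```

Note that the result depends only on the potential *difference* between sample and standard, which is why the reference-electrode and junction potentials cancel.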

Then, there are notable differences in the reproducibility of measurements. At the HMDE, this parameter commonly is below 1 % rel., whereas similar experiments with a MFE vary in an interval of 3-5 % rel. Of course, in this case, such values are still excellent if one considers typical levels of precision for measurements in trace analysis (normally, with recovery rates within 90-110 % as stated in numerous scientific papers). [Pg.88]

We can imagine measuring experimental curves equivalent to those in Fig. 9.11 by, say, scanning the length of the diffusion apparatus by some optical method for analysis after a known diffusion time. Such results are then interpreted by rewriting Eq. (9.85) in the form of the normal distribution function, P(z) dz. This is accomplished by defining a parameter z such that... [Pg.631]

The degree of data spread around the mean value may be quantified using the concept of standard deviation, σ. If the distribution of data points for a certain parameter has a Gaussian or normal distribution, the probability of normally distributed data lying within ±σ of the mean value is 0.6826 or 68.26%. There is a 68.26% probability of obtaining a certain parameter within X ± σ, where X is the mean value. In other words, the standard deviation, σ, represents a distance from the mean value, in both positive and negative directions, such that the number of data points between X − σ and X + σ is 68.26% of the total data points. Detailed descriptions of statistical analysis using the Gaussian distribution can be found in standard statistics reference books (11). [Pg.489]
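The 68.26% figure can be verified directly: the probability that a normal variate falls within k standard deviations of the mean is erf(k/√2).

```python
import math

def prob_within(k):
    """Probability that a normally distributed value lies within
    k standard deviations of the mean: erf(k / sqrt(2))."""
    return math.erf(k / math.sqrt(2.0))

print(f"within 1 sigma: {prob_within(1):.4f}")  # 0.6827
print(f"within 2 sigma: {prob_within(2):.4f}")  # 0.9545
print(f"within 3 sigma: {prob_within(3):.4f}")  # 0.9973
```

These are the familiar 68-95-99.7 values; the text's 68.26% is prob_within(1) quoted to two decimal places.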

Statistical Criteria. Sensitivity analysis does not consider the probability of various levels of uncertainty or the risk involved (28). In order to treat probability, statistical measures are employed to characterize the probability distributions. Because most distributions in profitability analysis are not accurately known, the common assumption is that normal distributions are adequate. The distribution of a quantity then can be characterized by two parameters, the expected value and the variance. These usually have to be estimated from meager data. [Pg.451]

The above assumes that the measurement statistics are known. This is rarely the case. Typically a normal distribution is assumed for the plant and the measurements. Since these distributions are used in the analysis of the data, an incorrect assumption will lead to further bias in the resultant troubleshooting, model, and parameter estimation conclusions. [Pg.2561]

G_n(L) is often difficult to determine for a given load distribution, but when n is large, an approximation is given by the Maximum Extreme Value Type I distribution of the maximum extremes, with a scale parameter, θ, and location parameter, v. When the initial loading stress distribution, f(L), is modelled by a Normal, Lognormal, 2-parameter Weibull, or 3-parameter Weibull distribution, the extremal model parameters can be determined by the equations in Table 4.11. These equations include terms for the number of load applications, n. The extremal model for the loading stress can then be used in the SSI analysis to determine the reliability. [Pg.183]
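The Table 4.11 equations are not reproduced in the snippet, so the sketch below estimates the Type I (Gumbel) parameters empirically instead: simulate the maximum of n normally distributed load applications many times, then apply the method of moments (Gumbel scale θ = √6·s/π, location v = mean − γ·θ with Euler's constant γ ≈ 0.5772). The load distribution N(100, 10) and n = 1000 are assumed values:

```python
import math
import random

random.seed(1)
n, trials = 1000, 2000  # n load applications per lifetime, 2000 lifetimes

# Maximum load seen over n applications, repeated for many lifetimes.
maxima = [max(random.gauss(100.0, 10.0) for _ in range(n))
          for _ in range(trials)]

# Method-of-moments fit of the Type I (Gumbel) extremal model.
m = sum(maxima) / trials
s = math.sqrt(sum((x - m) ** 2 for x in maxima) / (trials - 1))
theta = math.sqrt(6.0) * s / math.pi  # scale parameter
v = m - 0.5772 * theta                # location parameter
print(f"location v = {v:.1f}, scale theta = {theta:.2f}")
```

The fitted location sits well above the 100-unit mean load, which is the point of the extremal model: reliability in the SSI analysis must be assessed against the distribution of the *worst* load, not the typical one.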

Because the loading stress function as an output from the variance analysis is characterized by a Normal distribution, it is advantageous to estimate the parameters for a 3-parameter Weibull distribution, x0 and θ, given the mean, μ, and standard deviation, σ, for a Normal distribution (assuming β = 3.44), where ... [Pg.375]
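The equations themselves are truncated in the snippet, so the following is a sketch of one standard way to do this matching: equate the Weibull mean μ = x0 + θ·Γ(1+1/β) and standard deviation σ = θ·√(Γ(1+2/β) − Γ(1+1/β)²) to the Normal's μ and σ, with the shape fixed at β = 3.44 (a Weibull with β ≈ 3.44 closely resembles a Normal). The numeric μ and σ are illustrative:

```python
import math

def weibull3_from_normal(mu, sigma, beta=3.44):
    """Estimate the location x0 and scale theta of a 3-parameter
    Weibull distribution that matches a Normal's mean and standard
    deviation, with the shape beta held fixed (beta ~= 3.44 makes the
    Weibull shape close to Normal). Moment-matching sketch, not the
    (truncated) equations from the source."""
    g1 = math.gamma(1.0 + 1.0 / beta)
    g2 = math.gamma(1.0 + 2.0 / beta)
    theta = sigma / math.sqrt(g2 - g1 * g1)  # scale from the std. dev.
    x0 = mu - theta * g1                     # location from the mean
    return x0, theta

# Illustrative loading stress: mean 400, standard deviation 25.
x0, theta = weibull3_from_normal(mu=400.0, sigma=25.0)
print(f"x0 = {x0:.1f}, theta = {theta:.1f}")
```

By construction, substituting the returned x0 and θ back into the Weibull moment formulas reproduces the original μ and σ exactly.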

