Big Chemical Encyclopedia

Data scaling methods

Before the final calibration model can be determined, the number of factors to retain must be ascertained. The common data scaling methods of mean centering... [Pg.41]
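
Mean centering, the first of the common scaling methods named above, subtracts each variable's (column's) mean so that subsequent factor analysis models variation about zero. A minimal NumPy sketch (the function name is illustrative, not from the source):

```python
import numpy as np

def mean_center(X):
    """Subtract each column's mean so every variable varies about zero."""
    return X - X.mean(axis=0)

X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])
Xc = mean_center(X)  # each column of Xc now has mean 0
```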

The production of the agrochemical 6 (Scheme 5.7) is carried out batchwise via a three-step protocol. Mass balancing has been conducted for three stages of development: laboratory, pilot, and operational scale. An LCA was available for the operational stage only. A description of this LCA, including data sources and data acquisition methods, was published by Geisler et al. (product A in reference [9] corresponds to product 6 here). Many parameters in the Life-Cycle Inventory (LCI) are estimated, especially utility demands and yields of processes for the production of precursors. Uncertainty in these estimations was illustrated in a... [Pg.215]

In order to extrapolate laboratory animal results to humans, an interspecies dose conversion must be performed. Animals such as rodents differ from humans in physical dimensions, rates of intake (ingestion or inhalation), and lifespan, and are therefore expected to respond differently to a specified dose level of any chemical. Estimation of equivalent human doses is usually performed by scaling laboratory doses according to observable species differences. Unfortunately, detailed quantitative data on the comparative pharmacokinetics of animals and humans are nonexistent, so scaling methods remain approximate. In carcinogenic risk extrapolation, it is commonly assumed that the rate of response for mammals is proportional to internal surface area... [Pg.299]
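
The surface-area assumption above can be sketched numerically: since surface area scales roughly as body weight to the 2/3 power, dose per unit body weight scales as weight to the -1/3 power. The helper below is a hypothetical illustration of this rule of thumb; the function name, the 70 kg default human weight, and the example rat weight are assumptions, not values from the source.

```python
def human_equivalent_dose(animal_dose, animal_weight_kg, human_weight_kg=70.0):
    """Convert an animal dose (mg/kg body weight) to an approximate
    human-equivalent dose via the surface-area assumption:
    surface area ~ weight**(2/3), so dose per kg scales as weight**(-1/3).
    Illustrative sketch only; regulatory conversions use tabulated factors."""
    return animal_dose * (animal_weight_kg / human_weight_kg) ** (1.0 / 3.0)

# A 10 mg/kg dose in a 0.25 kg rat maps to roughly 1.5 mg/kg in a 70 kg human.
hed = human_equivalent_dose(10.0, 0.25)
```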

An alternative approach to the calculation of accurate thermochemical data is to scale the computed correlation energy with multiplicative parameters determined by fitting to the experimental data. Pioneering methods using such an approach include the scaling all correlation (SAC) method of Gordon and Truhlar [32], the parameterized correlation (PCI-X) method of Siegbahn et al. [33], and the multi-coefficient correlation methods (MCCM) of Truhlar et al. [34-36]. Such methods can be used... [Pg.77]

Data analysis methods depend upon the level of order in the sample. The degree of order, in turn, depends upon the scale of distance on which the sample is viewed. For example, casein micelles show great variation in size (20 to 300 nm diameter) and so must be treated as a polydisperse system. However, the density variations ("submicelles") within the whole micelle are much more uniform in size. They can be treated as a quasi-monodisperse system (Stothart and Cebula, 1982) and analyzed in terms of inter-particle interference (Stothart, 1989). [Pg.207]

Figure 4.2 demonstrates the instanton trajectories at different temperatures for C = 0.5, Ω = 0.5, n = 2. For temperatures close to Tc the trajectory runs near the saddle point, and it deviates from the saddle point with increasing β. The Hamiltonian (4.29) with n = 1 has recently been studied numerically within the complex scaling method [Hontscha et al. 1990]. Using these data we can estimate the accuracy of the instanton... [Pg.106]

One common microarray data normalization method is to calculate a normalization factor on a per-array basis or across an entire experiment. The primary assumption behind a single normalization factor is that the volume of labeled sample is comparable across the two channels. Because a large population of labeled cDNAs is present in that uniform volume, it is assumed that the same number of labeled cDNAs exists in both samples; ideally, the overall intensity in the two channels will then be the same. Furthermore, any increase in some labeled cDNAs, due to increases in mRNA, must be offset by decreases in other labeled cDNAs. Typical methods include mean- or median-centering, where the mean/median values are centered within the data distribution, and z-score normalization, which adds a scaling factor to mean-centering. [Pg.539]
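
As a rough sketch of the per-array methods named above, assuming log-scale intensities held in NumPy arrays (the function names are illustrative, not from the source):

```python
import numpy as np

def median_center(log_intensity):
    """Median-centering: subtract the array-wide median so each
    channel's intensity distribution is centred at zero."""
    return log_intensity - np.median(log_intensity)

def z_score(log_intensity):
    """Z-score normalization: mean-centre, then divide by the standard
    deviation (the additional scaling factor mentioned in the text)."""
    return (log_intensity - log_intensity.mean()) / log_intensity.std()

channel = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
mc = median_center(channel)  # median of mc is 0
zs = z_score(channel)        # mean 0, standard deviation 1
```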

Vital et al. (131) present an extensive tabulation of tray efficiency data collected from the published literature. Data interpolation is one of the more reliable methods for obtaining tray efficiency, provided the data are good and the rules recommended for data scale-up (Secs. 7.3.6 and 7.3.7) are followed. [Pg.378]

Ordered scores on scales with a limited number of possible values are notorious for producing patterns that are wildly non-normal, and we tend to avoid analysing such data with methods for which a normal distribution is a prerequisite. [Pg.31]

Standardisation is another common method for data scaling: after mean centring, each variable is also divided by its standard deviation ... [Pg.213]
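
A short NumPy sketch of this two-step operation (mean-centre each column, then divide by its standard deviation), with an illustrative function name:

```python
import numpy as np

def standardise(X):
    """Autoscaling: mean-centre each column of X, then divide each
    column by its own standard deviation."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

X = np.array([[1.0, 100.0],
              [2.0, 300.0],
              [3.0, 200.0]])
Xs = standardise(X)  # every column now has mean 0 and standard deviation 1
```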

There are three methods for data scaling, as in PCA, but the relevant column means and standard deviations are always obtained from the training set. If there is a test set, the training-set parameters are used to scale it, so the test set is unlikely to be exactly mean centred or standardised. Similar scaling is performed on both the c and r blocks simultaneously. If you want to apply other forms of scaling (such as summing rows to a constant total), this can be performed manually in Excel and PCA... [Pg.453]
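
The train-then-apply logic described here can be sketched as follows: the column means and standard deviations are computed from the training set only and then reused, unchanged, on the test set (function names and example values are illustrative):

```python
import numpy as np

def fit_scaling(X_train):
    """Learn scaling parameters (column means and standard deviations)
    from the training set only."""
    return X_train.mean(axis=0), X_train.std(axis=0)

def apply_scaling(X, means, stds):
    """Scale any block with the training-set parameters. A test set
    scaled this way will in general NOT end up exactly mean-centred
    or standardised, as the text notes."""
    return (X - means) / stds

X_train = np.array([[1.0, 4.0], [2.0, 6.0], [3.0, 8.0]])
X_test = np.array([[4.0, 10.0]])
means, stds = fit_scaling(X_train)
X_train_s = apply_scaling(X_train, means, stds)  # mean 0, std 1 by construction
X_test_s = apply_scaling(X_test, means, stds)    # generally non-zero mean
```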






