Statistical transformation

The raw concentration data were subjected to several mathematical and statistical transformations. The ratio of the element of interest to Fe helps offset inherent variation in Fe across the data set. A log10 transform is a standard statistical conversion for elemental data; it reduces the weighting effect of the wide span from very small to very large concentrations in the data (48). More details on the calculations and reasoning behind these transforms can be found elsewhere (5). [Pg.492]
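As a rough illustration of these two transforms, here is a minimal sketch in Python with NumPy; the element, the concentration values, and the units are invented for the example, not taken from the cited study.

```python
import numpy as np

# Hypothetical raw concentrations (ppm) for an element of interest and Fe,
# one value per specimen; real values would come from the cited data set.
element = np.array([12.0, 350.0, 4.7, 89.0, 1500.0])
fe = np.array([41000.0, 52000.0, 38000.0, 47000.0, 61000.0])

# Ratio to Fe offsets inherent variation in Fe across the data set.
ratio = element / fe

# log10 transform compresses the span from very small to very large
# concentrations so no single magnitude range dominates later statistics.
log_ratio = np.log10(ratio)

print(log_ratio)
```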

A number of statistical transformations have been proposed to quantify the distributions in pitting variables. Gumbel is credited with the original development of extreme value statistics (EVS) for the characterization of pit depth distribution [13]. The EVS procedure is to measure the maximum pit depth on each of several replicate specimens that have pitted, then arrange the pit depth values in order of increasing rank. The Gumbel distribution expressed in Eq 1, where λ and α are the location and scale parameters, respectively, can then be used to characterize the data set and estimate the extreme pit depth that could affect the system from which the data were produced. [Pg.94]
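A minimal sketch of this procedure, assuming SciPy's gumbel_r for the type I extreme value distribution (CDF F(x) = exp(−exp(−(x − λ)/α))); the pit depths are invented, and the 99th percentile is an arbitrary choice of extreme quantile for illustration.

```python
import numpy as np
from scipy.stats import gumbel_r

# Hypothetical maximum pit depths (um), one per replicate specimen.
max_pit_depths = np.array([118, 132, 125, 151, 140, 129, 163, 137, 146, 122])

# Fit the Gumbel (extreme value type I) distribution:
# F(x) = exp(-exp(-(x - loc)/scale)), with loc = lambda and scale = alpha.
loc, scale = gumbel_r.fit(max_pit_depths)

# Estimate the pit depth exceeded with only 1% probability.
extreme_depth = gumbel_r.ppf(0.99, loc=loc, scale=scale)
print(f"lambda = {loc:.1f} um, alpha = {scale:.1f} um, "
      f"99th-percentile pit depth = {extreme_depth:.1f} um")
```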

Mehra, J., and H. Rechenberg. 2000. From molecular theory to quantum chemistry. In The historical development of quantum theory. Vol. 6, The completion of quantum mechanics, 1926-1941. Part 1, The probability interpretation and the statistical transformation theory, the physical interpretation, and the empirical and mathematical foundations of quantum mechanics, 1926-1932. New York: Springer-Verlag. [Pg.315]

Note also that I have used the past tense to describe this evaluation process. Now my students punch numbers into high-tech computer programs that allow for multiple statistical transformations. Still, I recommend the earlier, straightforward, low-tech approach over our current protocol. [Pg.436]

First, a statistical standard of the defect-free section is established by introducing a certain lower threshold derived from the measured data. In the classical variant of the shadow USD method, the fluctuations of the received signal are measured on a defect-free product, and a threshold is set in each of the 512 directions in proportion to the corresponding dispersion of the US signal in each of the 128 sections. After the threshold is introduced, the signal is transformed into binary form. This adaptive threshold is thus one of the distinctive features of the proposed USCT IT. [Pg.249]
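A hedged sketch of such an adaptive threshold in NumPy: the array shapes follow the 512 directions and 128 sections mentioned above, but the signal model, the proportionality factor k, and all values are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration data: received US amplitudes measured on a
# defect-free product over many repeated shots, for 512 directions and
# 128 sections (shapes from the text; values are invented).
calibration = rng.normal(loc=1.0, scale=0.05, size=(200, 512, 128))

# Per-direction, per-section dispersion of the signal on the good product.
mean = calibration.mean(axis=0)
sigma = calibration.std(axis=0)

# Adaptive lower threshold: proportional to the local dispersion.
k = 4.0                          # proportionality factor (an assumption)
threshold = mean - k * sigma

# Binarize a new measurement: 1 where the signal drops below threshold
# (possible shadowing by a defect), 0 elsewhere.
measurement = rng.normal(loc=1.0, scale=0.05, size=(512, 128))
binary = (measurement < threshold).astype(np.uint8)
print(binary.sum(), "cells flagged")
```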

A structure descriptor is a mathematical representation of a molecule, resulting from a procedure that transforms the structural information encoded within a symbolic representation of the molecule. This mathematical representation has to be invariant to the molecule's size and number of atoms to allow model building with statistical methods and artificial neural networks. [Pg.403]

Molecules are usually represented as 2D formulas or 3D molecular models. While the 3D coordinates of atoms in a molecule are sufficient to describe the spatial arrangement of atoms, they exhibit two major disadvantages as molecular descriptors: they depend on the size of the molecule, and they do not describe additional properties (e.g., atomic properties). The first feature is most important for computational analysis of data. Even a simple statistical function, e.g., a correlation, requires the information to be represented in equally sized vectors of a fixed dimension. The solution to this problem is a mathematical transformation of the Cartesian coordinates of a molecule into a vector of fixed length. The second point can... [Pg.515]
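One common way to perform such a transformation is a radial-distribution-style descriptor, sketched below. This is an illustrative choice, not necessarily the method the excerpt goes on to describe; the toy geometries and the parameters n_bins, r_max, and beta are invented.

```python
import numpy as np

def rdf_descriptor(coords, n_bins=32, r_max=8.0, beta=10.0):
    """Map an (N, 3) array of Cartesian coordinates to a fixed-length
    vector by smoothing all interatomic distances onto a radial grid.
    The output length (n_bins) is independent of the number of atoms."""
    coords = np.asarray(coords, dtype=float)
    # All pairwise distances (upper triangle, i < j).
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(len(coords), k=1)
    d = dist[iu]
    # Gaussian-smoothed radial distribution sampled at fixed radii.
    r = np.linspace(0.0, r_max, n_bins)
    return np.exp(-beta * (r[:, None] - d[None, :]) ** 2).sum(axis=1)

# Two molecules of different size map to vectors of identical length.
water = [[0.0, 0.0, 0.0], [0.96, 0.0, 0.0], [-0.24, 0.93, 0.0]]
methane = [[0.0, 0.0, 0.0], [1.09, 0.0, 0.0], [-0.36, 1.03, 0.0],
           [-0.36, -0.51, 0.89], [-0.36, -0.51, -0.89]]
print(rdf_descriptor(water).shape, rdf_descriptor(methane).shape)  # (32,) (32,)
```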

The raw data collected during the experiment are then analyzed. Frequently the data must be reduced or transformed to a more readily analyzable form. A statistical treatment of the data is used to evaluate the accuracy and precision of the analysis and to validate the procedure. These results are compared with the criteria established during the design of the experiment, and then the design is reconsidered, additional experimental trials are run, or a solution to the problem is proposed. When a solution is proposed, the results are subject to an external evaluation that may result in a new problem and the beginning of a new analytical cycle. [Pg.6]
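As a minimal example of such a statistical treatment, the following sketch computes the mean, sample standard deviation, and a 95% confidence interval for a set of hypothetical replicate results; the values and units are invented.

```python
import numpy as np
from scipy import stats

# Hypothetical replicate results for one analyte (e.g., %w/w).
replicates = np.array([3.08, 3.12, 3.10, 3.05, 3.11])

mean = replicates.mean()
s = replicates.std(ddof=1)        # sample standard deviation (precision)
n = len(replicates)

# 95% confidence interval on the mean using Student's t.
t = stats.t.ppf(0.975, df=n - 1)
half_width = t * s / np.sqrt(n)
print(f"mean = {mean:.3f} +/- {half_width:.3f} (95% CI), s = {s:.3f}")
```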

A molten metal alloy would normally be expected to crystallize into one or several phases. To form an amorphous, i.e., glassy, metal alloy from the liquid state means that the crystallization step must be avoided during solidification. This can be understood by considering a time-temperature-transformation (TTT) diagram (Fig. 2). Nucleating phases require an incubation time to assemble atoms through a statistical process into the correct crystal structure... [Pg.334]

The generation of photons obeys Poisson statistics, where the variance is N and the deviation, or noise, is √N. The noise spectral density, N(f), is obtained by a Fourier transform of the deviation, yielding the following at sampling frequency... [Pg.422]
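A quick numerical check of the variance = N, noise = √N relation, using simulated Poisson photon counts; the mean count N and the sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated photon counts with mean arrival number N per sample interval.
N = 10_000
counts = rng.poisson(lam=N, size=100_000)

# For Poisson statistics the variance equals N, so the noise (standard
# deviation) is sqrt(N).
print(f"measured std = {counts.std():.1f}, sqrt(N) = {np.sqrt(N):.1f}")
```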

Another consideration when using the approach is the assumption that stress and strength are statistically independent; in practical applications, however, this is usually to be expected (Disney et al., 1968). The random variables in the design are assumed to be independent, linear, and near-Normal if they are to be used effectively in the variance equation. A high correlation between the random variables, or the use of non-Normal distributions in the stress governing function, is a common source of non-linearity, and transformation methods should then be considered. [Pg.191]
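To make the independence and normality assumptions concrete, here is a hedged sketch comparing the closed-form reliability for independent Normal stress and strength with a Monte Carlo estimate; all means and standard deviations are invented.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Assumed independent, Normal stress and strength (units arbitrary).
mu_stress, sd_stress = 300.0, 30.0
mu_strength, sd_strength = 400.0, 40.0

# Closed form for independent Normal variables:
# R = Phi((mu_S - mu_s) / sqrt(sd_S**2 + sd_s**2)).
z = (mu_strength - mu_stress) / np.hypot(sd_strength, sd_stress)
print(f"analytic reliability R = {norm.cdf(z):.4f}")

# Monte Carlo check; sampling the two variables independently mirrors
# the assumption that stress and strength are statistically independent.
stress = rng.normal(mu_stress, sd_stress, 1_000_000)
strength = rng.normal(mu_strength, sd_strength, 1_000_000)
print(f"simulated reliability R = {(strength > stress).mean():.4f}")
```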

Metallurgists originally, and now materials scientists (as well as solid-state chemists), have used crystallographic methods, certainly, for the determination of the structures of intermetallic compounds, but also for such subsidiary parepistemes as the study of the orientation relationships involved in phase transformations, and the study of preferred orientations, alias texture (statistically preferential alignment of the crystal axes of the individual grains in a polycrystalline assembly); however, those who pursue such concerns are not members of the aristocracy. The study of texture, both by X-ray diffraction and by computer simulation, has become a huge sub-subsidiary field, very recently marked by the publication of a major book (Kocks et al. 1998). [Pg.177]

Finally, before constructing H[ρ+(r), ρ−(r)] we can note that we have introduced a field-theoretic approach on a heuristic basis where the fields have a clear physical meaning. For the point-particle Coulomb gas there is a rigorous transformation of the usual statistical mechanics to a field-theoretic formulation in which, however, the field has no apparent physical meaning (see, e.g., [23,24]). [Pg.808]

All data recorded in the data base have been acquired from plant records. Statistical reductions of data for generation of reports or specific end use are available. Data are currently collected from four operating plants (eight units). Time clocks have been installed on components, to record actual exposure time. Event data are available on a broad variety of safety and commercial grade components including pumps, valves, transformers, diesels, filters, tanks (vessels), and heat exchangers. [Pg.70]

In mathematics, Laplace s name is most often associated with the Laplace transform, a technique for solving differential equations. Laplace transforms are an often-used mathematical tool of engineers and scientists. In probability theory he invented many techniques for calculating the probabilities of events, and he applied them not only to the usual problems of games but also to problems of civic interest such as population statistics, mortality, and annuities, as well as testimony and verdicts. [Pg.702]

Many, possibly all, rules appear to generate asymptotic states which are block-related to configurations evolving according to one of only a small subset of the set of all rules, members of which are left invariant under all block transformations. That is, the infinite time behavior appears to be determined by evolution towards fixed point rule behavior, and the statistical properties of all CA rules can then, in principle, be determined directly from the appropriate block transformations necessary to reach a particular fixed point rule. [Pg.67]
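The block-transformation analysis itself is beyond a short example, but the underlying idea of asymptotic statistical behavior can be illustrated by evolving an elementary CA and watching a simple statistic (the density of 1-sites) settle; the rule number, lattice size, and run length are arbitrary choices for the sketch.

```python
import numpy as np

def step(state, rule=90):
    """One synchronous update of an elementary CA with periodic boundaries."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    idx = 4 * left + 2 * state + right       # 3-bit neighborhood code, 0..7
    table = (rule >> np.arange(8)) & 1       # the rule's lookup table
    return table[idx]

rng = np.random.default_rng(3)
state = rng.integers(0, 2, size=4096)

# Track the density of 1-sites as a crude statistical property of the
# evolving configuration; it settles quickly toward an asymptotic value.
for t in range(200):
    state = step(state)
print(f"asymptotic density ~ {state.mean():.3f}")
```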

These examples then suggest that any general and fundamental models of communication systems (at least for digital data) should emphasize the size of the alphabets concerned and the probabilities of these letters, and should be relatively unconcerned with other characteristics of the letters. An appropriate model for this purpose consists of a random process in place of the source, a transformation on the samples of the random process for the coder, a random process at the output of the channel depending statistically on the input to the channel, and a transformation in the decoder. We are, of course, interested in knowing what transformations to use in the coder and decoder to make the decoder output as faithful a replica of the source output as possible. [Pg.193]
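A toy instance of this model, assuming a Bernoulli source, a 3x repetition code as the coder transformation, a binary symmetric channel whose output depends statistically on its input, and a majority-vote decoder; the crossover probability p is invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Source: a random process emitting equiprobable binary letters.
bits = rng.integers(0, 2, size=10_000)

# Coder: a transformation on the source samples (3x repetition code).
coded = np.repeat(bits, 3)

# Channel: output depends statistically on input (bits flip with prob p).
p = 0.05
received = coded ^ (rng.random(coded.size) < p)

# Decoder: a transformation back toward a faithful replica (majority vote).
decoded = (received.reshape(-1, 3).sum(axis=1) >= 2).astype(int)

print(f"raw channel error rate = {p:.3f}")
print(f"decoded bit error rate = {(decoded != bits).mean():.4f}")
```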

