Big Chemical Encyclopedia


Uncorrelated variables

Principal Component Analysis (PCA) transforms a number of correlated variables into a smaller number of uncorrelated variables, the so-called principal components. [Pg.481]
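As a minimal numpy sketch of this idea (synthetic data, not from any of the cited sources): PCA can be carried out by diagonalizing the covariance matrix of mean-centered data, and the resulting scores are uncorrelated by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two strongly correlated variables (hypothetical example data)
x1 = rng.normal(size=500)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=500)
X = np.column_stack([x1, x2])

# PCA: eigendecomposition of the covariance matrix of the centered data
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]            # sort eigenvalues descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = Xc @ eigvecs                        # principal component scores

# The scores are uncorrelated: their covariance matrix is diagonal
off_diag = np.cov(scores, rowvar=False)[0, 1]
```

The off-diagonal covariance of the scores is zero up to floating-point error, even though the input variables were almost perfectly correlated.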

Equation 2.7-18 for the system variance is equation 2.7-21. Since the covariance of uncorrelated variables is zero, this becomes equation 2.7-23 (i.e., the system variance is the sum of the component variances). [Pg.58]
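The statement that the system variance reduces to the sum of the component variances when the covariance term vanishes can be checked numerically. This is a small sketch with hypothetical independent (hence uncorrelated) components, not the book's equations themselves:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two independent components: variances 4 and 9
a = rng.normal(0.0, 2.0, size=200_000)
b = rng.normal(0.0, 3.0, size=200_000)

# Var(a + b) = Var(a) + Var(b) + 2 Cov(a, b); the covariance term
# is (near) zero for uncorrelated variables, so the variances add.
var_sum = np.var(a + b)
var_components = np.var(a) + np.var(b)   # both close to 13
```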

If the system distribution is the product of the component distributions (equation 2.7-23), the mean is given by equation 2.7-24 which, for uncorrelated variables, becomes equation 2.7-25... [Pg.58]

In this way, a discrimination problem involving two variables represented in two dimensions is reduced to one dimension by means of a new variable dv = f(x1, x2). All reductions of dimensionality, from m dimensions down to graphically presentable three or two dimensions, follow this principle, whereby uncorrelated variables are generated. [Pg.255]

The goal of factor analysis (FA) and its essential variant, principal component analysis (PCA), is to describe the structure of a data set by means of new uncorrelated variables, so-called common factors or principal components. These factors frequently characterize underlying real effects that can be interpreted in a meaningful way. [Pg.264]

This is the complete factor solution, which admittedly contains uncorrelated variables, but all k factors are extracted completely and no reduction... [Pg.264]

Equation 41-A3 can be checked by expanding the last term, collecting terms, and verifying that all the terms of equation 41-A2 are regenerated. The third term in equation 41-A3 is a quantity called the covariance between A and B. The covariance is a quantity related to the correlation coefficient. Since the differences from the mean are randomly positive and negative, the product of the two differences from their respective means is also randomly positive and negative, and tends to cancel when summed. Therefore, for independent random variables the covariance is zero, since the correlation coefficient is zero for uncorrelated variables. In fact, the mathematical definition of uncorrelated is that this sum-of-cross-products term is zero. Therefore, since A and B are random, uncorrelated variables ... [Pg.232]
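The sum-of-cross-products definition of the covariance described above can be illustrated directly. In this sketch (hypothetical independent samples, numpy only), the cross products are randomly positive and negative and nearly cancel when averaged:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=100_000)
B = rng.normal(size=100_000)   # generated independently of A

# Sum-of-cross-products (mean of cross products) definition of covariance
cov_AB = np.mean((A - A.mean()) * (B - B.mean()))
corr_AB = cov_AB / (A.std() * B.std())   # correlation coefficient
```

For truly independent samples the empirical covariance is not exactly zero, but it shrinks toward zero as 1/sqrt(n).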

A complication arises. We learn from considerations of multiple regression analysis that when two (or more) variables are correlated, the standard errors of both variables are increased over what would be obtained if equivalent but uncorrelated variables were used. This is discussed by Daniel and Wood (see p. 55 in [9]), who show that the variance of the estimates of coefficients (their standard errors) is increased by a factor of... [Pg.444]
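The inflation factor itself is elided in the excerpt, but for the two-predictor case the standard result is the variance inflation factor 1/(1 - r^2), where r is the correlation between the predictors. A numpy sketch with hypothetical correlated predictors (this factor is the textbook formula, not a reconstruction of the Daniel and Wood passage):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)   # correlated with x1, unit variance

r = np.corrcoef(x1, x2)[0, 1]              # about 0.8 here
vif = 1.0 / (1.0 - r**2)                   # variance inflation factor
```

With r around 0.8 the coefficient variances are inflated by roughly a factor of 2.8; for uncorrelated predictors (r = 0) the factor is 1 and nothing is lost.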

That is, a set of correlated variables r is transformed into a new set of uncorrelated variables p. [Pg.239]

Total inverse produces cumbersome sets of equations, especially when errors are taken into account. As usual, not considering errors amounts to taking uncorrelated variables with unit variances, but offers attractive illustrative properties. Examples of application abound in the literature, but they do not usually give enough detail for the student to use them as practical illustrative references. For these reasons, a simple illustration with no errors will be presented; readers interested in a complete treatment should refer to Tarantola (1987). [Pg.310]

If PCA is used for dimension reduction and the creation of uncorrelated variables, the optimum number of components is crucial. This value can be estimated from a scree plot showing the accumulated variance of the scores as a function of the number of components used. More laborious but safer methods use cross-validation or bootstrap techniques. [Pg.114]
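The scree-plot criterion can be sketched as follows with numpy on synthetic data (two hypothetical latent factors plus noise; the 95% variance cutoff is an illustrative choice, not one taken from the source):

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical data: 5 variables driven by 2 latent factors plus small noise
latent = rng.normal(size=(300, 2))
loadings = rng.normal(size=(2, 5))
X = latent @ loadings + 0.1 * rng.normal(size=(300, 5))

Xc = X - X.mean(axis=0)
eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]  # descending

# Accumulated proportion of variance, as read off a scree plot
cum_var = np.cumsum(eigvals) / eigvals.sum()
n_components = int(np.searchsorted(cum_var, 0.95) + 1)
```

Because only two latent factors generated the data, the accumulated variance curve flattens sharply after the second component.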

Figure 11. When the ratios on each axis have no common isotope and in the absence of mass-dependent fractionation, they are statistically independent. These data represent a run of 40 cycles in which two Zn isotope ratios were measured. The horizontal array (uncorrelated variables) indicates that, in the present case, the variability of the isotopic ratios can be ascribed to counting statistics and not to mass-dependent fractionation. Data acquired using the VG Plasma 54 of Lyon.

Objects do not fall exactly into the inner model space, and a residual error on each variable can be computed. These residuals are uncorrelated variables, because each significant correlation is retained in the linear model. So the variance of the residuals is a chi-square variable, the SIMCA distance, and, multiplied by a suitable coefficient obtained from the F distribution, it fixes the boundary of the class space around the model, called the SIMCA box, which corresponds to the confidence hyperellipsoid of the Bayesian method. Objects, both those used and those not used to obtain the... [Pg.123]

What is desired is a logical, systematic procedure to generate a small set of uncorrelated variables which allow direct comparison, make chemical sense, and are amenable to easy interpretation. A mathematical procedure which meets these criteria is the use of factor analysis (9, 11, 12, 13) to generate statistically independent new variables (factors), followed by correlation of toxicity with the generated factors. [Pg.641]

In employing principal components as our regression factors, we have succeeded in fully utilizing all the measured variables and in developing new, uncorrelated variables. In selecting which eigenvectors to use, the first employed... [Pg.196]

Transform measured variables x to unit-variance uncorrelated variables z using Eq. 3.13. PCA can accomplish this transformation. [Pg.45]

A useful method to test the overall impact a subject has on the parameters is to first perform principal component analysis (PCA) on the estimated model parameters. PCA is a multivariate statistical method the object of which is to take a set of p variables, X1, X2, ..., Xp = X, and find linear functions of X to produce a new set of uncorrelated variables Z1, Z2, ..., Zp such that Z1 contains the largest amount of variability, Z2 contains the second largest, etc. [Pg.257]

Unless otherwise specified, the discussion in the rest of this chapter will concern only uncorrelated variables. Therefore, Eqs. 2.81 and 2.84 will be used. The reader, however, should always keep in mind the assumption under which Eq. [Pg.56]

Examples of error propagation formulas for many common functions are given in this section. In all cases, uncorrelated variables are assumed. [Pg.56]
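Two of the most common such formulas, for a sum and for a product of uncorrelated variables, can be checked by simulation. This is an illustrative numpy sketch with hypothetical measurement uncertainties, not the section's worked examples:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200_000
# Uncorrelated measurements with known standard deviations (hypothetical)
x = rng.normal(10.0, 0.3, size=n)
y = rng.normal(4.0, 0.2, size=n)

# Sum: sigma_f^2 = sigma_x^2 + sigma_y^2
sigma_sum = np.sqrt(0.3**2 + 0.2**2)               # about 0.361

# Product: (sigma_f / f)^2 = (sigma_x / x)^2 + (sigma_y / y)^2
sigma_prod = 40.0 * np.sqrt((0.3 / 10.0)**2 + (0.2 / 4.0)**2)  # about 2.33
```

The simulated standard deviations of x + y and x * y agree with these propagated values; for the product the formula is a first-order approximation, accurate here because the relative errors are small.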

