Big Chemical Encyclopedia


Independent variables, multivariate

For example, the objects may be chemical compounds. The individual components of a data vector are called features and may, for example, be molecular descriptors (see Chapter 8) specifying the chemical structure of an object. For statistical data analysis, these objects and features are represented by a matrix X which has a row for each object and a column for each feature. In addition, each object will have one or more properties that are to be investigated, e.g., a biological activity of the structure or a class membership. This property or these properties are merged into a matrix Y. Thus, the data matrix X contains the independent variables, whereas the matrix Y contains the dependent ones. Figure 9-3 shows a typical multivariate data matrix. [Pg.443]
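The row/column convention above can be made concrete with a small numpy sketch; the compounds, descriptor values, and property values here are made up purely for illustration:

```python
import numpy as np

# Hypothetical data: 4 compounds (objects) described by 3 molecular
# descriptors (features); each compound has one measured property.
X = np.array([[1.2, 0.5, 3.1],    # rows = objects (compounds)
              [0.8, 1.1, 2.9],    # columns = features (descriptors)
              [1.5, 0.9, 3.3],
              [0.7, 1.4, 2.5]])
Y = np.array([[0.61],             # one dependent property per object,
              [0.75],             # e.g. a biological activity
              [0.58],
              [0.90]])

# Each row of X pairs with the corresponding row of Y.
assert X.shape[0] == Y.shape[0]   # same number of objects
```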

In multivariate least squares analysis, the dependent variable is a function of two or more independent variables. Because matrices are so conveniently handled by computer and because the mathematical formalism is simpler, multivariate analysis will be developed as a topic in matrix algebra rather than conventional algebra. [Pg.80]

We have already seen the normal equations in matrix form. In the multivariate case, there are as many slope parameters as there are independent variables and there is one intercept. The simplest multivariate problem is that in which there are only two independent variables and the intercept is zero... [Pg.80]
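The simplest case described above (two independent variables, zero intercept) can be solved directly from the normal equations in matrix form, (XᵀX)b = Xᵀy. A minimal numpy sketch with synthetic data (the true slopes 2 and 3 are assumptions of the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2*x1 + 3*x2 with a little noise, no intercept.
X = rng.uniform(0.0, 1.0, size=(50, 2))          # two independent variables
y = X @ np.array([2.0, 3.0]) + rng.normal(0.0, 0.01, size=50)

# Normal equations in matrix form: (X^T X) b = X^T y
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)   # slope estimates, close to [2, 3]
```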

Partial least squares (PLS) projections to latent structures [40] is a multivariate data analysis tool that has gained much attention during the past decade, especially after the introduction of the 3D-QSAR method CoMFA [41]. PLS is a projection technique that uses latent variables (linear combinations of the original variables) to construct multidimensional projections while focusing on explaining as much as possible of the information in the dependent variable (in this case intestinal absorption) rather than among the descriptors used to describe the compounds under investigation (the independent variables). PLS differs from MLR in a number of ways (apart from point 1 in Section 16.5.1) ... [Pg.399]
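The latent-variable idea can be sketched with a minimal PLS1 (NIPALS-style) implementation for a single dependent variable; this is an illustrative sketch, not a production algorithm, and the data and coefficient vector are invented for the example:

```python
import numpy as np

def pls1(X, y, n_components):
    """Minimal PLS1 (NIPALS) sketch for one dependent variable.
    Returns a regression vector b so that y_hat = X @ b
    (data assumed mean-centred). Illustrative only."""
    X = X.copy()
    y = y.copy().astype(float)
    n, p = X.shape
    W = np.zeros((p, n_components))   # weight vectors
    P = np.zeros((p, n_components))   # X loadings
    q = np.zeros(n_components)        # y loadings
    for a in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)        # latent-variable weights
        t = X @ w                     # score: linear combination of X
        tt = t @ t
        p_a = X.T @ t / tt
        q[a] = y @ t / tt
        X -= np.outer(t, p_a)         # deflate X
        y -= t * q[a]                 # deflate y
        W[:, a], P[:, a] = w, p_a
    return W @ np.linalg.solve(P.T @ W, q)

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 5))
X -= X.mean(axis=0)                   # mean-centre, as is usual in PLS
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 0.0])
b = pls1(X, y, n_components=3)
y_hat = X @ b
```

Note how each weight vector w is built from Xᵀy, so the projection is steered by the dependent variable, unlike PCA.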

Another class of methods of unidimensional minimization locates a point x̃ near x*, the value of the independent variable corresponding to the minimum of f(x), by extrapolation and interpolation using polynomial approximations as models of f(x). Both quadratic and cubic approximations have been proposed, using function values only and using both function and derivative values. For functions where f'(x) is continuous, these methods are much more efficient than other methods and are now widely used to do line searches within multivariable optimizers. [Pg.166]
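One step of the quadratic (function-values-only) variant can be sketched as follows: fit a parabola through three points and jump to its vertex. The test function here is an assumption of the example; because it is itself quadratic, a single step lands exactly on the minimizer:

```python
def quadratic_min_step(f, x0, x1, x2):
    """One step of quadratic interpolation: fit a parabola through
    three points and return the abscissa of its vertex
    (standard three-point formula)."""
    f0, f1, f2 = f(x0), f(x1), f(x2)
    num = (x1 - x0)**2 * (f1 - f2) - (x1 - x2)**2 * (f1 - f0)
    den = (x1 - x0) * (f1 - f2) - (x1 - x2) * (f1 - f0)
    return x1 - 0.5 * num / den

# Example: f(x) = (x - 2)^2 + 1 is exactly quadratic, so one
# interpolation step returns the minimiser x* = 2.
f = lambda x: (x - 2.0)**2 + 1.0
x_star = quadratic_min_step(f, 0.0, 1.0, 3.0)
```

In a real line search this step is repeated, replacing the worst of the three bracketing points, until the interval is small enough.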

Principal component analysis (PCA) is aimed at explaining the covariance structure of multivariate data through a reduction of the whole data set to a smaller number of independent variables. We assume that an m-point sample is represented by the n×m matrix X which collects i = 1, ..., m observations (measurements) xi of a column-vector x with j = 1, ..., n elements (e.g., the measurements of n = 10 oxide weight percents in m = 50 rocks). Let x̄ be the mean vector and Sx the n×n covariance matrix of this sample... [Pg.237]
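The covariance-eigendecomposition route to PCA can be sketched in numpy; the data here are synthetic (one dominant latent factor plus small noise, loadings invented for the example), using the row-per-observation layout common in numerical code:

```python
import numpy as np

rng = np.random.default_rng(2)

# m = 50 samples of n = 3 correlated variables
# (e.g. oxide weight percents in rocks).
m, n = 50, 3
latent = rng.normal(size=(m, 1))
X = latent @ np.array([[1.0, 0.8, -0.5]]) + 0.05 * rng.normal(size=(m, n))

x_mean = X.mean(axis=0)               # mean vector
S = np.cov(X, rowvar=False)           # n x n covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)  # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]     # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = (X - x_mean) @ eigvecs       # projections onto the PCs
explained = eigvals / eigvals.sum()   # variance fraction per component
```

With one strong latent factor, the first principal component captures nearly all of the variance, which is exactly the "reduction to a smaller number of independent variables" described above.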

We now proceed to m observations. The ith observation provides the estimates xij of the independent variables Xj and the estimate yi of the dependent variable Y. The n estimates xij of the variables Xj provided by this ith observation are lumped together into the vector xi. We assume that the set of the (n + 1) data (xi, yi) associated with the ith observation represents unbiased estimates of the mean of a random (n + 1)-vector distributed as a multivariate normal distribution. The unbiased character of the estimates is equivalent to... [Pg.294]

Empirical Modeling. The effect of process variables on the rate of deposition and properties of electrolessly deposited metals is usually studied by one-factor-at-a-time experiments (one-factor experiments are discussed further later in the book). In these experiments the effect of a single variable (factor), such as x1, in the multivariable process with the response y, y = f(x1, x2, x3, ..., xn), is studied by varying the value (level) of this variable while holding the values of the other independent variables fixed. Any prediction (extrapolation) of the effect of a single variable on... [Pg.160]
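A one-factor-at-a-time experiment is easy to sketch; the response function below is a made-up stand-in (including an x1·x2 interaction term) to show why the apparent effect of x1 depends on where the other variables were fixed:

```python
import numpy as np

# Hypothetical process response; the true function (unknown to the
# experimenter) contains an x1*x2 interaction.
def response(x1, x2, x3):
    return 2.0 * x1 + x2 + 0.5 * x1 * x2 - 0.1 * x3

# One-factor-at-a-time: vary x1 while x2 and x3 are held fixed.
x2_fixed, x3_fixed = 1.0, 5.0
levels = np.linspace(0.0, 2.0, 5)
ofat = [(x1, response(x1, x2_fixed, x3_fixed)) for x1 in levels]

# Apparent slope with respect to x1 is 2.0 + 0.5*x2_fixed = 2.5;
# it changes with the chosen x2 level, so OFAT cannot separate the
# main effect of x1 from the x1*x2 interaction.
slope = (ofat[-1][1] - ofat[0][1]) / (levels[-1] - levels[0])
```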

Experimental data with the independent variables L and I can be plotted, producing a 3-dimensional binding isotherm, and a multivariate regression analysis gives the association constants KSL and KSI as well as the mobility of the corresponding complexes. [Pg.50]

Multivariate source apportionment models have been developed for two fractions of respirable particulate organic matter which together constitute about 90% of the total organic solvent-extractable mass. The independent variables used for developing the models were trace metals, water-soluble sulfate and meteorological variables. Two of the three POM fractions extracted sequentially with cyclohexane (CYC), dichloromethane (DCM) and acetone (ACE) were used as individual dependent variables. [Pg.217]

In a k-component mixture, k − 1 independent variables are present. The multivariate distribution will then be formed using the variances of the k − 1... [Pg.164]

It would now be most logical to let this probability between a and b be the RC, but in the case of more than one independent variable with a multivariate error distribution it is a very complicated problem to calculate an almost always asymmetrical part of this distribution. To handle this problem the... [Pg.174]

T. Steerneman and A. Ronner, Testing independence in multivariate linear regression when the number of variables increases. Internal Report, Economics Institute, University of Groningen, 1984. [Pg.341]

PLS falls in the category of multivariate data analysis whereby the X-matrix containing the independent variables is related to the Y-matrix, containing the dependent variables, through a process where the variance in the Y-matrix influences the calculation of the components (latent variables) of the X-block and vice versa. It is important that the number of latent variables is correct so that overfitting of the model is avoided; this can be achieved by cross-validation. The relevance of each variable in the PLS method is judged by the modelling power, which indicates how much the variable participates in the model. A value close to zero indicates an irrelevant variable which may be deleted. [Pg.103]

The term factor is a catch-all for the concept of an identifiable property of a system whose quantity value might have some effect on the response. Factor tends to be used synonymously with the terms variable and parameter, although each of these terms has a special meaning in some branches of science. In factor analysis, a multivariate method that decomposes a data matrix to identify independent variables that can reconstitute the observed data, the term latent variable or latent factor is used to identify factors of the model that are composites of input variables. A latent factor may not exist outside the mathematical model, and it might not therefore influence... [Pg.69]

This section introduces the regression theory that is needed for the establishment of the calibration models in the forthcoming sections and chapters. The multivariate linear models considered in this chapter relate several independent variables (x) to one dependent variable (y) in the form of a first-order polynomial ... [Pg.164]
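A first-order polynomial in several independent variables, y = b0 + b1·x1 + ... + bn·xn, can be fitted by least squares after prepending a column of ones for the intercept; the coefficients below (1.5, 2, −1) are assumptions of the example:

```python
import numpy as np

rng = np.random.default_rng(3)

# y = 1.5 + 2*x1 - 1*x2 + noise: a first-order polynomial in two
# independent variables with an intercept term.
X = rng.uniform(size=(30, 2))
y = 1.5 + X @ np.array([2.0, -1.0]) + rng.normal(0.0, 0.01, size=30)

A = np.column_stack([np.ones(len(X)), X])     # prepend intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # [b0, b1, b2]
```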

Experimental data and calculated data for the photodegradation of chlorophenols were compared using multivariate analysis and SPSS statistical software. A linear expression was developed using the independent variables Kow and σres and the dependent variable log k/k0. Calculated values for log k/k0 were obtained from the following linear equation ... [Pg.374]

According to the number n of included independent variables x1, ..., xn (influencing quantities, features, factors, ...) we distinguish one-factorial designs (n = 1) and multifactorial designs (n > 1). According to the number m of recorded response features y1, ..., ym we will get univariate results (m = 1) or multivariate results (m > 1). Therefore we can arrange our numerical results in data matrices with k lines, the experiments, and m... [Pg.71]
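A multifactorial design's experiment list is easy to enumerate; the sketch below builds a two-level full factorial design for three factors (the factor count and the coded −1/+1 levels are assumptions of the example):

```python
from itertools import product

# Two-level full factorial design for n = 3 factors: every combination
# of low (-1) and high (+1) levels, giving k = 2**3 = 8 experiments.
factors = 3
design = list(product([-1, 1], repeat=factors))

# Each tuple is one line (experiment) of the design matrix.
for run, levels in enumerate(design, start=1):
    print(run, levels)
```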

Dependencies are mainly investigated by multiple or multivariate regression (one dependent and several independent variables), by multidimensional multivariate regression or partial least squares regression (several dependent and several independent variables), and by the method of simultaneous equations (explicitly allowing for corre-... [Pg.139]

Such analysis demands many cases in the series, similar to multivariate model computations. Now a multiple regression analysis with the independent variables jan, feb, mar, ..., summ and the number variable num, and the nitrate time series as the dependent variable is started. [Pg.221]
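The design described above, monthly dummy variables plus a running month number, can be sketched with synthetic data; the seasonal shape, trend of −0.02 per month, and noise level are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical monthly nitrate series over 6 years: a seasonal
# pattern plus a slow linear downward trend plus noise.
years, months = 6, 12
num = np.arange(years * months)                  # running month number
season = np.tile(np.sin(2 * np.pi * np.arange(months) / months), years)
nitrate = (10.0 + 2.0 * season - 0.02 * num
           + rng.normal(0.0, 0.1, years * months))

# Design matrix: 12 monthly dummy variables (jan .. dec) plus the
# number variable num; the dummies absorb the intercept.
dummies = np.tile(np.eye(months), (years, 1))
A = np.column_stack([dummies, num])
coef, *_ = np.linalg.lstsq(A, nitrate, rcond=None)

trend = coef[-1]    # estimated change per month, close to -0.02
```

The dummy coefficients recover the monthly means while `num` carries the long-term trend, separating seasonality from drift in one regression.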

An awareness of the influences of various factors on the biodiesel will give a clear picture of the possible independent variables which can be manipulated to improve or optimize production (the dependent variables). Among dozens of possible candidates, use of some is limited by consumer habit or by the unavailability of the required tool, while others may give a higher priority to those variables that are convenient to adjust, and leave those that demand more time and effort for later study. After screening, the remaining few may thus form the basis for a focused study. However, optimization is not easy even with only two or three variables. RSM has been developed particularly for the optimization of sophisticated multivariable systems where the quantitative relationship between key variables is not always clear, as is the case with a complex operation such as transesterification. RSM allows simultaneous consideration of more than one variable at different levels, and of the corresponding interactions between these variables, on the basis of a relatively small number of experiments. [Pg.168]
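The core of RSM is fitting a full second-order model to a small, structured set of design points; the sketch below uses a hypothetical yield function and a central-composite-style design in coded units (function, design points, and noise level are all assumptions of the example):

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical transesterification yield as a function of two coded
# variables (e.g. temperature and catalyst amount), with a maximum.
def yield_pct(x1, x2):
    return 90.0 - 3.0 * (x1 - 0.2)**2 - 2.0 * (x2 + 0.1)**2 + x1 * x2

# Small central-composite-style set of design points in coded units:
# 4 factorial points, 4 axial points, 3 centre replicates.
pts = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-1.4, 0], [1.4, 0], [0, -1.4], [0, 1.4],
                [0, 0], [0, 0], [0, 0]], dtype=float)
y = np.array([yield_pct(a, b) for a, b in pts])
y += rng.normal(0.0, 0.05, len(y))               # experimental noise

# Full second-order response-surface model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = pts[:, 0], pts[:, 1]
A = np.column_stack([np.ones(len(pts)), x1, x2,
                     x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(A, y, rcond=None)
```

Eleven experiments suffice to estimate all six coefficients, including the x1·x2 interaction that one-factor-at-a-time experiments cannot see.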

A detailed explanation of the derivation of PCA is available in a review paper by Svante (63). In many ways, PLS is similar to PCA except that it looks for correlations between matrices, or a matrix and a vector. Robust calibrations can be created using PLS because the correlation is multivariate in nature, making it less susceptible to random noise. PLS captures the highest variance in the data set and correlates both data blocks simultaneously. Unlike PCA, where independent score sets for each data block are calculated, a common link or weight loading vector (w) is calculated. A regression coefficient is used to predict the dependent variables. [Pg.331]


