
Singular Value Decomposition

The use of singular value decomposition (SVD), introduced into chemical engineering by Moore and Downs (Proc. JACC, paper WP-7C, 1981), can give some guidance on the question of which variables to control. They used SVD to select the best tray temperatures. SVD involves expressing the matrix of plant transfer-function steady-state gains as the product of three matrices: a U matrix, a diagonal Σ matrix, and a V^T matrix. [Pg.596]

It is somewhat similar to a canonical transformation, but it differs in that the diagonal Σ matrix contains as its diagonal elements not the eigenvalues of the Kp matrix but its singular values. [Pg.596]

The biggest elements in each column of the U matrix indicate which outputs of the process are the most sensitive. Thus SVD can be used to help select which tray temperatures in a distillation column should be controlled. Example 17.1 from the Moore and Downs paper illustrates the procedure. [Pg.596]

Example 17.1. A nine-tray distillation column separating isopropanol and water has the following steady-state gains between tray temperatures and the manipulated variables. [Pg.596]

The elements in this table are the elements in the steady-state gain matrix of the column, which has 9 rows and 2 columns. [Pg.597]
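That gain table is not reproduced in this excerpt, but the procedure is easy to demonstrate. A minimal numpy sketch follows; the gain values are hypothetical placeholders, not the actual Moore and Downs data.

import numpy as np

# Hypothetical 9x2 steady-state gain matrix (tray temperatures vs. the two
# manipulated variables); illustrative values only, not Example 17.1's data.
Kp = np.array([
    [-0.1,  0.1],
    [-0.2,  0.3],
    [-0.5,  0.8],
    [-1.3,  1.8],
    [-2.7,  3.1],
    [-1.6,  1.9],
    [-0.6,  0.7],
    [-0.2,  0.3],
    [-0.1,  0.1],
])

U, sigma, Vt = np.linalg.svd(Kp, full_matrices=False)

# The largest-magnitude element in each column of U flags the most
# sensitive tray temperature, i.e. the best candidate for control.
for j in range(U.shape[1]):
    tray = np.argmax(np.abs(U[:, j])) + 1  # trays numbered from 1
    print(f"singular value {sigma[j]:.3f}: most sensitive tray = {tray}")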

In NIPALS one starts with an initial vector t with n arbitrarily chosen values (Fig. 31.12). In a first step, the matrix product of the transpose of the n×p table X with the n-vector t is formed, producing the p elements of vector w. Note that in the traditional NIPALS notation, w has a different meaning than that of the weighting vector used in Section 31.3.6. In a second step, the elements of the p-vector w are normalized to unit sum of squares. This prevents values from becoming too small or too large for the purpose of numerical computation. [Pg.134]

The corresponding latent value λ is then defined by means of... [Pg.136]

A crucial operation in the NIPALS algorithm is the calculation of the residual data matrix which is independent of the contributions by the first singular vector. This can be produced by the instruction  [Pg.136]

By construction, all singular vectors in U and V are normalized and mutually orthogonal. [Pg.136]

The NIPALS algorithm is easy to program, particularly with a matrix-oriented computer notation, and is highly efficient when only a few latent vectors are required, such as for the construction of a two-dimensional biplot. It is also suitable for implementation in personal or portable computers with limited hardware resources. [Pg.136]
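As an illustration of the steps just described, here is a Python sketch of a NIPALS loop; the function name, starting vector, and convergence settings are illustrative choices, not prescribed by the source.

import numpy as np

def nipals(X, n_components=2, tol=1e-10, max_iter=500):
    # Extract singular vectors one at a time and deflate the residual matrix.
    X = X.astype(float).copy()
    U, S, V = [], [], []
    for _ in range(n_components):
        t = X[:, 0].copy()                  # arbitrary starting n-vector
        for _ in range(max_iter):
            w = X.T @ t                     # step 1: w = X' t  (p-vector)
            w /= np.linalg.norm(w)          # step 2: normalize w to unit sum of squares
            t_new = X @ w                   # step 3: t = X w   (n-vector)
            if np.linalg.norm(t_new - t) < tol * np.linalg.norm(t_new):
                t = t_new
                break
            t = t_new
        lam = np.linalg.norm(t)             # latent (singular) value
        U.append(t / lam)
        S.append(lam)
        V.append(w)
        X -= np.outer(t, w)                 # residual, independent of this singular vector
    return np.array(U).T, np.array(S), np.array(V).T

Because each pass extracts one singular triplet and then deflates X, the cost grows only with the number of latent vectors actually requested, which is what makes the algorithm attractive when just two are needed for a biplot.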

2.2.3 Other Matrix Capabilities in MATLAB / 2.2.3.1 Singular Value Decomposition [Pg.64]

PCA is a statistical technique that has been used ubiquitously in multivariate data analysis. Given a set of input vectors described by partially cross-correlated variables, PCA will transform them into a set that is described by a smaller number of orthogonal variables, the principal components, without a significant loss in the variance of the data. The principal components correspond to the eigenvectors of the covariance matrix, a symmetric matrix that contains the variances of the variables in its diagonal elements and the covariances in its off-diagonal elements (15). [Pg.148]

The eigenvalues of this matrix represent the variances of the principal components. PCA reduces the dimensionality by eliminating variables that contribute the least to the variance of the data, i.e., those with the smallest eigenvalues. After diagonalization of the covariance matrix, the original data can be transformed as in Eq. (17). [Pg.148]
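Eqs. (15) and (17) are not reproduced in this excerpt, but the covariance-eigendecomposition route to PCA can be sketched in a few lines of numpy; the data and the choice of two retained components are illustrative.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                    # 100 samples, 5 variables
X[:, 3] = X[:, 0] + 0.1 * rng.normal(size=100)   # introduce cross-correlation

Xc = X - X.mean(axis=0)                          # mean-center the data
C = np.cov(Xc, rowvar=False)                     # 5x5 symmetric covariance matrix

evals, evecs = np.linalg.eigh(C)                 # eigh returns eigenvalues ascending
order = np.argsort(evals)[::-1]                  # sort by variance, descending
evals, evecs = evals[order], evecs[:, order]

k = 2                                            # drop the smallest-eigenvalue directions
scores = Xc @ evecs[:, :k]                       # transformed, reduced data
print("fraction of variance kept:", evals[:k].sum() / evals.sum())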

Singular value decomposition (SVD) is a method similar to PCA in that it allows for the determination of the rank of a matrix. An M × N matrix, M, can be expressed as the product of U, an orthogonal M × M matrix, S, an M × N diagonal matrix with real, nonnegative elements, and W^T, the transpose of an orthogonal N × N matrix, Eq. (18). [Pg.148]

If the rank of M is less than M and N, then some elements of the matrix S will be zero, and the number of nonzero elements will equal the rank of M. SVD has been used successfully to reduce the dimensionality of the descriptor space of chemical libraries. Xie and co-workers report mapping chemicals into two dimensions while preserving the high-dimensional distances to within 10%, with success rates ranging from 30 to 100%, depending on the types of compounds and descriptors used. [Pg.149]
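A short numpy sketch of rank determination via SVD follows; in floating-point arithmetic the nominally zero singular values come out as tiny numbers, so a tolerance is needed. The matrix here is a synthetic example constructed to have rank 2.

import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(6, 2)) @ rng.normal(size=(2, 4))  # 6x4 matrix of rank 2

U, s, Wt = np.linalg.svd(M)                 # M = U @ diag(s) @ Wt
tol = max(M.shape) * np.finfo(float).eps * s[0]
rank = int(np.sum(s > tol))                 # singular values above the noise floor
print(s)                                    # two significant values, the rest ~1e-16
print("rank =", rank)                       # 2, matching np.linalg.matrix_rank(M)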


Let u be a vector-valued stochastic variable with dimension D × 1 and with covariance matrix Ru of size D × D. The key idea is to linearly transform all observation vectors, u_i, to new variables, z_i = W^T u_i, and then solve the optimization problem (1) with u_i replaced by z_i. We choose the transformation so that the covariance matrix of z is diagonal and (more importantly) none of its eigenvalues is too close to zero. (Loosely speaking, the eigenvalues close to zero are those that are responsible for the large variance of the OLS solution.) In order to find the desired transformation, a singular value decomposition of Ru is performed, yielding... [Pg.888]
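The excerpt breaks off before the decomposition itself, but the construction can be sketched as follows. This is a hedged reconstruction with illustrative names (pc_transform, eps), not the authors' code.

import numpy as np

def pc_transform(U_obs, eps=1e-8):
    # Rows of U_obs are the observation vectors u_i (each of dimension D).
    Uc = U_obs - U_obs.mean(axis=0)
    Ru = Uc.T @ Uc / len(Uc)            # D x D sample covariance matrix
    Q, lam, _ = np.linalg.svd(Ru)       # for symmetric Ru: Ru = Q diag(lam) Q^T
    keep = lam > eps * lam[0]           # discard eigenvalues too close to zero
    W = Q[:, keep]                      # transformation matrix
    Z = U_obs @ W                       # rows are z_i = W^T u_i
    return Z, W                         # cov(z) is diagonal and well conditioned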

Hendler R W and Shrager R I 1994 Deconvolutions based on singular value decomposition and the pseudoinverse: a guide for beginners J. Biochem. Biophys. Methods 28 1-33 [Pg.2970]

Furthermore, one may need to employ data transformation. For example, sometimes it might be a good idea to use the logarithms of variables instead of the variables themselves. Alternatively, one may take the square roots, or, in contrast, raise variables to the nth power. However, genuine data transformation techniques involve far more sophisticated algorithms. As examples, we shall later consider Fast Fourier Transform (FFT), Wavelet Transform and Singular Value Decomposition (SVD). [Pg.206]

It may look odd to treat the singular value decomposition (SVD) technique as a tool for data transformation, simply because SVD is essentially the same as PCA. However, if we recall how PCR (principal component regression) works, then we are indeed allowed to handle SVD in the way mentioned above. What we do with PCR is, first of all, to transform the initial data matrix X in the way described by Eqs. (10) and (11). [Pg.217]

Widely used methods of data transformation are Fast Fourier and Wavelet Transformations or Singular Value Decomposition... [Pg.224]

C. F. Moore, "Application of Singular Value Decomposition to the Design, Analysis, and Control of Industrial Processes," Proceedings of the American Control Conference, Boston, Mass., 1986, p. 643. [Pg.80]

Xie D, Tropsha A, Schlick T. An efficient projection protocol for chemical databases: singular value decomposition combined with truncated-Newton minimization. J Chem Inf Comput Sci 2000;40:167-77. [Pg.373]

The scaled data matrix D is decomposed using singular value decomposition (see Bonvin and Rippin (1990), Hamer (1989), Golub and Van Loan (1983)) into matrices, one of which contains stoichiometric information that can be processed into an acceptable stoichiometry. The decomposition can easily be done by any available software package (e.g. Dongarra et al. (1979), Press et al. (1989)). Upon decomposing one obtains... [Pg.529]

Singular value decomposition (SVD) of a rectangular matrix X is a method which yields at the same time a diagonal matrix of singular values Λ and the two matrices of singular vectors U and V such that... [Pg.40]

An important theorem of matrix algebra, called singular value decomposition (SVD), states that any n×p table X can be written as the matrix product of three terms U, Λ and V... [Pg.89]

It can be proved that the decomposition is always possible and that the solution is unique (except for the algebraic signs of the columns of U and V) [3]. Singular value decomposition of a rectangular table is an extension of the classical work of Eckart and Young [4] on the decomposition of matrices. The decomposition of X into U, V and Λ is illustrated below using a 4×3 data table which has been adapted from Van Borm [5]. (A similar example has been used for the introduction to PCA in Chapter 17.) [Pg.89]
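The Van Borm table is not reproduced in this excerpt, so the numpy sketch below uses an arbitrary rank-3 matrix of the same 4×3 shape to verify the stated properties: exact reconstruction and orthonormal singular vectors, with the decomposition unique up to the signs of paired columns of U and V.

import numpy as np

X = np.array([[1.0, 2.0, 3.0],
              [2.0, 3.0, 5.0],
              [4.0, 1.0, 2.0],
              [3.0, 2.0, 4.0]])

U, lam, Vt = np.linalg.svd(X, full_matrices=False)
V = Vt.T

assert np.allclose(U @ np.diag(lam) @ Vt, X)   # X = U Λ V^T
assert np.allclose(U.T @ U, np.eye(3))         # columns of U are orthonormal
assert np.allclose(V.T @ V, np.eye(3))         # columns of V are orthonormal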

In the previous section we have developed principal components analysis (PCA) from the fundamental theorem of singular value decomposition (SVD). In particular we have shown by means of eq. (31.1) how an n×p rectangular data matrix X can be decomposed into an n×r orthonormal matrix of row-latent vectors U, a p×r orthonormal matrix of column-latent vectors V and an r×r diagonal matrix of latent values Λ. Now we focus on the geometrical interpretation of this algebraic decomposition. [Pg.104]

J. Mandel, Use of the singular value decomposition in regression analysis. Am. Statistician, 36 (1982) 15-24. [Pg.158]

G. H. Golub and C. Reinsch, Singular value decomposition and least squares solutions. Numer. Math., 14 (1970) 403-420. [Pg.159]

Correspondence factor analysis can be described in three steps. First, one applies a transformation to the data which involves one of the three types of closure that have been described in the previous section. This step also defines two vectors of weight coefficients, one for each of the two dual spaces. The second step comprises a generalization of the usual singular value decomposition (SVD) or eigenvalue decomposition (EVD) to the case of weighted metrics. In the third and last step, one constructs a biplot for the geometrical representation of the rows and columns in a low-dimensional space of latent vectors. [Pg.183]
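These three steps can be sketched in numpy. The sketch below assumes the standard chi-squared metric of correspondence analysis as the particular choice of closure and weights; the function name and the two-dimensional output are illustrative.

import numpy as np

def correspondence_analysis(N, k=2):
    P = N / N.sum()                    # step 1: closure to relative frequencies
    r = P.sum(axis=1)                  # row weight coefficients (masses)
    c = P.sum(axis=0)                  # column weight coefficients (masses)
    # step 2: SVD in the weighted metric defined by the masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    # step 3: row and column coordinates for a k-dimensional biplot
    rows = (U[:, :k] * s[:k]) / np.sqrt(r)[:, None]
    cols = (Vt.T[:, :k] * s[:k]) / np.sqrt(c)[:, None]
    return rows, cols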

In order to accelerate the minimization of Eq. (2.7.12), the data and the kernel can be compressed to a smaller number of variables using singular value decompositions (SVD) of K,... [Pg.170]
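A minimal sketch of such a compression, assuming a discretized kernel matrix K and data vector d (the names and the truncation level r are illustrative): the data are projected onto the r leading left singular vectors of K, so the subsequent fit is carried out with r variables instead of the full data length.

import numpy as np

def compress(K, d, r):
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    K_c = np.diag(s[:r]) @ Vt[:r, :]   # compressed kernel (r x p)
    d_c = U[:, :r].T @ d               # compressed data (length r)
    return K_c, d_c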

Secondly, although stable solutions covering the entire temporal range of interest are attainable, the spectra may not be well resolved; that is, for a given dataset and noise level, a limit exists on the smallest resolvable structure (or separation of structures) in the Laplace inversion spectrum [54]. Estimates of this resolution parameter can be made based on a singular value decomposition analysis of K and the signal-to-noise ratio of the data [56]. It is important to keep the concept of spectral resolution in mind in order to interpret LI results, such as DDIF, properly. [Pg.347]

Here the pair-force fij(ri, rj) is unknown, so a model pair-force fij(ri, rj; p1, p2, ..., pm) is chosen, which depends linearly upon m unknown parameters p1, p2, ..., pm. Consequently, the set of Eq. (8-2) is a system of linear equations in the m unknowns p1, p2, ..., pm. The system (8-2) can be solved using the singular value decomposition (SVD) method if n > m (over-determined system), and the resulting solution will be unique in a least-squares sense. If m > n, more equations from later snapshots along the MD trajectory should be added to the current set so that the number of equations exceeds the number of unknowns. Mathematically, n = qN > m, where q is the number of MD snapshots used to generate the system of equations. [Pg.203]

Clearly, the total number of unknowns that need to be determined is m = a + b + ... + z, and a solution set for the parameters p1, p2, ..., pm is determined using singular value decomposition or any other suitable method. The mean pair-force corresponding to the potential of mean force can be obtained in a systematic manner by averaging a number of sets of solutions for the parameters p1, p2, ..., pm obtained along the atomistic MD trajectory, in which the phase space is sampled extensively. [Pg.203]
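A numpy sketch of solving such an over-determined system in the least-squares sense follows; np.linalg.lstsq uses SVD-based LAPACK routines internally, and the synthetic system below merely stands in for Eq. (8-2).

import numpy as np

rng = np.random.default_rng(2)
n, m = 200, 10                     # n = qN equations, m unknown parameters
A = rng.normal(size=(n, m))        # coefficient matrix of the linear system
p_true = rng.normal(size=m)
b = A @ p_true + 0.01 * rng.normal(size=n)   # "measured" forces with noise

p, residuals, rank, s = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(p, p_true, atol=0.05))     # recovered parameters, unique in
                                             # the least-squares sense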





A Useful Analytical Technique Singular-Value Decomposition Followed by Global Fitting

B proof that the weights in trilinear PLS1 can be obtained from a singular value decomposition

Eigenvalue analysis Singular Value Decomposition

Generalized singular value decomposition

Hankel singular value decomposition

Linear algebra Singular Value Decomposition

Linear prediction singular value decomposition

Matlab Singular Value Decomposition

Matrix computations singular-value decomposition

Matrix singular value decomposition

Singular

Singular Value Decomposition Algebra

Singular Value Decomposition matrix inverse

Singular value decomposition algorithm

Singular value decomposition analysis

Singular value decomposition component estimation

Singular value decomposition definition

Singular value decomposition evaluation

Singular value decomposition method

Singular value decomposition orthogonal matrices

Singular value decomposition principal component analysis

Singular value decomposition separation

Singular value decomposition theorem

Singular-value decomposition (SVD)

Singularities

The Singular Value Decomposition and Least Squares Problems

The Singular Value Decomposition, SVD
