Dimensional-reduction method

Readers with experience in chemometrics will have noticed that, like principal components analysis (PCA), MDS is a dimensionality reduction method. For each molecule, a large number of attributes (similarity to each other molecule) is reduced to a much smaller number of coordinates in an abstract property space, which reproduce the original data within an established error. The pertinent difference is that PCA uses the matrix of correlations between a set of (redundant) properties, which are usually obtained from a table of those properties for an initial set of molecules. In contrast, MDS uses a matrix of similarities between each pair of molecules (or substituents). [Pg.79]
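To make the distinction concrete, the following minimal sketch embeds the same toy data both ways: PCA starting from the property table, and classical MDS starting from the matrix of pairwise dissimilarities. The property table is synthetic and the use of Euclidean distance as the dissimilarity is an assumption made for illustration; with that choice the two configurations agree up to rotation and reflection.

```python
import numpy as np

# Illustrative "molecule x descriptor" table: 6 molecules, 4 redundant
# properties. The values are synthetic, for demonstration only.
rng = np.random.default_rng(0)
props = rng.normal(size=(6, 4))

# PCA starts from the property table itself (via its covariance/correlation
# structure) and projects the molecules onto the leading eigenvectors.
centered = props - props.mean(axis=0)
cov = np.cov(centered, rowvar=False)
evals, evecs = np.linalg.eigh(cov)
order = np.argsort(evals)[::-1]
pca_coords = centered @ evecs[:, order[:2]]

# Classical MDS starts instead from a matrix of pairwise (dis)similarities,
# here Euclidean distances between the molecules' property vectors.
D = np.linalg.norm(props[:, None, :] - props[None, :, :], axis=-1)
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n          # double-centring matrix
B = -0.5 * J @ (D ** 2) @ J                  # Gram matrix from distances
evals, evecs = np.linalg.eigh(B)
order = np.argsort(evals)[::-1]
mds_coords = evecs[:, order[:2]] * np.sqrt(evals[order[:2]])
```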

Collective Variables Based on Dimensional Reduction Methods... [Pg.32]

Linear approaches to spectral dimensionality reduction make the assumption that the data lies on or near a low-dimensional subspace. In such cases, linear spectral dimensionality reduction methods seek to learn the basis vectors of this low-dimensional subspace so that the input data can be projected onto the linear subspace. The two main methods for linear spectral dimensionality reduction, Principal Components Analysis and Multidimensional Scaling, are both described in this section. Although more powerful nonlinear approaches have been presented in recent years, these linear techniques are still widely used and are worthy of attention since they provide the basis for some of the subsequent nonlinear spectral dimensionality reduction algorithms. [Pg.9]
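As a hedged illustration of this linear-subspace assumption (the dataset and parameter choices below are invented for the example, not taken from the text), the following scikit-learn sketch embeds points lying near a 2-D plane in a 10-D space with both linear methods:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS

# Synthetic data lying near a 2-D linear subspace of a 10-D space.
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 2))
basis = rng.normal(size=(2, 10))
X = latent @ basis + 0.01 * rng.normal(size=(200, 10))

# PCA learns the basis vectors of the subspace and projects onto them.
pca_embedding = PCA(n_components=2).fit_transform(X)

# Metric MDS recovers an equivalent configuration from pairwise distances.
mds_embedding = MDS(n_components=2).fit_transform(X)
```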

Isomap [2], one of the first true nonlinear spectral dimensionality reduction methods, extends metric MDS to handle nonlinear manifolds. Whereas metric MDS measures inter-point Euclidean distances to obtain a feature matrix, Isomap measures the inter-point manifold distances by approximating geodesics. The use of manifold distances can often lead to a more accurate and robust measure of distances between points, so that points that are far apart along the manifold in the high-dimensional space are mapped as far apart in the low-dimensional space (Fig. 2.3). An example low-dimensional embedding of the S-Curve dataset (Fig. 2.1) found using Isomap is given in Fig. 2.4. [Pg.12]
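The structure of Isomap described above (neighbourhood graph, shortest-path geodesics, then the classical MDS step) can be sketched as follows; the neighbourhood size k = 10 and the S-Curve sample size are illustrative assumptions, not values from the text:

```python
import numpy as np
from sklearn.datasets import make_s_curve
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import shortest_path

X, _ = make_s_curve(n_samples=500, random_state=0)

# 1. Build a k-nearest-neighbour graph; edge weights are Euclidean distances.
knn = kneighbors_graph(X, n_neighbors=10, mode='distance')

# 2. Approximate geodesic (manifold) distances by shortest paths in the graph.
G = shortest_path(knn, method='D', directed=False)

# 3. Apply the classical MDS step to the geodesic distance matrix.
n = G.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (G ** 2) @ J
evals, evecs = np.linalg.eigh(B)
order = np.argsort(evals)[::-1][:2]
embedding = evecs[:, order] * np.sqrt(np.maximum(evals[order], 0))
```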

An interesting yet often overlooked area within dimensionality reduction is the case where the data is drawn from multiple manifolds. That is, rather than being sampled from a single manifold X ⊂ ℝᴰ, the data is made up of samples from more than one manifold, X = {X₁, X₂, ..., Xₚ}, where each Xᵢ ⊂ ℝᴰ. Without extensions, traditional spectral dimensionality reduction methods will often fail to recover the underlying manifolds of such datasets. [Pg.32]
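The excerpt does not prescribe a fix, but one common workaround is to separate the manifolds first (for example by density-based clustering) and then embed each one independently. A rough sketch under that assumption, using two shifted copies of the S-Curve as stand-ins for two manifolds:

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_s_curve
from sklearn.manifold import Isomap

# Two well-separated copies of the S-Curve stand in for two manifolds.
A, _ = make_s_curve(n_samples=300, random_state=0)
B, _ = make_s_curve(n_samples=300, random_state=1)
X = np.vstack([A, B + 10.0])          # shift the second manifold away

# Separate the manifolds first, then embed each one on its own.
labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(X)
embeddings = {}
for label in set(labels) - {-1}:      # -1 marks DBSCAN noise points
    part = X[labels == label]
    embeddings[label] = Isomap(n_neighbors=10, n_components=2).fit_transform(part)
```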

It is not unreasonable to expect data drawn from real world experiments to contain high levels of noise and/or outliers. Spectral dimensionality reduction methods are highly susceptible to noise, as shown in Fig. 3.4. As the noise level increases, the measured performance of spectral dimensionality reduction decreases. This is not surprising, as high noise levels make it difficult for the underlying manifold to be adequately and accurately modelled. Therefore, there is a need for methods that enable spectral dimensionality reduction to be employed in the presence of noisy data. [Pg.34]
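One way to quantify this degradation (not taken from the text; the noise levels, neighbourhood size, and use of the trustworthiness score are assumptions made for illustration) is to perturb a known manifold and measure how well local neighbourhoods survive the embedding:

```python
import numpy as np
from sklearn.datasets import make_s_curve
from sklearn.manifold import Isomap, trustworthiness

X, _ = make_s_curve(n_samples=500, random_state=0)
rng = np.random.default_rng(0)

for noise in (0.0, 0.1, 0.3, 0.5):
    noisy = X + rng.normal(scale=noise, size=X.shape)
    Y = Isomap(n_neighbors=10, n_components=2).fit_transform(noisy)
    # Trustworthiness near 1 means local neighbourhoods are preserved.
    score = trustworthiness(noisy, Y, n_neighbors=10)
    print(f"noise std {noise:.1f}: trustworthiness {score:.3f}")
```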

When using those spectral dimensionality reduction methods that do not exhibit a prominent gap in the spectrum of eigenvalues, other intrinsic dimensionality estimation methods need to be employed. A coarse division of these algorithms can be made according to whether they estimate the intrinsic dimensionality at a local or a global scale. The rest of this section follows this grouping and discusses the advantages of each approach. [Pg.44]
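As a hedged sketch of the two flavours (both estimators below are simple illustrative choices, not the specific algorithms the chapter surveys): a global estimate can look for the largest gap in the full PCA eigenvalue spectrum, while a local estimate runs PCA on each neighbourhood and aggregates the results:

```python
import numpy as np
from sklearn.datasets import make_s_curve
from sklearn.neighbors import NearestNeighbors

X, _ = make_s_curve(n_samples=1000, random_state=0)

# Global estimate: look for the largest gap in the PCA eigenvalue spectrum.
centered = X - X.mean(axis=0)
eigvals = np.linalg.eigvalsh(np.cov(centered, rowvar=False))[::-1]
ratios = eigvals[:-1] / eigvals[1:]        # large ratio => gap after that index
global_dim = int(np.argmax(ratios)) + 1

# Local estimate: PCA on each neighbourhood, then aggregate.
nbrs = NearestNeighbors(n_neighbors=20).fit(X)
_, idx = nbrs.kneighbors(X)
local_dims = []
for neighbourhood in idx:
    patch = X[neighbourhood] - X[neighbourhood].mean(axis=0)
    ev = np.linalg.eigvalsh(np.cov(patch, rowvar=False))[::-1]
    # Count components needed to explain 95 % of the local variance.
    cum = np.cumsum(ev) / ev.sum()
    local_dims.append(int(np.searchsorted(cum, 0.95)) + 1)
local_dim = int(np.round(np.mean(local_dims)))
```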

As such, care should be taken when seeking to embed a dataset into higher dimensions. The performance of spectral dimensionality reduction methods, and in fact nonlinear dimensionality reduction methods in general, is called into question in such cases. [Pg.50]

As with the problem of estimating intrinsic dimensionality (Chap. 4), some spectral dimensionality reduction methods can be seen to provide a solution to the problem 'naturally'. That is, new data points can be mapped into the low-dimensional space without the need for an extra algorithm. The simplest of such methods is PCA [1], which can be thought of as learning a transformation matrix that projects points from the high-dimensional to the low-dimensional space. Recall from Sect. 2.2.1 that PCA computes the covariance matrix F that corresponds to the amount the data varies in... [Pg.53]
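A minimal numpy sketch of this projection-matrix view of PCA (the data is synthetic and the variable names are mine, not the text's notation): once the eigenvectors of the covariance matrix are known, mapping a new point into the low-dimensional space is a single matrix product, with no retraining needed:

```python
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 5))

# Learn the PCA projection from the training data.
mean = X_train.mean(axis=0)
cov = np.cov(X_train - mean, rowvar=False)      # the covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)
W = eigvecs[:, np.argsort(eigvals)[::-1][:2]]   # top-2 basis vectors

# Out-of-sample extension: project a new point with the learned matrix.
x_new = rng.normal(size=(1, 5))
y_new = (x_new - mean) @ W
```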

Fig. 7.1 Citations of four of the leading spectral dimensionality reduction methods taken between 2001 and 2010. The citation counts were obtained using the Web of Knowledge service (http://wok.mimas.ac.uk).
