
Spectral dimensionality reduction

Abstract In this chapter a common mathematical framework is provided which forms the basis for subsequent chapters. Generic aspects are covered, after which specific dimensionality reduction approaches are briefly described. [Pg.7]

Keywords: Spectral dimensionality reduction · Algorithms · Spectral graph theory [Pg.7]

Before addressing the open problems it is important to understand the problem domain itself, along with the techniques that have been proposed to perform spectral dimensionality reduction. To fully understand and appreciate the open problems, they need to be described in terms of a common mathematical framework. By doing so, the problems described in later sections can be coherently addressed in relation to a common frame of reference. [Pg.7]

This section begins by providing a general mathematical setting within which both spectral dimensionality reduction and the associated open problems can be described. Then key algorithms, both linear and nonlinear, are briefly described so as to provide an important point of reference for the later discussion of open problems. [Pg.7]


Keywords: Manifold learning · Spectral dimensionality reduction · Medical image... [Pg.2]

Such practical and theoretical motivations drive the need for automatic methods that can reduce the dimensionality of a dataset in an intelligent way. Spectral dimensionality reduction is one such family of methods. Spectral dimensionality reduction seeks to transform the high-dimensional data into a lower dimensional space that retains certain properties of the subspace or submanifold upon which the data lies. This transformation is achieved via the spectral decomposition of a square symmetric... [Pg.2]

H. Strange and R. Zwiggelaar, Open Problems in Spectral Dimensionality Reduction, SpringerBriefs in Computer Science, DOI 10.1007/978-3-319-03943-5_1, ... [Pg.2]

The purpose of this book is to organise and directly address some of the key problems that are associated with spectral dimensionality reduction. As already mentioned, all of these problems arise from certain assumptions made either by spectral dimensionality reduction algorithms, or users of these algorithms, about the problem domain. [Pg.3]

As such, there is a lively interplay between the algorithms themselves, the general setting within which they are framed, the assumptions that specific algorithms make, and the problems that arise from such assumptions. This book can therefore be used as a reference for those seeking to quickly understand spectral dimensionality reduction and the potential solutions to various problems. It can also serve as an introduction to the field for those new to spectral dimensionality reduction. [Pg.4]

To effectively analyse spectral dimensionality reduction and the associated problems it is useful to frame the methodology within a general setting. As the name suggests, at the heart of spectral dimensionality reduction is the spectral decomposition of a square symmetric feature matrix. Different techniques can be distinguished based on the construction of this feature matrix and the eigenvectors that are subsequently used (i.e. smallest or largest). This feature matrix aims to capture certain properties of the... [Pg.7]

A more formal definition of spectral dimensionality reduction can be obtained by filling in some of the gaps found in Definition 1. As previously mentioned, a feature matrix is built from X that aims to capture certain properties of the data and will often represent subspace or submanifold properties. Given the original data X, the feature matrix F is built such that... [Pg.8]

It is this similarity matrix that distinguishes various spectral dimensionality reduction techniques. For example, F could measure the covariance of X as in Principal Components Analysis [1], or the geodesic interpoint distances as in Isomap [2]. [Pg.8]
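To make the role of the feature matrix concrete, the covariance choice used by Principal Components Analysis can be sketched as follows. This is a minimal illustrative sketch, not code from the text; the function name and toy data are assumptions.

```python
import numpy as np

def covariance_feature_matrix(X):
    """Return the sample covariance of the rows of X: a square symmetric F."""
    Xc = X - X.mean(axis=0)            # centre each feature
    return (Xc.T @ Xc) / (len(X) - 1)  # (n_features, n_features)

X = np.random.default_rng(0).normal(size=(50, 3))
F = covariance_feature_matrix(X)
print(np.allclose(F, F.T))  # True: F is symmetric, as spectral methods require
```

An Isomap-style feature matrix would instead be built from geodesic inter-point distances, but the common requirement is the same: F must be square and symmetric so that its eigendecomposition is well behaved.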

Linear approaches to spectral dimensionality reduction make the assumption that the data lies on or near a low-dimensional subspace. In such cases, linear spectral dimensionality reduction methods seek to learn the basis vectors of this low-dimensional subspace so that the input data can be projected onto the linear subspace. The two main methods for linear spectral dimensionality reduction, Principal Components Analysis and Multidimensional Scaling, are both described in this section. Although more powerful nonlinear approaches have been presented in recent years, these linear techniques are still widely used and are worthy of attention since they provide the basis for some of the subsequent nonlinear spectral dimensionality reduction algorithms. [Pg.9]
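As a concrete illustration of the linear case, PCA can be sketched as an eigendecomposition of the covariance matrix, keeping the eigenvectors with the largest eigenvalues as the basis of the subspace. This is a hedged sketch; the function name and random data are assumptions for illustration.

```python
import numpy as np

def pca_embed(X, d):
    """Project centred X onto the d leading eigenvectors of its covariance."""
    Xc = X - X.mean(axis=0)
    C = (Xc.T @ Xc) / (len(X) - 1)   # square symmetric feature matrix
    vals, vecs = np.linalg.eigh(C)   # eigenvalues in ascending order
    basis = vecs[:, ::-1][:, :d]     # d eigenvectors with largest eigenvalues
    return Xc @ basis                # coordinates in the linear subspace

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
Y = pca_embed(X, 2)
print(Y.shape)  # (100, 2)
```

Note that the embedding is obtained by projection, which is what restricts PCA to linear subspaces; the nonlinear methods below replace this step.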

The squared distance matrix in its raw form is not positive semi-definite, so cannot be used as the feature matrix for spectral dimensionality reduction. Therefore, it needs to be converted to a Gram, or inner-product, matrix through the following transformation ... [Pg.10]
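The transformation referred to above is the classical double-centring step: with the centring matrix H = I - (1/n)11ᵀ, the Gram matrix is B = -½ H D² H. A minimal sketch (variable names are illustrative), with a sanity check that B recovers the inner products of the centred points:

```python
import numpy as np

def double_centre(D2):
    """Convert a squared-distance matrix D2 into a Gram matrix B = -1/2 H D2 H."""
    n = len(D2)
    H = np.eye(n) - np.ones((n, n)) / n  # centring matrix
    return -0.5 * H @ D2 @ H

rng = np.random.default_rng(2)
X = rng.normal(size=(6, 3))
Xc = X - X.mean(axis=0)
D2 = ((X[:, None] - X[None]) ** 2).sum(-1)   # squared Euclidean distances
B = double_centre(D2)
print(np.allclose(B, Xc @ Xc.T))  # True: B equals the centred inner products
```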

Nonlinear spectral dimensionality reduction techniques seek to alleviate this problem by modelling the data not using a subspace, but a submanifold. The data is... [Pg.11]

This section reviews some of the most popular methods for nonlinear spectral dimensionality reduction. This list of methods is by no means exhaustive; rather, the methods included in this section are chosen for their didactic value and also their popularity. Each method corresponds to an important and different paradigm of spectral dimensionality reduction; as such, they are each landmarks within the landscape of spectral dimensionality reduction and provide a brief but sufficient survey of the main trends within this area. [Pg.11]

Fig. 2.3 Points sampled from a simple horseshoe shaped manifold (a). The two distances in (b) show the difference between distances as measured across the manifold and the Euclidean distance. The two end points are connected by the dotted line according to the Euclidean distance. However, their manifold distance would be the sum of inter-point distances on the path between the two points. For nonlinear spectral dimensionality reduction techniques, the manifold distances should be used so that the two end points are mapped as far away in the low-dimensional space...
Isomap [2], one of the first true nonlinear spectral dimensionality reduction methods, extends metric MDS to handle nonlinear manifolds. Whereas metric MDS measures inter-point Euclidean distances to obtain a feature matrix, Isomap measures the inter-point manifold distances by approximating geodesics. The use of manifold distances can often lead to a more accurate and robust measure of distances between points, so that points that are far away according to manifold distances, as measured in the high-dimensional space, are mapped as far away in the low-dimensional space (Fig. 2.3). An example low-dimensional embedding of the S-Curve dataset (Fig. 2.1) found using Isomap is given in Fig. 2.4. [Pg.12]
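The geodesic approximation at the heart of Isomap can be sketched as all-pairs shortest paths over a k-nearest-neighbour graph. This is a simplified sketch using a Floyd-Warshall relaxation; the function name and parameters are assumptions, and a practical implementation would use a faster shortest-path routine.

```python
import numpy as np

def geodesic_distances(X, k=2):
    """Approximate manifold distances: k-NN graph plus all-pairs shortest paths."""
    n = len(X)
    D = np.sqrt(((X[:, None] - X[None]) ** 2).sum(-1))  # Euclidean distances
    G = np.full((n, n), np.inf)
    np.fill_diagonal(G, 0.0)
    nn = np.argsort(D, axis=1)[:, 1:k + 1]  # k nearest neighbours per point
    for i in range(n):
        G[i, nn[i]] = D[i, nn[i]]
        G[nn[i], i] = D[i, nn[i]]           # keep the graph symmetric
    for m in range(n):                       # Floyd-Warshall relaxation
        G = np.minimum(G, G[:, m:m + 1] + G[m:m + 1, :])
    return G

# On a curved arc, the geodesic between the end points exceeds the straight line
t = np.linspace(0, np.pi, 20)
arc = np.stack([np.cos(t), np.sin(t)], axis=1)  # half-circle in the plane
G = geodesic_distances(arc, k=2)
print(G[0, -1] > 2.0)  # True: path along the arc ~ pi, Euclidean chord = 2
```

The resulting matrix of geodesic distances is then double-centred and eigendecomposed exactly as in metric MDS.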

Spectral dimensionality reduction seeks to obtain a low-dimensional embedding of a high-dimensional dataset through the eigendecomposition of a specially constructed feature matrix. This feature matrix will capture certain properties of the data such as inter-point covariance or local linear reconstruction weights. The different methods of formulating this feature matrix will have different implications for various open problems, as will be seen in later chapters. [Pg.21]
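The final step common to all of these methods, obtaining the embedding from the eigendecomposition of the feature matrix, can be sketched as follows. The function name is hypothetical, and a classical-MDS-style Gram matrix is used purely for illustration.

```python
import numpy as np

def spectral_embedding(F, d):
    """Embed via the d largest eigenpairs of a square symmetric feature matrix F."""
    vals, vecs = np.linalg.eigh(F)             # ascending eigenvalues
    vals = np.clip(vals[::-1][:d], 0.0, None)  # top-d, guard tiny negatives
    vecs = vecs[:, ::-1][:, :d]
    return vecs * np.sqrt(vals)                # scale eigenvectors by sqrt(eigenvalue)

# Using a Gram (inner-product) feature matrix recovers the centred data geometry
rng = np.random.default_rng(4)
X = rng.normal(size=(30, 2))
Xc = X - X.mean(axis=0)
Y = spectral_embedding(Xc @ Xc.T, 2)
D_orig = np.sqrt(((Xc[:, None] - Xc[None]) ** 2).sum(-1))
D_emb = np.sqrt(((Y[:, None] - Y[None]) ** 2).sum(-1))
print(np.allclose(D_orig, D_emb))  # True: distances preserved up to rotation
```

What varies between methods is only how F is constructed and whether the smallest or largest eigenvectors are retained; the eigendecomposition machinery itself is shared.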

Lawrence, N.D.: A unifying probabilistic perspective for spectral dimensionality reduction: insights and new models. Journal of Machine Learning Research 13, 1609-1638 (2012)... [Pg.22]





