
Linear discriminant analysis separation, classes

A first distinction which is often made is that between methods focusing on discrimination and those directed towards modelling classes. Most methods explicitly or implicitly try to find a boundary between classes. Some methods, such as linear discriminant analysis (LDA, Sections 33.2.2 and 33.2.3), are designed to find explicit boundaries between classes, while the k-nearest neighbours (k-NN, Section 33.2.4) method does this implicitly. Methods such as SIMCA (Section 33.2.7) put the emphasis more on similarity within a class than on discrimination between classes. Such methods are sometimes called disjoint class modelling methods. While the discrimination-oriented methods build models based on all the classes concerned in the discrimination, the disjoint class modelling methods model each class separately. [Pg.208]
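To make this contrast concrete, the following is a minimal sketch in Python, assuming scikit-learn and NumPy are available. The single LDA model is fitted on all classes jointly, while the per-class PCA models stand in (crudely) for a disjoint class modelling approach such as SIMCA: each class is modelled separately, and an object's membership is judged by its residual distance from that class's model.

```python
# Minimal sketch: discrimination vs. disjoint class modelling.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = make_classification(n_samples=200, n_features=10, n_informative=5,
                           n_redundant=0, random_state=0)

# Discrimination: one model, built from all classes, yields a boundary.
lda = LinearDiscriminantAnalysis().fit(X, y)

# Disjoint class modelling (SIMCA-like stand-in): one PCA model per class.
class_models = {c: PCA(n_components=3).fit(X[y == c]) for c in np.unique(y)}

def residual(model, x):
    """Distance of x from a class's principal-component subspace."""
    x_hat = model.inverse_transform(model.transform(x.reshape(1, -1)))
    return float(np.linalg.norm(x - x_hat))

print(lda.predict(X[:1]))                                       # boundary-based assignment
print({c: residual(m, X[0]) for c, m in class_models.items()})  # per-class fit
```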

In the class discrimination methods or hyperplane techniques, of which linear discriminant analysis and the linear learning machine are examples, the equation of a plane or hyperplane is calculated that separates one class from another. These methods work well if prior knowledge allows the analyst to assume that the test objects must... [Pg.244]

Because of the aforementioned LDA hypotheses, the ellipses of different categories have equal eccentricity and axis orientation; they differ only in their location in the plane. By connecting the intersection points of each pair of corresponding ellipses, a straight line is identified which corresponds to the delimiter between the two classes (see Fig. 2.15B). For this reason, this technique is called linear discriminant analysis. The directions which maximize the separation between classes are called LDA canonical variables. [Pg.88]

Linear discriminant analysis (LDA) [41] separates two data classes of feature vectors by constructing a hyperplane defined by a linear discriminant function ... [Pg.222]
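The formula itself is cut off in the excerpt; in the usual notation (an assumption here, since the original equation is not reproduced), the linear discriminant function for a feature vector x takes the form

```latex
g(\mathbf{x}) = \mathbf{w}^{\mathsf{T}}\mathbf{x} + w_0
```

where w is the weight (normal) vector of the hyperplane and w0 its offset; the hyperplane g(x) = 0 is the class boundary, with objects assigned to one class for g(x) > 0 and to the other for g(x) < 0.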

Linear discriminant analysis (LDA) is aimed at finding a linear combination of descriptors that best separates two or more classes of objects [100]. The resulting transformation (combination) may be used as a classifier to separate the classes. LDA is closely related to principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA), in that all three methods are aimed at identifying linear combinations of variables that best explain the data under investigation. However, LDA and PLS-DA explicitly attempt to model the difference between the classes of data, whereas PCA tries to extract common information for the problem at hand. The difference between LDA and PLS-DA is that LDA is a linear regression-like method whereas PLS-DA is a projection technique... [Pg.392]
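A minimal sketch of this three-way comparison in Python, assuming scikit-learn is available; PLS-DA is emulated here (a common workaround, not a dedicated PLS-DA class) by running PLSRegression against the 0/1 class labels:

```python
# Compare the projections found by LDA, PCA and (emulated) PLS-DA.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cross_decomposition import PLSRegression

X, y = make_classification(n_samples=200, n_features=10, n_informative=4,
                           n_redundant=0, random_state=0)

# LDA: supervised; models the difference between the classes.
lda_scores = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)

# PCA: unsupervised; extracts the directions of largest overall variance.
pca_scores = PCA(n_components=2).fit_transform(X)

# PLS-DA: supervised projection maximizing covariance with the labels.
pls_scores = PLSRegression(n_components=2).fit_transform(X, y)[0]
```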

Discriminant plots were obtained for the adaptive wavelet coefficients which produced the results in Table 2. Although the classifier used in the AWA was BLDA, it was decided to supply the coefficients available upon termination of the AWA to Fisher's linear discriminant analysis, so that we could visualize the spatial separation between the classes. The discriminant plots are produced using the testing data only. There is a good deal of separation for the seagrass data (Fig. 5), while for the paraxylene data (Fig. 6) there is some overlap between the objects of classes 1 and 3. Quite clearly, the butanol data (Fig. 7) pose a challenge in discriminating between the two classes. [Pg.447]
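A minimal sketch of how such discriminant plots can be produced, assuming scikit-learn and Matplotlib and using synthetic data in place of the wavelet coefficients (which are not available here):

```python
# Project held-out test objects onto Fisher discriminant axes and plot them.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the adaptive wavelet coefficients.
X, y = make_classification(n_samples=300, n_features=8, n_informative=4,
                           n_redundant=0, n_classes=3, n_clusters_per_class=1,
                           random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Fit on the training set, but plot the testing data only, as in the text.
scores = LinearDiscriminantAnalysis(n_components=2).fit(X_tr, y_tr).transform(X_te)

for cls in np.unique(y_te):
    plt.scatter(scores[y_te == cls, 0], scores[y_te == cls, 1], label=f"class {cls}")
plt.xlabel("discriminant 1")
plt.ylabel("discriminant 2")
plt.legend()
plt.show()
```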

Discriminant analysis is a supervised learning technique which uses classified dependent data. Here, the dependent data (y values) are not on a continuous scale but are divided into distinct classes. There are often just two classes (e.g. active/inactive, soluble/not soluble, yes/no), but more than two are also possible (e.g. high/medium/low, 1/2/3/4). The simplest situation involves two variables and two classes, and the aim is to find a straight line that best separates the data into its classes (Figure 12.37). With more than two variables, the line becomes a hyperplane in the multidimensional variable space. Discriminant analysis is characterised by a discriminant function, which in the particular case of linear discriminant analysis (the most popular variant) is written as a linear combination of the independent variables ... [Pg.703]
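The formula is truncated in the excerpt; the linear discriminant function presumably takes the usual form of a weighted sum of the independent variables,

```latex
D = c_0 + c_1 x_1 + c_2 x_2 + \dots + c_n x_n
```

and an object is assigned to one class or the other according to whether its value of D falls above or below a cutoff.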

The adaptive least squares (ALS) method [396, 585-588] is a modification of discriminant analysis which separates several activity classes (e.g. data ordered by a rating score) by a single discriminant function. The method has been compared with ordinary regression analysis, linear discriminant analysis, and other multivariate statistical approaches; in most cases the ALS approach was found to be superior in categorizing any number of classes of ordered data. ORMUCS (ordered multicategorial classification using simplex technique) [589] is an ALS-related approach which... [Pg.100]

Such a diagnosis is best obtained using supervised or trained algorithms. A number of such predictive algorithms exist, ranging from those very similar in principle to PCA (i.e. soft independent modeling of class analogy, SIMCA) and linear discriminant analysis (LDA), to more complicated classifiers such as support vector machines (SVMs), which are based on separating spectral classes by complicated, multidimensional separation planes [30]. At the LSpD, ANNs have been used [52-54] for supervised prediction of class memberships. [Pg.208]

Discriminant analysis belongs to supervised pattern recognition and has the aim of assigning objects to one of several pre-determined classes. Only the two-class problem will be treated here briefly. Linear discriminant analysis (LDA) uses a latent variable (discriminant variable) in the feature space that maximally separates two classes of objects. A widely used criterion for class separation is the t-value as known from the statistical t-test (Student test), as defined in equations (19)-(21), with mA, mB being the means, vA, vB the variances, and nA, nB the numbers of objects of the classes A and B, respectively; v is called the pooled variance. [Pg.353]
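Equations (19)-(21) are not reproduced in the excerpt; presumably they define the standard two-sample t statistic with pooled variance, which in the symbols just introduced reads

```latex
t = \frac{m_A - m_B}{\sqrt{v\left(\frac{1}{n_A} + \frac{1}{n_B}\right)}},
\qquad
v = \frac{(n_A - 1)\,v_A + (n_B - 1)\,v_B}{n_A + n_B - 2}
```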

Linear discriminant analysis (LDA), originally proposed by Fisher in 1936 [8], is the oldest and most studied supervised pattern recognition method. As the name suggests, it is a linear technique, that is the decision boundaries separating the classes in the multidimensional space of the variables are linear surfaces (hyperplanes). From a probabilistic standpoint, it is a parametric method, as its underlying hypothesis is that, for each category, the data follow a multivariate normal distribution. This means that the likelihood in Equation (2), for each class, is defined as... [Pg.192]
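Equation (2) is not reproduced in the excerpt, but for p variables the multivariate normal likelihood of class g has the standard density form

```latex
P(\mathbf{x} \mid g) =
\frac{1}{(2\pi)^{p/2}\,\lvert\boldsymbol{\Sigma}_g\rvert^{1/2}}
\exp\!\left[-\tfrac{1}{2}\,
(\mathbf{x}-\boldsymbol{\mu}_g)^{\mathsf{T}}
\boldsymbol{\Sigma}_g^{-1}
(\mathbf{x}-\boldsymbol{\mu}_g)\right]
```

with mean vector μg and covariance matrix Σg estimated for each class; LDA additionally assumes a covariance matrix common to all classes (Σg = Σ), which is what reduces the decision boundaries to hyperplanes.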

The principle of multivariate analysis of variance and discriminant analysis (MVDA) consists of testing the differences between a priori classes (MANOVA) and their maximum separation by modeling (MDA). The variance between the classes is maximized and the variance within the classes is minimized by simultaneous consideration of all observed features. The classification of new objects into the a priori classes, i.e. the reclassification of the learning data set, takes place according to the values of the discriminant functions. These discriminant functions are linear combinations of the optimum set of the original features for class separation. The mathematical fundamentals of MVDA are explained in Section 5.6. [Pg.332]

When using a linear method, such as LDA, the underlying assumption is that the two classes are linearly separable. This, of course, is generally not true. If linear separability is not possible, then with enough samples, the more powerful quadratic discriminant analysis (QDA) works better, because it allows the hypersurface that separates the classes to be curved (quadratic). Unfortunately, the clinical reality of small-sized data sets denies us this choice. [Pg.105]
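A minimal sketch of the LDA/QDA trade-off, assuming scikit-learn; with a small sample, QDA's extra parameters (a separate covariance matrix per class) are estimated poorly, which is the point made above:

```python
# Cross-validated accuracy of LDA vs. QDA on a small synthetic data set.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=60, n_features=10, n_informative=10,
                           n_redundant=0, random_state=0)

for clf in (LinearDiscriminantAnalysis(), QuadraticDiscriminantAnalysis()):
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(type(clf).__name__, round(float(acc), 3))
```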

Discriminant analysis (Figure 31) [41, 487, 577-581] separates objects with different properties, e.g. active and inactive compounds, by deriving a linear combination of some other features (e.g. of different physicochemical properties) which leads to the best separation of the individual classes. Discriminant analysis is also appropriate for semiquantitative data and for data sets where activities are only characterized in qualitative terms. As in pattern recognition, training sets are used to derive a model, and its stability and predictive ability are checked with the help of different test sets. [Pg.100]

Separation of overlapping classes is not feasible with methods such as discriminant analysis, because they are based on optimal separating hyperplanes. SVMs provide an efficient solution to separating nonlinear boundaries by constructing a linear boundary in a large, transformed version of the feature space. [Pg.198]
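A minimal sketch of this point, assuming scikit-learn: two concentric classes are not linearly separable, so LDA performs near chance level, while a kernel SVM, which builds its linear boundary in an implicitly transformed feature space, separates them well:

```python
# LDA vs. an RBF-kernel SVM on data with a nonlinear class boundary.
from sklearn.datasets import make_circles
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_circles(n_samples=200, noise=0.1, factor=0.4, random_state=0)

for clf in (LinearDiscriminantAnalysis(), SVC(kernel="rbf")):
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(type(clf).__name__, round(float(acc), 3))
```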

The result from cluster analysis presented in Fig. 9-2 is subjected to MVDA (for the mathematical fundamentals, see Section 5.6 or [AHRENS and LAUTER, 1981]). The principle of MVDA is the separation of predicted classes of objects (sampling points). By simultaneous consideration of all the features observed (heavy metal contents), the variance of the discriminant functions is maximized between the classes and minimized within them. The classification of new objects into a priori classes, or the reclassification of the learning data set, is carried out using the values of the discriminant function. These values represent linear combinations of the optimum separation set of the original features. The result of the reclassification is presented as follows ... [Pg.323]

