
Discriminant analysis

The previously mentioned data set with a total of 115 compounds has already been studied by other statistical methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis, and the Partial Least Squares (PLS) method [39]. Thus, the selection of descriptors has already been accomplished. [Pg.508]

Woodruff and co-workers introduced the expert system PAIRS [67], a program that is able to analyze IR spectra in the same manner as a spectroscopist would. Chalmers and co-workers [68] used an approach for the automated interpretation of Fourier transform Raman spectra of complex polymers. Andreev and Argirov developed the expert system EXPIRS [69] for the interpretation of IR spectra. EXPIRS provides a hierarchical organization of the characteristic groups that are recognized by peak detection in discrete frames. Penchev et al. [70] recently introduced a computer system that performs searches in spectral libraries and systematic analysis of mixture spectra. It is able to classify IR spectra with the aid of linear discriminant analysis, artificial neural networks, and the method of k-nearest neighbors. [Pg.530]

Alternatives to Multiple Linear Regression: Discriminant Analysis, Neural Networks and Classification Methods... [Pg.718]

Fig. 12.37 Discriminant analysis defines a discriminant function (dotted line) and a discriminant surface (solid line).
Discriminant analysis is a supervised learning technique which uses classified dependent data. Here, the dependent data (y values) are not on a continuous scale but are divided into distinct classes. There are often just two classes (e.g. active/inactive, soluble/not soluble, yes/no), but more than two is also possible (e.g. high/medium/low, 1/2/3/4). The simplest situation involves two variables and two classes, and the aim is to find a straight line that best separates the data into its classes (Figure 12.37). With more than two variables, the line becomes a hyperplane in the multidimensional variable space. Discriminant analysis is characterised by a discriminant function, which in the particular case of linear discriminant analysis (the most popular variant) is written as a linear combination of the independent variables ... [Pg.719]

The surface that actually separates the classes is orthogonal to this discriminant function, as shown in Figure 12.37, and is chosen to maximise the number of compounds correctly classified. To use the results of a discriminant analysis, one simply calculates the appropriate value of the discriminant function, from which the class can be determined. [Pg.719]
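As a concrete illustration of the procedure just described, the following sketch (not taken from the cited text) fits a two-class, two-descriptor linear discriminant model with scikit-learn; all descriptor values and class labels are invented for illustration.

```python
# Minimal sketch of two-class linear discriminant analysis (illustrative data only).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Two independent variables (descriptors) for eight hypothetical compounds
X = np.array([[1.2, 0.8], [0.9, 1.1], [1.4, 0.7], [1.1, 0.9],   # class 0, e.g. inactive
              [2.3, 2.1], [2.6, 1.9], [2.1, 2.4], [2.5, 2.2]])  # class 1, e.g. active
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

lda = LinearDiscriminantAnalysis().fit(X, y)

# The discriminant function is a linear combination of the variables:
# D(x) = w1*x1 + w2*x2 + c
print("weights w:", lda.coef_[0])
print("intercept c:", lda.intercept_[0])

# Classify an unknown compound from its descriptor values
unknown = np.array([[1.8, 1.5]])
print("discriminant score D:", lda.decision_function(unknown)[0])
print("predicted class:", lda.predict(unknown)[0])
```

The sign of the discriminant score places the unknown on one side or the other of the separating surface, which is exactly the classification rule described above.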

We will explore the two major families of chemometric quantitative calibration techniques that are most commonly employed: the Multiple Linear Regression (MLR) techniques and the Factor-Based Techniques. Within each family, we will review the various methods commonly employed, learn how to develop and test calibrations, and learn how to use the calibrations to estimate, or predict, the properties of unknown samples. We will consider the advantages and limitations of each method as well as some of the tricks and pitfalls associated with their use. While our emphasis will be on quantitative analysis, we will also touch on how these techniques are used for qualitative analysis, classification, and discriminant analysis. [Pg.2]

Since that time thousands of QSARs, covering a wide and diverse range of end points, have been published [9]; most of these have used MLR, but numerous other statistical techniques have also been used, such as partial least squares, principal component analysis, artificial neural networks, decision trees, and discriminant analysis [14]. [Pg.472]

One of the best-known techniques for QSAR analysis of classification data is discriminant analysis [71,72]. If a single descriptor is adequate to discrimi-... [Pg.481]

The COMPACT (computer-optimized molecular parametric analysis of chemical toxicity) procedure, developed by Lewis and co-workers [92], uses a form of discriminant analysis based on two descriptors, namely, molecular planarity and electronic activation energy (the difference between the energies of the highest occupied and lowest unoccupied molecular orbitals), which predict the potential of a compound to act as a substrate for one of the cytochromes P450. Lewis et al. [93] found 64% correct predictions for 100 compounds tested by the NTP for mutagenicity. [Pg.484]

Worth AP, Cronin MTD. The use of discriminant analysis, logistic regression and classification tree analysis in the development of classification models for human health effects. J Mol Struct (Theochem) 2003;622:97-111. [Pg.492]

This approach did not seem to be as satisfactory for those sulfamates having heteroatom substituents (heterosulfamates). Spillane suggested that the various electronic effects of the heteroatoms probably introduce an additional variable that is apparently absent, or constant, for the carbosulfamates. Because molecular connectivity correlates structure with molecular volume and electronic effects, Spillane added molecular connectivity (computed for the entire molecule, RNHSO3-) to the four variables x, y, z, and V, and applied the statistical technique of linear discriminant analysis to 33 heterosulfamates (10 sweet, 23 not sweet). A correlation of >80% was obtained for the x, z, χ subset of the 33... [Pg.302]

Solberg and co-workers have applied discriminant analysis of clinical laboratory tests, combined with careful clinical and anatomic diagnoses of liver disease, in order to determine which combinations of the many dozen liver diagnostic tests available are the best ( ). These authors found that the measurement of GPT, GMT, GOT, ALP and ceruloplasmin were the most useful enzymatic tests, when combined with other non-enzymatic tests such as the measurement of bilirubin, cholesterol, hepatitis-B associated Australian antigen, etc. Another group of highly useful enzymes, not discussed in this review, are the clotting factors and the enzyme cholinesterase, which are synthesized by the liver cells. [Pg.208]

Diagnosis of liver diseases by laboratory results and discriminant analysis. Identification of best combinations of laboratory tests. Scand. J. Clin. Lab. Invest. [Pg.222]

A first distinction which is often made is that between methods focusing on discrimination and those that are directed towards modelling classes. Most methods explicitly or implicitly try to find a boundary between classes. Some methods such as linear discriminant analysis (LDA, Sections 33.2.2 and 33.2.3) are designed to find explicit boundaries between classes while the k-nearest neighbours (k-NN, Section 33.2.4) method does this implicitly. Methods such as SIMCA (Section 33.2.7) put the emphasis more on similarity within a class than on discrimination between classes. Such methods are sometimes called disjoint class modelling methods. While the discrimination oriented methods build models based on all the classes concerned in the discrimination, the disjoint class modelling methods model each class separately. [Pg.208]

Fig. 33.1. Canonical variate plot for three classes with different thyroid status. The boundaries are obtained by linear discriminant analysis [2].
This classification problem can then be solved better by developing more suitable boundaries. For instance, using so-called quadratic discriminant analysis (QDA) (Section 33.2.3) or density methods (Section 33.2.5) leads to the boundaries of Fig. 33.2 and Fig. 33.3, respectively [3,4]. Other procedures that develop irregular boundaries are the nearest neighbour methods (Section 33.2.4) and neural nets (Section 33.2.9). [Pg.209]
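A minimal sketch of this difference, assuming scikit-learn and synthetic data with clearly unequal class dispersions (all numbers are illustrative):

```python
# Sketch: linear vs. quadratic discriminant analysis on synthetic two-class data
# whose classes have different covariance structure.
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)

rng = np.random.default_rng(0)
X0 = rng.normal(loc=[0.0, 0.0], scale=[0.5, 0.5], size=(100, 2))  # compact class
X1 = rng.normal(loc=[2.0, 2.0], scale=[1.5, 0.3], size=(100, 2))  # elongated class
X = np.vstack([X0, X1])
y = np.repeat([0, 1], 100)

for model in (LinearDiscriminantAnalysis(), QuadraticDiscriminantAnalysis()):
    model.fit(X, y)
    print(type(model).__name__, "resubstitution accuracy:", round(model.score(X, y), 3))
# QDA estimates a separate covariance matrix per class and therefore draws a
# curved boundary, which usually helps when the class dispersions differ.
```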

We also make a distinction between parametric and non-parametric techniques. In the parametric techniques, such as linear discriminant analysis, UNEQ and SIMCA, statistical parameters of the distribution of the objects are used in the derivation of the decision function (almost always a multivariate normal distribution is assumed). [Pg.212]

In the method of linear discriminant analysis, one therefore seeks a linear function of the variables, D, which maximizes the ratio between both variances. Geometrically, this means that we look for a line through the cloud of points, such that the projections of the points of the two groups onto this line are separated as much as possible. The approach is comparable to principal component analysis, where one seeks the line that best explains the variation in the data (see Chapter 17). The principal component line and the discriminant function often more or less coincide (as is the case in Fig. 33.8a), but this is not necessarily so, as shown in Fig. 33.8b. [Pg.216]
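In the two-class case this criterion is commonly written as follows (a standard formulation, not reproduced from the cited text): the weight vector w defining D is chosen to maximize the ratio of the between-class to the within-class variance of the projected scores,

```latex
D(\mathbf{x}) = \mathbf{w}^{\mathsf T}\mathbf{x}, \qquad
J(\mathbf{w}) = \frac{(\bar{D}_1 - \bar{D}_2)^2}{s_1^2 + s_2^2}
              = \frac{\mathbf{w}^{\mathsf T}\mathbf{S}_B\,\mathbf{w}}
                     {\mathbf{w}^{\mathsf T}\mathbf{S}_W\,\mathbf{w}},
\qquad
\hat{\mathbf{w}} \propto \mathbf{S}_W^{-1}\,(\bar{\mathbf{x}}_1 - \bar{\mathbf{x}}_2),
```

where S_B and S_W are the between-class and pooled within-class scatter matrices and x̄1, x̄2 are the class mean vectors.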

So far, we have described only situations with two classes. The method can also be applied to K classes; it is then sometimes called descriptive linear discriminant analysis. In this case the weight vectors can be shown to be the eigenvectors of the matrix ... [Pg.220]
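The matrix itself is elided in this excerpt; in the standard formulation of descriptive (multiclass) linear discriminant analysis, the weight vectors are the eigenvectors of the product of the inverted pooled within-class scatter matrix and the between-class scatter matrix,

```latex
\mathbf{S}_W^{-1}\,\mathbf{S}_B\,\mathbf{w} = \lambda\,\mathbf{w},
\qquad
\mathbf{S}_B = \sum_{k=1}^{K} n_k\,(\bar{\mathbf{x}}_k - \bar{\mathbf{x}})
                               (\bar{\mathbf{x}}_k - \bar{\mathbf{x}})^{\mathsf T},
\qquad
\mathbf{S}_W = \sum_{k=1}^{K}\sum_{i \in k}
               (\mathbf{x}_i - \bar{\mathbf{x}}_k)(\mathbf{x}_i - \bar{\mathbf{x}}_k)^{\mathsf T},
```

of which at most K-1 have non-zero eigenvalues; these eigenvectors define the canonical variates plotted, for example, in Fig. 33.1.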

The Mahalanobis distance representation will help us to have a more general look at discriminant analysis. The multivariate normal distribution for w variables and class K can be described by... [Pg.221]

When all are considered equal, this means that they can be replaced by S, the pooled variance-covariance matrix, which is the case for linear discriminant analysis. The discrimination boundaries are then linear and are given by ... [Pg.221]

Equation (33.10) is applied in what is called quadratic discriminant analysis (QDA). The equations can be shown to describe a quadratic boundary separating the regions where the discriminant score is minimal for the classes considered. [Pg.222]
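The equations referred to above are not reproduced in these excerpts; a standard way of writing the class-wise discriminant score on which both variants are based is the squared Mahalanobis distance of an object x to the centroid of class k,

```latex
D_k^2(\mathbf{x}) = (\mathbf{x} - \bar{\mathbf{x}}_k)^{\mathsf T}\,
                    \mathbf{S}_k^{-1}\,(\mathbf{x} - \bar{\mathbf{x}}_k),
```

an object being assigned to the class with the smallest score (in the quadratic case with the additional term ln|S_k|). With a separate covariance matrix S_k per class the resulting boundaries are quadratic (QDA); replacing every S_k by the pooled matrix S makes the quadratic terms identical for all classes, so that they cancel at the boundary and only linear boundaries remain (LDA).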

The combination of PCA and LDA is often applied, in particular for ill-posed data (data where the number of variables exceeds the number of objects), e.g. Ref. [46]. One first extracts a certain number of principal components, deleting the higher-order ones and thereby reducing the noise to some degree, and then carries out the LDA. One should, however, be careful not to eliminate too many PCs, since in this way information important for the discrimination might be lost. A method in which both steps are merged into one, and which sometimes yields better results than the two-step procedure, is reflected discriminant analysis. The Fourier transform is also sometimes used [14], as is the wavelet transform (see Chapter 40) [13,16]. In that case, the information is included in the first few Fourier coefficients or in a restricted number of wavelet coefficients. [Pg.236]
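A minimal sketch of the two-step PCA-then-LDA procedure for ill-posed data, assuming scikit-learn (the data set and the choice of 10 retained components are purely illustrative):

```python
# Sketch: PCA followed by LDA for an ill-posed data set (more variables than objects).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 200))      # 40 objects described by 200 variables
y = np.repeat([0, 1], 20)           # two classes
X[y == 1, :5] += 1.0                # put some class information into 5 variables

# Retain a limited number of PCs, then carry out the LDA in the reduced space.
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
print("cross-validated classification rate:",
      round(cross_val_score(model, X, y, cv=5).mean(), 3))
```

Because the PCA step is refitted inside each cross-validation fold, the estimate is not biased by the dimension reduction; retaining too few components would, as noted above, discard information important for the discrimination.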

Most of the supervised pattern recognition procedures permit stepwise selection, i.e. the selection first of the most important feature, then of the second most important, and so on. One way to do this is by prediction using, e.g., cross-validation (see next section): we first select the variable that best classifies objects of known classification that are not part of the training set, then the variable that most improves the classification already obtained with the first selected variable, etc. The result for the linear discriminant analysis of the EU/HYPER classification of Section 33.2.1 is that with all 5 variables or with 4 variables a selectivity of 91.4% is obtained, and with 3 or 2 variables 88.6% [2]. Selectivity is used here as a measure of classification success; it is applied in the sense of Chapter... [Pg.236]
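A sketch of such forward stepwise selection, scored by cross-validation, can be written with scikit-learn's SequentialFeatureSelector and an LDA classifier (the data set and the number of selected variables are illustrative):

```python
# Sketch: forward stepwise variable selection for LDA, guided by cross-validation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 5))        # 60 objects, 5 candidate variables
y = np.repeat([0, 1], 30)
X[y == 1, 0] += 1.5                 # variable 0 carries most of the class information
X[y == 1, 1] += 0.7                 # variable 1 adds a little more

lda = LinearDiscriminantAnalysis()
selector = SequentialFeatureSelector(lda, n_features_to_select=2,
                                     direction="forward", cv=5).fit(X, y)
print("selected variables:", np.flatnonzero(selector.get_support()))

# Classification rate with the selected subset (note: selecting and scoring on the
# same objects is somewhat optimistic; a separate test set would be more rigorous).
X_sel = selector.transform(X)
print("CV classification rate:", round(cross_val_score(lda, X_sel, y, cv=5).mean(), 3))
```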


See also:
Application of linear discriminant analysis
Canonical Discriminate Analysis
Canonical variates and linear discriminant analysis
Chemometrics linear discriminant analysis
Class separation, discriminant analysis
Control limit using discriminant analysis
Data analysis discriminant uncertainty
Descriptive linear discriminant analysis
Diagonal linear discriminant analysis
Discriminant analysis
Discriminant analysis definition
Discriminant analysis density-based methods
Discriminant analysis distribution
Discriminant analysis error rate
Discriminant analysis factor
Discriminant analysis introduction
Discriminant analysis multivariate models
Discriminant analysis nonlinearity model
Discriminant analysis score
Discriminant analysis, pharmaceutical
Discriminant analysis, support vector
Discriminant function analysis
Discriminant function analysis, pattern recognition technique
Discriminant-regression analysis
Discriminate function analysis
Discriminate function analysis advantages
Discriminate function analysis classification functions
Discriminate function analysis software
Fault diagnosis using discriminant analysis
Fisher discriminant analysis
Fisher linear discriminant analysis
Fisher’s discriminant analysis
Linear discriminant analysis
Linear discriminant analysis canonical variate
Linear discriminant analysis covariance
Linear discriminant analysis covariance matrix
Linear discriminant analysis multiple classes
Linear discriminant analysis recognition techniques
Linear discriminant analysis separation, classes
Linear discriminant analysis structure
Linear discriminant analysis, classification
Linear discriminant function analysis
Linear discriminate analysis
Linear discrimination analysis
Multiple discriminant analysis
Multivariate statistical models Discriminant analysis
Multivariate statistical techniques discriminant analysis
Multivariate variance and discriminant analysis
NIRS discriminant analysis
PLS-discriminant analysis
Partial discriminant analysis
Partial least squares discriminant analysis
Partial least squares discriminant analysis, exploratory
Partial least squares discriminant analysis (PLS-DA)
Partial least squares discriminate analysis
Partial least squares discriminate analysis (PLS-DA)
Partial least squares-discriminant analysis classification
Partial least squares-discriminant analysis components
Partial least squares-discriminant analysis vectors, regression
Principal Component Linear Discriminant Analysis
Principal-component discriminant analysis
Pulse Height Analysis Discrimination of Photons
QSAR (quantitative structure-activity) discriminant analysis
Quadratic discriminant analysis
Quadratic discriminant analysis and related methods
Qualitative discriminant analysis
Qualitative discriminant analysis applications
Reflected discriminant analysis
Robust linear discriminant analysis
Statistical analysis model discrimination
Stepwise discriminant analysis
Stepwise discrimination analysis
Supervised discriminant analysis
Supervised learning linear discriminant analysis
Supervised pattern recognition discriminant analysis
Training sets discriminant analysis
