Big Chemical Encyclopedia



Quadratic discriminant method

This classification problem can then be solved better by developing more suitable boundaries. For instance, using so-called quadratic discriminant analysis (QDA) (Section 33.2.3) or density methods (Section 33.2.5) leads to the boundaries of Fig. 33.2 and Fig. 33.3, respectively [3,4]. Other procedures that develop irregular boundaries are the nearest neighbour methods (Section 33.2.4) and neural nets (Section 33.2.9). [Pg.209]

SIMCA has inspired several related methods, such as DASCO [33] and CLASSY [34,35]. The latter has elements of the potential methods and SIMCA, while the former starts with the extraction of principal components, as in SIMCA, but then follows a quadratic discriminant rule. [Pg.232]

S.J. Dixon and R.G. Brereton, Comparison of performance of five common classifiers represented as boundary methods: Euclidean distance to centroids, linear discriminant analysis, quadratic discriminant analysis, learning vector quantization and support vector machines, as dependent on data structure, Chemom. Intell. Lab. Syst., 95, 1-17 (2009). [Pg.437]

Discriminant analysis (DA) performs sample classification with an a priori hypothesis. This hypothesis is based on a previously determined TCA or other CA protocols. DA is also called "discriminant function analysis", and its natural extension is called MDA (multiple discriminant analysis), which is sometimes named "discriminant factor analysis" or CDA (canonical discriminant analysis). Among these types of analyses, linear discriminant analysis (LDA) has been largely used to enforce differences among sample classes. Another classification method is QDA (quadratic discriminant analysis) (Frank and Friedman, 1989), an extension of LDA. A compromise between LDA and QDA is RDA (regularized discriminant analysis), which works better with varied class distributions and in the case of high-dimensional data (Friedman, 1989). [Pg.94]

Friedman and Frank [75] have shown that SIMCA is similar in form to quadratic discriminant analysis. The maximum-likelihood estimate of the inverse of the covariance matrix, which conveys information about the size, shape, and orientation of the data cloud for each class, is replaced by a principal component estimate. Because of the success of SIMCA, statisticians have recently investigated methods other than maximum likelihood to estimate the inverse of the covariance matrix, e.g., regularized discriminant analysis [76]. For this reason, SIMCA is often viewed as the first successful attempt by scientists to develop robust procedures for carrying out statistical discriminant analysis on data sets where maximum-likelihood estimates fail because there are more features than samples in the data set. [Pg.354]
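The idea of replacing the maximum-likelihood inverse covariance with a principal-component estimate can be sketched as follows. This is an illustrative simplification, not the exact SIMCA algorithm: the covariance is eigendecomposed and only the leading components are inverted, which remains usable even when there are more features than samples.

```python
# Sketch (not the exact SIMCA method): a principal-component estimate of
# the inverse covariance, built from the leading eigenvectors only.
import numpy as np

def pc_inverse_covariance(X, n_components):
    """Rank-limited inverse of the covariance from the leading PCs."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_components]  # keep the largest ones
    V, lam = eigvecs[:, idx], eigvals[idx]
    return V @ np.diag(1.0 / lam) @ V.T

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 20))   # more features (20) than samples (10):
S_inv = pc_inverse_covariance(X, n_components=3)
print(S_inv.shape)              # (20, 20)
```

Here the full covariance is singular (rank at most 9), so its maximum-likelihood inverse does not exist, yet the rank-3 principal-component estimate is still well defined.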

The most popular classification methods are Linear Discriminant Analysis (LDA), Quadratic Discriminant Analysis (QDA), Regularized Discriminant Analysis (RDA), K-th Nearest Neighbours (KNN), classification tree methods (such as CART), Soft-Independent Modeling of Class Analogy (SIMCA), potential function classifiers (PFC), Nearest Mean Classifier (NMC) and Weighted Nearest Mean Classifier (WNMC). Moreover, several classification methods can be found among the artificial neural networks. [Pg.60]
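The simplest of the listed methods, the Nearest Mean Classifier, can be sketched in a few lines: a sample is assigned to the class whose centroid is closest in Euclidean distance. The toy data below are made-up values for illustration.

```python
# Minimal Nearest Mean Classifier (NMC) sketch: assign each sample to the
# class with the nearest centroid.  Toy data, plain Python.
import math

def centroid(samples):
    n = len(samples)
    return [sum(x[i] for x in samples) / n for i in range(len(samples[0]))]

def nmc_predict(x, centroids):
    # centroids: dict mapping class label -> centroid vector
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return min(centroids, key=lambda c: dist(x, centroids[c]))

class_a = [[1.0, 1.0], [1.2, 0.8], [0.8, 1.1]]
class_b = [[3.0, 3.2], [2.8, 3.0], [3.1, 2.9]]
cents = {"A": centroid(class_a), "B": centroid(class_b)}
print(nmc_predict([1.1, 1.0], cents))   # -> "A" (closest to A's centroid)
```

The Weighted Nearest Mean Classifier differs only in scaling each distance, e.g. by a per-class dispersion estimate.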

When using a linear method, such as LDA, the underlying assumption is that the two classes are linearly separable. This, of course, is generally not true. If linear separability is not possible, then with enough samples, the more powerful quadratic discriminant analysis (QDA) works better, because it allows the hypersurface that separates the classes to be curved (quadratic). Unfortunately, the clinical reality of small-sized data sets denies us this choice. [Pg.105]
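The curved (quadratic) separating hypersurface arises because each class keeps its own covariance matrix. A minimal sketch with NumPy, using made-up toy data: each class is summarized by its mean, covariance, and prior, and a sample goes to the class with the larger Gaussian log-density.

```python
# Minimal QDA sketch: class-specific means and covariances give a
# quadratic decision boundary (where the two scores are equal).
import numpy as np

def qda_score(x, mean, cov, prior):
    # Gaussian log-density plus log prior (additive constants dropped)
    diff = x - mean
    return (-0.5 * np.log(np.linalg.det(cov))
            - 0.5 * diff @ np.linalg.solve(cov, diff)
            + np.log(prior))

rng = np.random.default_rng(1)
A = rng.normal([0, 0], 0.5, size=(50, 2))   # compact class
B = rng.normal([2, 2], 1.5, size=(50, 2))   # diffuse class

params = {
    "A": (A.mean(axis=0), np.cov(A.T), 0.5),
    "B": (B.mean(axis=0), np.cov(B.T), 0.5),
}
x = np.array([0.2, -0.1])
label = max(params, key=lambda c: qda_score(x, *params[c]))
print(label)   # -> "A", the point lies near A's centre
```

Note the warning in the excerpt above: the class covariances must be estimated reliably, which is exactly what small clinical data sets prevent.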

In LDA, it is assumed that the class covariance matrices are equal, that is, Sj = S for all j = 1 to K. Different class covariances are allowed in quadratic discriminant analysis (QDA). The result is quadratic class boundaries based on unbiased estimates of the covariance matrices. The most powerful method is based on regularized discriminant analysis (RDA) [7]. This method seeks biased estimates of the covariance matrices, Sj, to reduce their variances. This is done by introducing two regularization parameters λ and γ according to... [Pg.192]
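The role of the two regularization parameters can be sketched as follows. This is a simplified form of Friedman's RDA shrinkage (the published version weights the λ-blend by sample sizes); the variable names are ours. λ blends the class covariance with the pooled covariance (λ = 0 gives QDA, λ = 1 gives LDA), and γ then shrinks the result toward a multiple of the identity.

```python
# Simplified sketch of the two-parameter RDA covariance shrinkage.
import numpy as np

def rda_covariance(S_class, S_pooled, lam, gamma):
    p = S_class.shape[0]
    S_lam = (1 - lam) * S_class + lam * S_pooled          # QDA <-> LDA blend
    return (1 - gamma) * S_lam + gamma * (np.trace(S_lam) / p) * np.eye(p)

S_j = np.array([[2.0, 0.5], [0.5, 1.0]])   # class covariance (toy values)
S = np.array([[1.5, 0.2], [0.2, 1.2]])     # pooled covariance (toy values)
print(rda_covariance(S_j, S, lam=0.0, gamma=0.0))   # recovers S_j (QDA limit)
print(rda_covariance(S_j, S, lam=1.0, gamma=0.0))   # recovers S   (LDA limit)
```

The bias introduced by λ and γ stabilizes the estimates when classes are small relative to the number of variables, which is why the excerpt calls RDA the most powerful of the three.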

Along with CT and LDA we tested the classification methods quadratic discriminant analysis (QDA), KNN, ANN with one, two, and three HN, as well as SVM with linear, polynomial (degree = 2), and radial kernels. The descriptor values were autoscaled for all these methods. The number of nearest neighbours to be considered, k, was determined via LOOCV for the various descriptor sets as follows ... [Pg.293]
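Selecting k by leave-one-out cross-validation can be sketched as follows (toy data, plain Python): each sample is classified from the remaining ones, and the k with the highest accuracy wins.

```python
# Sketch of choosing k for KNN via leave-one-out cross-validation (LOOCV).
import math
from collections import Counter

def knn_predict(x, train, k):
    # train: list of (vector, label); majority vote among the k nearest
    neighbours = sorted(train, key=lambda t: math.dist(x, t[0]))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

def loocv_accuracy(data, k):
    hits = sum(knn_predict(x, data[:i] + data[i + 1:], k) == y
               for i, (x, y) in enumerate(data))
    return hits / len(data)

data = [([0.0, 0.1], "a"), ([0.2, 0.0], "a"), ([0.1, 0.2], "a"),
        ([1.0, 1.1], "b"), ([1.2, 0.9], "b"), ([0.9, 1.0], "b")]
best_k = max([1, 3, 5], key=lambda k: loocv_accuracy(data, k))
print(best_k, loocv_accuracy(data, best_k))
```

Autoscaling the descriptors beforehand, as in the excerpt, matters for KNN because the Euclidean distance otherwise lets large-valued descriptors dominate.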

If the distributions do not have similar shapes, then a modification of LDA known as quadratic discriminant analysis (QDA) may be used. This method assumes that the two groups have multivariate normal distributions but with different variances. [Pg.225]
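A one-dimensional illustration shows why different variances produce a quadratic boundary: the difference of the two Gaussian log-densities is quadratic in x, so it can change sign twice. Below, two classes share a mean but differ in spread; the narrow class wins near the centre and the broad class wins in both tails (all numbers are made up for illustration).

```python
# 1-D illustration: unequal variances -> quadratic decision rule.
import math

def log_normal_pdf(x, mu, sigma):
    return (-math.log(sigma * math.sqrt(2 * math.pi))
            - (x - mu) ** 2 / (2 * sigma ** 2))

mu1, s1 = 0.0, 1.0    # narrow class
mu2, s2 = 0.0, 3.0    # broad class, same mean
for x in (-4.0, 0.0, 4.0):
    winner = 1 if log_normal_pdf(x, mu1, s1) > log_normal_pdf(x, mu2, s2) else 2
    print(x, winner)   # class 2 wins in both tails, class 1 in the middle
```

With equal variances the quadratic terms cancel and the rule collapses to the single linear threshold of LDA.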

The literature of multivariate classification shows that several types of methods have found utility in application to chemical problems. Excellent discussions of the major methods can be found in Strouf and in Tou and Gonzalez. The most frequently used methods include parametric approaches involving linear and quadratic discriminant analysis based on the Bayesian approach, nonparametric linear discriminant development methods, and those methods based on principal components analysis such as SIMCA (Soft Independent Modeling by Class Analogy). [Pg.183]

Quasi-Newton methods may be used instead of our full Newton iteration. We have used the fast (quadratic) convergence rate of our Newton algorithm as a numerical check to discriminate between periodic and very slowly changing quasi-periodic trajectories; the accurately computed elements of the Jacobian in a Newton iteration can be used in stability computations for the located periodic trajectories. There are deficiencies in the use of a full Newton algorithm, such as its sometimes small radius of convergence (Schwartz, 1983). Several other possibilities for continuation methods also exist (Doedel, 1986; Seydel and Hlavacek, 1986). The pseudo-arc length continuation was sufficient for our calculations. [Pg.246]
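The quadratic-convergence check used as a discriminator can be sketched on a scalar example: near a true solution, Newton's method roughly squares the error at each step, so the residual collapses within a handful of iterations, whereas a slowly drifting quasi-periodic point shows no such collapse.

```python
# Sketch of the quadratic-convergence signature of Newton's method.
def newton(f, df, x0, steps):
    x, residuals = x0, []
    for _ in range(steps):
        x = x - f(x) / df(x)
        residuals.append(abs(f(x)))
    return x, residuals

f = lambda x: x ** 2 - 2.0   # root at sqrt(2)
df = lambda x: 2.0 * x
root, res = newton(f, df, 1.5, 5)
print(root)                  # close to 1.41421356...
print(res)                   # residuals shrink step by step
```

Quasi-Newton variants trade this quadratic rate for superlinear convergence without recomputing the Jacobian, which is why the excerpt treats the full Newton rate as the sharper diagnostic.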

This mechanistic question is one of the examples of the success of density functional theory (DFT) methods for metal-organic chemistry. Earlier work on the reaction mechanism could not discriminate between the two alternatives. Analysis of the different orbitals based on Extended Hückel theory (EHT) calculations led to the conclusion that the (3+2) pathway is more likely, but the authors could not exclude the possibility of a (2+2) pathway [4], similar to the results of HF calculations in combination with Quadratic Configuration Interaction (QCISD(T)) single points [32]. [Pg.146]

Many functions have interesting, highly exploitable features (e.g., continuity and differentiability). Specifically, many can be well approximated by quadratic functions as their minimum is approached. Conversely, the Fibonacci method does not discriminate between functions and treats all of them in the same way: as the worst case. [Pg.53]
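How the near-quadratic behaviour is exploited can be sketched with successive parabolic interpolation: fit a parabola through three points and jump to its vertex. On an exactly quadratic function a single jump lands on the minimizer, whereas Fibonacci or golden-section search always shrinks the interval at its fixed worst-case rate.

```python
# Sketch: one step of successive parabolic interpolation.
def parabola_vertex(a, b, c, fa, fb, fc):
    # x-coordinate of the vertex of the parabola through (a,fa),(b,fb),(c,fc)
    num = (b - a) ** 2 * (fb - fc) - (b - c) ** 2 * (fb - fa)
    den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
    return b - 0.5 * num / den

f = lambda x: (x - 1.7) ** 2 + 0.3     # exactly quadratic: one jump suffices
x = parabola_vertex(0.0, 1.0, 3.0, f(0.0), f(1.0), f(3.0))
print(round(x, 6))                      # 1.7, the true minimizer
```

For merely near-quadratic functions the step is repeated with the best three points, converging much faster than interval halving once the quadratic approximation takes hold.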

