Quadratic discriminant analysis

This classification problem can then be solved better by developing more suitable boundaries. For instance, using so-called quadratic discriminant analysis (QDA) (Section 33.2.3) or density methods (Section 33.2.5) leads to the boundaries of Fig. 33.2 and Fig. 33.3, respectively [3,4]. Other procedures that develop irregular boundaries are the nearest neighbour methods (Section 33.2.4) and neural nets (Section 33.2.9). [Pg.209]

Equation (33.10) is applied in what is called quadratic discriminant analysis (QDA). The equations can be shown to describe a quadratic boundary separating the regions where this quantity is minimal for each of the classes considered. [Pg.222]
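Eq. (33.10) itself is not reproduced in this excerpt. In its commonly cited form (a reconstruction under standard assumptions, not necessarily the source's notation), the QDA score for class j is

```latex
d_j(\mathbf{x}) = (\mathbf{x} - \bar{\mathbf{x}}_j)^{\top} \mathbf{S}_j^{-1} (\mathbf{x} - \bar{\mathbf{x}}_j) + \ln\lvert\mathbf{S}_j\rvert - 2\ln p_j
```

An object x is assigned to the class with the smallest d_j(x); because the class covariance matrices S_j differ, the locus d_j(x) = d_k(x) between two classes is quadratic in x, which produces the curved boundaries described above.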

W. Wu, Y. Mallet, B. Walczak, W. Penninckx, D.L. Massart, S. Heuerding and F. Erni, Comparison of regularized discriminant analysis, linear discriminant analysis and quadratic discriminant analysis, applied to NIR data. Anal. Chim. Acta, 329 (1996) 257-265. [Pg.240]

The right plot in Figure 5.3 shows a linear discrimination of three groups. Here all three groups have the same prior probability, but their covariance matrices are not equal (different shape and orientation). The resulting rule is no longer optimal in the sense defined above. An optimal rule, however, could be obtained by quadratic discriminant analysis, which does not require equality of the group covariances. [Pg.213]

S.J. Dixon and R.G. Brereton, Comparison of performance of five common classifiers represented as boundary methods: Euclidean distance to centroids, linear discriminant analysis, quadratic discriminant analysis, learning vector quantization and support vector machines, as dependent on data structure, Chemom. Intell. Lab. Syst., 95, 1-17 (2009). [Pg.437]

Quadratic discriminant analysis (QDA) is a probabilistic parametric classification technique which represents an evolution of LDA for nonlinear class separations. QDA, like LDA, is based on the hypothesis that the probability density distributions are multivariate normal but, in this case, the dispersion is not the same for all of the categories. It follows that the categories differ not only in the position of their centroids but also in their variance-covariance matrices (different location and dispersion), as represented in Fig. 2.16A. Consequently, the ellipses of different categories differ not only in their position in the plane but also in eccentricity and axis orientation (Geisser, 1964). By connecting the intersection points of each pair of corresponding ellipses (at the same Mahalanobis distance from the respective centroids), a parabolic delimiter is identified (see Fig. 2.16B). The name quadratic discriminant analysis derives from this feature. [Pg.88]

Discriminant analysis (DA) performs classification of samples with an a priori hypothesis. This hypothesis is based on a previously determined TCA or other CA protocols. DA is also called "discriminant function analysis", and its natural extension is called MDA (multiple discriminant analysis), which is sometimes named "discriminant factor analysis" or CDA (canonical discriminant analysis). Among these types of analyses, linear discriminant analysis (LDA) has been widely used to emphasize differences among classes of samples. Another classification method is QDA (quadratic discriminant analysis) (Frank and Friedman, 1989), an extension of LDA, and RDA (regularized discriminant analysis), which works better with various class distributions and with high-dimensional data, being a compromise between LDA and QDA (Friedman, 1989). [Pg.94]

In practice, μj, Σj, and pj have to be estimated. Classical quadratic discriminant analysis (CQDA) uses the group's mean and empirical covariance matrix to estimate μj and Σj. The membership probabilities are usually estimated by the relative frequencies of the observations in each group, hence p̂j = nj/n, where nj is the number of observations in group j. [Pg.207]
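A minimal numpy sketch of these classical estimates and the resulting classification rule (function and variable names are illustrative, not from the source):

```python
import numpy as np

def cqda_fit(X, y):
    """Classical QDA estimates: per-group mean, empirical covariance matrix,
    and membership probability as the relative frequency n_j / n."""
    X, y = np.asarray(X), np.asarray(y)
    n = len(y)
    params = {}
    for j in np.unique(y):
        Xj = X[y == j]
        params[j] = {
            "mean": Xj.mean(axis=0),           # estimate of mu_j
            "cov": np.cov(Xj, rowvar=False),   # empirical S_j
            "prior": len(Xj) / n,              # p_j = n_j / n
        }
    return params

def cqda_predict(X, params):
    """Assign each row of X to the group with the smallest quadratic score."""
    X = np.asarray(X)
    labels, scores = list(params), []
    for j in labels:
        p = params[j]
        diff = X - p["mean"]
        inv = np.linalg.inv(p["cov"])
        maha = np.einsum("ij,jk,ik->i", diff, inv, diff)  # squared Mahalanobis distance
        _, logdet = np.linalg.slogdet(p["cov"])
        scores.append(maha + logdet - 2.0 * np.log(p["prior"]))
    return np.asarray(labels)[np.argmin(scores, axis=0)]
```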

Friedman and Frank [75] have shown that SIMCA is similar in form to quadratic discriminant analysis. The maximum-likelihood estimate of the inverse of the covariance matrix, which conveys information about the size, shape, and orientation of the data cloud for each class, is replaced by a principal component estimate. Because of the success of SIMCA, statisticians have recently investigated methods other than maximum likelihood to estimate the inverse of the covariance matrix, e.g., regularized discriminant analysis [76]. For this reason, SIMCA is often viewed as the first successful attempt by scientists to develop robust procedures for carrying out statistical discriminant analysis on data sets where maximum-likelihood estimates fail because there are more features than samples in the data set. [Pg.354]
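As a rough illustration of the underlying idea only (an interpretive sketch, not the SIMCA algorithm itself), an inverse-covariance estimate can be built from a truncated eigendecomposition, i.e., from the leading principal components:

```python
import numpy as np

def pc_inverse_cov(X, n_components):
    """Inverse-covariance estimate built from the leading eigenpairs only,
    usable even when there are more features than samples."""
    S = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(S)             # eigenvalues in ascending order
    idx = np.argsort(eigval)[::-1][:n_components]  # keep the largest ones
    V, lam = eigvec[:, idx], eigval[idx]
    return V @ np.diag(1.0 / lam) @ V.T            # rank-limited (pseudo-)inverse
```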

The most popular classification methods are Linear Discriminant Analysis (LDA), Quadratic Discriminant Analysis (QDA), Regularized Discriminant Analysis (RDA), k-Nearest Neighbours (KNN), classification tree methods (such as CART), Soft Independent Modeling of Class Analogy (SIMCA), potential function classifiers (PFC), the Nearest Mean Classifier (NMC) and the Weighted Nearest Mean Classifier (WNMC). Moreover, several classification methods can be found among the artificial neural networks. [Pg.60]

When using a linear method, such as LDA, the underlying assumption is that the two classes are linearly separable. This, of course, is generally not true. If linear separability is not possible, then with enough samples, the more powerful quadratic discriminant analysis (QDA) works better, because it allows the hypersurface that separates the classes to be curved (quadratic). Unfortunately, the clinical reality of small-sized data sets denies us this choice. [Pg.105]
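For example, with scikit-learn the two classifiers can be compared directly (a sketch; the synthetic data set stands in for real clinical data):

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real (e.g., clinical) data set.
X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for model in (LinearDiscriminantAnalysis(), QuadraticDiscriminantAnalysis()):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, model.score(X_te, y_te))
```

With enough samples per class, QDA's curved boundary can capture structure a linear boundary cannot; with few samples, its extra covariance parameters make it overfit, which is the trade-off the excerpt describes.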

Figure 15-6 Quadratic discriminant analysis removes the constraint of a straight and/or flat plane and allows more molding of the decision boundary than is possible with linear discriminant analysis.
The most commonly used classification techniques are Linear Discriminant Analysis (LDA) and Quadratic Discriminant Analysis (QDA). They define a set of delimiters (according to the number of categories under study) in such a way that the multivariate space of the objects is divided into as many subspaces as the number of categories, and that each point of the space belongs to one and only one of them. [Pg.231]

Mahesh et al. (2011) used near-infrared hyperspectral images (wavelength range 960-1700 nm) of bulk samples to classify moisture levels (12, 14, 16, 18, and 20%) in wheat. Principal component analysis (PCA) was used to identify the region (1260-1360 nm) carrying the most information. Linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA) could classify the samples by moisture content and also identify specific moisture levels with good accuracy (61-100% in several cases). Spectral features at key wavelengths of 1060, 1090, 1340, and 1450 nm were ranked highest for classifying wheat classes with different moisture contents. [Pg.241]

In LDA, it is assumed that the class covariance matrices are equal, that is, Sj = S for all classes j. Different class covariances are allowed in quadratic discriminant analysis (QDA). The result is quadratic class boundaries based on unbiased estimates of the covariance matrices. The most powerful method is based on regularized discriminant analysis (RDA) [7]. This method seeks biased estimates of the covariance matrices, Sj, to reduce their variances. This is done by introducing two regularization parameters λ and γ according to ... [Pg.192]
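The expression is truncated in this excerpt. Friedman's regularization is commonly written as the two-stage shrinkage below (a frequently cited simplified form, which may differ in detail from the sample-size-weighted version in [7]):

```latex
\mathbf{S}_j(\lambda) = (1-\lambda)\,\mathbf{S}_j + \lambda\,\mathbf{S},
\qquad
\mathbf{S}_j(\lambda,\gamma) = (1-\gamma)\,\mathbf{S}_j(\lambda) + \gamma\,\frac{\operatorname{tr}[\mathbf{S}_j(\lambda)]}{p}\,\mathbf{I}
```

Here S is the pooled covariance matrix, p the number of variables, and I the identity matrix; λ blends each class covariance toward the pooled (LDA-like) estimate, while γ shrinks it toward a multiple of the identity.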

Along with CT and LDA we tested the classification methods quadratic discriminant analysis (QDA), KNN, ANN with one, two, and three HN, as well as SVM with linear, polynomial (degree = 2), and radial kernels. The descriptor values were autoscaled for all these methods. The number of nearest neighbors to be considered, k, was determined via LOOCV for the various descriptor sets as follows ... [Pg.293]
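A sketch of how such a leave-one-out search for k might look with scikit-learn (the actual descriptor sets and resulting k values are not reproduced here):

```python
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

def best_k_loocv(X, y, k_values=range(1, 11)):
    """Choose k by leave-one-out cross-validated accuracy on autoscaled data."""
    X = StandardScaler().fit_transform(X)  # autoscaling, as in the excerpt
    scores = {
        k: cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y,
                           cv=LeaveOneOut()).mean()
        for k in k_values
    }
    return max(scores, key=scores.get)
```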

If the distributions do not have similar shapes, then a modification of LDA known as quadratic discriminant analysis (QDA) may be used. This method assumes that the two groups have multivariate normal distributions but with different variances. [Pg.225]

The final detected polyps are obtained by applying a statistical classifier, based on the image features, to differentiate polyps from false positives. Investigators use parametric classifiers such as quadratic discriminant analysis (Yoshida and Nappi 2001), non-parametric classifiers such as artificial neural networks (Jerebko et al. 2003b; Kiss et al. 2002; Nappi et al. 2004b), a committee of neural networks (Jerebko et al. 2003a), and a support vector machine (Gokturk et al. 2001). In principle, any combination of features and a classifier that provides high classification performance should be sufficient for the differentiation task. [Pg.140]


