
Quadratic discriminant

This classification problem can then be solved better by developing more suitable boundaries. For instance, using so-called quadratic discriminant analysis (QDA) (Section 33.2.3) or density methods (Section 33.2.5) leads to the boundaries of Fig. 33.2 and Fig. 33.3, respectively [3,4]. Other procedures that develop irregular boundaries are the nearest neighbour methods (Section 33.2.4) and neural nets (Section 33.2.9). [Pg.209]

Equation (33.10) is applied in what is called quadratic discriminant analysis (QDA). The equations can be shown to describe a quadratic boundary separating the regions where the score of equation (33.10) is minimal for each of the classes considered. [Pg.222]
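Equation (33.10) itself is not reproduced in this excerpt. As a hedged sketch in our own notation, the quadratic discriminant score for group j under a multivariate normal model is commonly written

\[
d_j(x) = -\tfrac{1}{2}\ln\lvert\Sigma_j\rvert - \tfrac{1}{2}(x-\mu_j)^{\top}\Sigma_j^{-1}(x-\mu_j) + \ln p_j ,
\]

where x is assigned to the group with the largest score; the locus where two such scores are equal is quadratic in x, which gives the quadratic boundary referred to above.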

SIMCA has inspired several related methods, such as DASCO [33] and CLASSY [34,35]. The latter has elements of the potential methods and SIMCA, while the former starts with the extraction of principal components, as in SIMCA, but then follows a quadratic discriminant rule. [Pg.232]

W. Wu, Y. Mallet, B. Walczak, W. Penninckx, D.L. Massart, S. Heuerding and F. Erni, Comparison of regularized discriminant analysis, linear discriminant analysis and quadratic discriminant analysis, applied to NIR data. Anal. Chim. Acta, 329 (1996) 257-265. [Pg.240]

Maximizing the posterior probabilities in the case of multivariate normal densities results in quadratic or linear discriminant rules. However, the rules are linear if we make the additional assumption that the covariance matrices of all groups are equal, i.e., Σ1 = Σ2 = ... = Σk = Σ. In this case, the classification rule is based on linear discriminant scores dj for groups j...
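To make the linearity explicit (in our notation, since the excerpt's formula is truncated): with a common covariance matrix Σ, the terms -½ xᵀΣ⁻¹x and -½ ln|Σ| in the quadratic score are shared by all groups and cancel from any comparison, leaving

\[
d_j(x) = \mu_j^{\top}\Sigma^{-1}x - \tfrac{1}{2}\,\mu_j^{\top}\Sigma^{-1}\mu_j + \ln p_j ,
\]

which is linear in x.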

The right plot in Figure 5.3 shows a linear discrimination of three groups. Here all three groups have the same prior probability, but their covariance matrices are not equal (different shape and orientation). The resulting rule is no longer optimal in the sense defined above. An optimal rule, however, could be obtained by quadratic discriminant analysis, which does not require equality of the group covariances. [Pg.213]

S.J. Dixon and R.G. Brereton, Comparison of performance of five common classifiers represented as boundary methods: Euclidean distance to centroids, linear discriminant analysis, quadratic discriminant analysis, learning vector quantization and support vector machines, as dependent on data structure, Chemom. Intell. Lab. Syst., 95, 1-17 (2009). [Pg.437]

Quadratic discriminant analysis (QDA) is a probabilistic parametric classification technique which represents an evolution of LDA for nonlinear class separations. QDA, like LDA, is based on the hypothesis that the probability density distributions are multivariate normal, but in this case the dispersion is not the same for all of the categories. It follows that the categories differ not only in the position of their centroids but also in their variance-covariance matrices (different location and dispersion), as represented in Fig. 2.16A. Consequently, the ellipses of different categories differ not only in their position in the plane but also in eccentricity and axis orientation (Geisser, 1964). By connecting the intersection points of each pair of corresponding ellipses (at the same Mahalanobis distance from the respective centroids), a parabolic delimiter is identified (see Fig. 2.16B). The name quadratic discriminant analysis derives from this feature. [Pg.88]
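As a small numerical illustration of this behaviour (not from the source; a minimal sketch assuming scikit-learn is available), two normal classes with clearly different covariance matrices are fitted with both a linear and a quadratic rule:

import numpy as np
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis)

rng = np.random.default_rng(0)
# Two classes whose covariance matrices differ in shape and orientation,
# so the optimal boundary between them is curved rather than linear.
X1 = rng.multivariate_normal([0, 0], [[1.0, 0.0], [0.0, 1.0]], 200)
X2 = rng.multivariate_normal([2, 2], [[3.0, 1.2], [1.2, 0.5]], 200)
X = np.vstack([X1, X2])
y = np.array([0] * 200 + [1] * 200)

lda = LinearDiscriminantAnalysis().fit(X, y)
qda = QuadraticDiscriminantAnalysis().fit(X, y)
print("LDA training accuracy:", lda.score(X, y))
print("QDA training accuracy:", qda.score(X, y))

On such data QDA typically scores at least as well as LDA, since it estimates a separate covariance matrix per class.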

In the case of the quadratic equation, the convergence condition for the "thermodynamic branch" series is simply a positive discriminant (Passare and Tsikh, 2004). For kinetic polynomial (48) this discriminant is always positive for feasible values of the parameters (see Equation (49)). This explains the convergence pattern for this series, in which the addition of new terms extended the convergence domain. [Pg.80]

The value of the discriminant for the equation x² − 4x + 3 = 0 is positive, and we see that there are clearly two different roots, as indicated in plot (a) of Figure 2.23, which shows the curve cutting the x-axis at x = 1 and x = 3. The curve of the function y = x² − 4x + 4, shown in plot (b), touches the x-axis at x = 2. In this case the discriminant is zero, and we have two equal roots, given by x = 2 ± √0 = 2. Note that although the curve only touches the x-axis in one place, the equation x² − 4x + 4 = 0 still has two roots; they just happen to be identical. Finally, in the case of curve (c), there are no values of x corresponding to y = 0, indicating that there are no real roots of the quadratic equation x² − 4x + 6 = 0, as the discriminant is equal to −8. [Pg.66]
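These three cases can be checked directly (a minimal sketch, not part of the source text):

import numpy as np

# Coefficients (a, b, c) of the three quadratics discussed above.
for a, b, c in [(1, -4, 3), (1, -4, 4), (1, -4, 6)]:
    disc = b**2 - 4*a*c  # the discriminant b^2 - 4ac
    print(f"x^2 {b:+d}x {c:+d} = 0: discriminant = {disc}, "
          f"roots = {np.roots([a, b, c])}")

# discriminant  4 -> two real roots, x = 1 and x = 3          (plot (a))
# discriminant  0 -> a repeated root, x = 2                   (plot (b))
# discriminant -8 -> a complex-conjugate pair, no real roots  (plot (c))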

Quasi-Newton methods may be used instead of our full Newton iteration. We have used the fast (quadratic) convergence rate of our Newton algorithm as a numerical check to discriminate between periodic and very slowly changing quasi-periodic trajectories; the accurately computed elements of the Jacobian in a Newton iteration can be used in stability computations for the located periodic trajectories. There are deficiencies in the use of a full Newton algorithm, such as its sometimes small radius of convergence (Schwartz, 1983). Several other possibilities for continuation methods also exist (Doedel, 1986; Seydel and Hlavacek, 1986). The pseudo-arc-length continuation was sufficient for our calculations. [Pg.246]
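The convergence-rate check mentioned above can be illustrated on a scalar problem (an illustrative sketch only, not the authors' code): near a simple root, Newton errors square at each step, whereas slow drift would show up as much slower error decay.

import numpy as np

def newton_errors(f, fprime, x0, root, steps=5):
    """Errors of a Newton iteration; near a simple root they shrink
    quadratically (e_{k+1} ~ C * e_k**2) until machine precision."""
    x, errs = x0, []
    for _ in range(steps):
        x = x - f(x) / fprime(x)
        errs.append(abs(x - root))
    return errs

# Root of x^2 - 2 at sqrt(2): each error is roughly the square of the last.
print(newton_errors(lambda x: x * x - 2, lambda x: 2 * x, 1.5, np.sqrt(2)))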

When the rs-catastrophe surface is viewed from the side along lines of constant a, the outline is given by the equality of the two roots of the quadratic (19), i.e. the discriminant must be zero. Setting D in (20) equal to zero gives ... [Pg.291]

Applying this criterion to the parabola y = x², we look for the double zero of the quadratic x² − mx − b. This corresponds to a vanishing value of the discriminant, and the result x = m/2 is independent of b. Rewriting this as...
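Spelling out the arithmetic (our reconstruction): the line y = mx + b meets the parabola y = x² where x² − mx − b = 0, so tangency requires a vanishing discriminant,

\[
\Delta = m^2 + 4b = 0 \;\Longrightarrow\; b = -\frac{m^2}{4}, \qquad x = \frac{m}{2},
\]

and the point of contact x = m/2 indeed does not involve b.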

Discriminant analysis (DA) performs sample classification with an a priori hypothesis. This hypothesis is based on a previously determined TCA or other CA protocols. DA is also called "discriminant function analysis", and its natural extension is called MDA (multiple discriminant analysis), which is sometimes named "discriminant factor analysis" or CDA (canonical discriminant analysis). Among these types of analyses, linear discriminant analysis (LDA) has been widely used to enforce differences among sample classes. Other classification methods are QDA (quadratic discriminant analysis), an extension of LDA (Frank and Friedman, 1989), and RDA (regularized discriminant analysis), which works better with various class distributions and in the case of high-dimensional data, being a compromise between LDA and QDA (Friedman, 1989). [Pg.94]
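For orientation, a simplified form of Friedman's regularization (with his sample-size weighting omitted) interpolates each group covariance between the QDA and pooled LDA estimates,

\[
\hat{\Sigma}_j(\lambda) = (1-\lambda)\,\hat{\Sigma}_j + \lambda\,\hat{\Sigma}_{\text{pooled}}, \qquad 0 \le \lambda \le 1,
\]

so that λ = 0 recovers QDA and λ = 1 recovers LDA; the full method adds a second parameter that shrinks further toward a multiple of the identity matrix. This is a sketch of the idea, not the paper's exact formulation.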

The raw and the standardized coordinates are calculated both manually and using software [STATISTICA, 1995]. As in most cases where a reader of a publication tries to reproduce the results, we surprisingly obtain a different result. In our case this is because most software packages (SPSS, STATISTICA, ...) calculate a constant along with the raw coefficients. At the same time this demonstrates that there are several ways of finding discriminant functions. So, in some instances it may be convenient to use so-called elementary discriminant functions [AHRENS and LAUTER, 1981] or to try quadratic discrimination (see [FAHRMEIR and HAMERLE, 1984]). [Pg.192]

Discriminant classifiers. The two most important discriminant classifiers for material analysis using spectroscopic imaging systems are the Fisher linear discriminant classifier (FLDC) and the quadratic discriminant classifier (QDC). Other classifiers, such as the classical linear discriminant classifier (LDC), have frequently exhibited inferior performance. [Pg.166]

In practice, μj, Σj, and pj have to be estimated. Classical quadratic discriminant analysis (CQDA) uses the group's mean and empirical covariance matrix to estimate μj and Σj. The membership probabilities are usually estimated by the relative frequencies of the observations in each group, hence pj = nj/n, where nj is the number of observations in group j. [Pg.207]
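A minimal sketch of these classical estimators (our own helper function, not from the source):

import numpy as np

def cqda_estimates(X, y):
    """Per-group mean, empirical covariance, and prior n_j / n,
    i.e. the classical QDA estimates described above."""
    n = len(y)
    params = {}
    for j in np.unique(y):
        Xj = X[y == j]
        params[j] = {
            "mean": Xj.mean(axis=0),          # estimate of mu_j
            "cov": np.cov(Xj, rowvar=False),  # estimate of Sigma_j
            "prior": len(Xj) / n,             # p_j = n_j / n
        }
    return params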

Friedman and Frank [75] have shown that SIMCA is similar in form to quadratic discriminant analysis. The maximum-likelihood estimate of the inverse of the covariance matrix, which conveys information about the size, shape, and orientation of the data cloud for each class, is replaced by a principal component estimate. Because of the success of SIMCA, statisticians have recently investigated methods other than maximum likelihood to estimate the inverse of the covariance matrix, e.g., regularized discriminant analysis [76]. For this reason, SIMCA is often viewed as the first successful attempt by scientists to develop robust procedures for carrying out statistical discriminant analysis on data sets where maximum-likelihood estimates fail because there are more features than samples in the data set. [Pg.354]
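In the same spirit, a covariance estimate built from a few principal components can stay invertible when there are more variables than samples (a sketch under our own simplifying assumptions, not Frank and Friedman's exact formulation):

import numpy as np

def pc_covariance(X, k):
    """Rank-k principal-component approximation of a class covariance,
    with the discarded variance pooled isotropically so the estimate
    remains invertible even when variables outnumber samples."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = s**2 / (len(X) - 1)  # eigenvalues of the empirical covariance
    resid = var[k:].mean() if len(var) > k else 0.0
    cov = (Vt[:k].T * var[:k]) @ Vt[:k] + resid * np.eye(X.shape[1])
    return cov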

Several other functions such as the quadratic and regularised discriminant functions have been proposed and are suitable under certain circumstances. [Pg.242]

The theorem also applies to (18.7) once one shows ker(D) is connected: it follows that a nondegenerate quadratic form of even rank over a finite field is determined by its discriminant or Arf invariant. [Pg.157]

