Big Chemical Encyclopedia


Fisher linear discriminant analysis

Discriminant classifiers. The two most important discriminant classifiers for material analysis using spectroscopic imaging systems are the Fisher linear discriminant classifier (FLDC) and the quadratic discriminant classifier (QDC). Other classifiers, such as the classical linear discriminant classifier (LDC), have frequently exhibited inferior performance. [Pg.166]
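The practical difference between a linear and a quadratic discriminant rule can be sketched with Gaussian log-density scores: the linear rule pools one covariance matrix across classes, while the quadratic rule estimates one per class. A minimal NumPy illustration on synthetic two-class data (the data, class shapes, and numbers are invented for illustration and are not from the cited work):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic classes with equal means but different covariance structure,
# a situation where only the quadratic rule can separate them.
X1 = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], 200)
X2 = rng.multivariate_normal([0.0, 0.0], [[4.0, 0.0], [0.0, 0.25]], 200)

def gaussian_score(x, mean, cov):
    """Log-density of a multivariate normal (up to an additive constant)."""
    d = x - mean
    return -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.solve(cov, d))

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
S1 = np.cov(X1, rowvar=False)
S2 = np.cov(X2, rowvar=False)
S_pooled = 0.5 * (S1 + S2)   # equal class sizes, equal priors assumed

def classify(x, covs):
    """Assign x to the class with the larger Gaussian log-density."""
    c1, c2 = covs
    return 1 if gaussian_score(x, m1, c1) >= gaussian_score(x, m2, c2) else 2

X = np.vstack([X1, X2])
y = np.array([1] * 200 + [2] * 200)

# Pooled covariance -> linear decision boundary; per-class -> quadratic.
acc_linear = np.mean([classify(x, (S_pooled, S_pooled)) == t for x, t in zip(X, y)])
acc_quad = np.mean([classify(x, (S1, S2)) == t for x, t in zip(X, y)])
print(f"pooled-covariance (linear) accuracy:    {acc_linear:.2f}")
print(f"per-class covariance (quadratic) accuracy: {acc_quad:.2f}")
```

Because the two classes here differ only in covariance, the linear rule performs near chance while the quadratic rule recovers most of the separation, mirroring the situations in which a QDC can outperform a purely linear classifier.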

Figure 15-5 Fisher's linear discriminant analysis determines the line, plane, or hyperplane that best separates two populations, based on their mean values and the variance. In the graph, a line bisecting the two curves in the lower left corner provides this best discriminator.
Discriminant plots were obtained for the adaptive wavelet coefficients which produced the results in Table 2. Although the classifier used in the AWA was BLDA, it was decided to supply the coefficients available upon termination of the AWA to Fisher's linear discriminant analysis, so we could visualize the spatial separation between the classes. The discriminant plots are produced using the testing data only. There is a good deal of separation for the seagrass data (Fig. 5), while for the paraxylene data (Fig. 6) there is some overlap between the objects of classes 1 and 3. Quite clearly, the butanol data (Fig. 7) pose a challenge in discriminating between the two classes. [Pg.447]

Fig. 5 Discriminant plots for the seagrass data produced by supplying the coefficients resulting from the AWA to Fisher's linear discriminant analysis.
A more formal way of finding a decision boundary between different classes is based on linear discriminant analysis (LDA) as introduced by Fisher and Mahalanobis. The boundary or hyperplane is calculated such that the variance between the classes is maximized and the variance within the individual classes is minimized. There are several ways to arrive at the decision hyperplanes. In... [Pg.186]
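For two classes, maximizing the between-class variance while minimizing the within-class variance has a well-known closed-form solution: the Fisher direction is proportional to Sw⁻¹(m1 − m2), where Sw is the within-class scatter matrix and m1, m2 are the class means. A minimal NumPy sketch on synthetic data, assuming equal priors and a midpoint threshold:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated synthetic classes with a shared covariance structure.
X1 = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], 100)
X2 = rng.multivariate_normal([3.0, 3.0], [[1.0, 0.8], [0.8, 1.0]], 100)

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

# Within-class scatter: sum of the per-class covariance matrices.
Sw = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)

# Fisher's direction for two classes: w is proportional to Sw^{-1}(m1 - m2).
w = np.linalg.solve(Sw, m1 - m2)

# Decision hyperplane: project onto w and threshold at the midpoint of the
# projected class means (equal priors assumed).
threshold = 0.5 * (w @ m1 + w @ m2)

def predict(Xnew):
    return np.where(Xnew @ w > threshold, 1, 2)

acc = np.mean(np.r_[predict(X1) == 1, predict(X2) == 2])
print(f"training accuracy: {acc:.2f}")
```

The vector w is normal to the separating hyperplane; in two dimensions the boundary is the line bisecting the two populations, as in Figure 15-5.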

There are a number of classification methods for analyzing data, including artificial neural networks (ANNs; see Beale and Jackson, 1990), k-nearest-neighbor (k-NN) methods, decision trees, support vector machines (SVMs), and Fisher's linear discriminant analysis (LDA). Among these methods, a decision tree is a flow-chart-like tree structure. An intermediate node denotes a test on a predictive attribute, and a branch represents an outcome of the test. A terminal node denotes class distribution. [Pg.129]
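The node/branch/leaf structure described above can be written directly as nested tests. A hand-built two-level sketch, where the attribute names and thresholds are hypothetical and chosen purely for illustration:

```python
def classify_spectrum(peak_ratio, band_width):
    """A hand-built decision tree: each intermediate node tests one
    predictive attribute, each branch is a test outcome, and each
    terminal node (leaf) assigns a class label. The attributes and
    thresholds here are invented for illustration."""
    if peak_ratio > 1.5:          # root node: test on attribute 1
        return "class A"          # leaf
    elif band_width > 20.0:       # internal node: test on attribute 2
        return "class B"          # leaf
    else:
        return "class C"          # leaf

print(classify_spectrum(2.0, 5.0))   # root test succeeds -> first leaf
print(classify_spectrum(1.0, 30.0))  # falls through to the second test
```

In practice the tests and thresholds are learned from training data (e.g. by recursive impurity minimization) rather than written by hand.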

Linear discriminant analysis (LDA), originally proposed by Fisher in 1936 [8], is the oldest and most studied supervised pattern recognition method. As the name suggests, it is a linear technique, that is, the decision boundaries separating the classes in the multidimensional space of the variables are linear surfaces (hyperplanes). From a probabilistic standpoint, it is a parametric method, as its underlying hypothesis is that, for each category, the data follow a multivariate normal distribution. This means that the likelihood in Equation (2), for each class, is defined as... [Pg.192]
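The excerpt breaks off before Equation (2) itself. The class-conditional likelihood it refers to is presumably the standard multivariate normal density, which for a class g with mean vector μg and a covariance matrix Σ shared across classes (the LDA assumption) reads, in conventional notation:

```latex
p(\mathbf{x} \mid g) \;=\;
\frac{1}{(2\pi)^{p/2}\,|\boldsymbol{\Sigma}|^{1/2}}
\exp\!\left[-\tfrac{1}{2}
(\mathbf{x}-\boldsymbol{\mu}_g)^{\mathsf{T}}
\boldsymbol{\Sigma}^{-1}
(\mathbf{x}-\boldsymbol{\mu}_g)\right]
```

where p is the number of variables. Because Σ is common to all classes, the quadratic terms cancel when comparing classes, which is what makes the resulting boundaries linear; allowing a class-specific Σg instead yields quadratic discriminant analysis.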

Supervised and unsupervised classification methods, for example PCA, K-means and fuzzy clustering, linear discriminant analysis (LDA), partial least squares-discriminant analysis (PLS-DA), Fisher discriminant analysis (FDA), and artificial neural networks (ANN). [Pg.361]

Most traditional approaches to classification in science are called discriminant analysis and are often also called forms of "hard modelling". The majority of statistically based software packages, such as SAS, BMDP and SPSS, contain substantial numbers of procedures, referred to by various names such as linear (or Fisher) discriminant analysis and canonical variates analysis. There is a substantial statistical literature in this area. [Pg.233]

LS-SVMlab, http://www.esat.kuleuven.ac.be/sista/lssvmlab/. LS-SVMlab, by Suykens, is a MATLAB implementation of least-squares support vector machines (LS-SVMs), a reformulation of the standard SVM that leads to solving linear KKT systems. LS-SVM primal-dual formulations have been formulated for kernel PCA, kernel CCA, and kernel PLS, thereby extending the class of primal-dual kernel machines. Links between kernel versions of classic pattern recognition algorithms such as kernel Fisher discriminant analysis and extensions to unsupervised learning, recurrent networks, and control are available. [Pg.390]
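The "linear KKT systems" point can be made concrete: in the LS-SVM classifier, the dual problem reduces to one square linear system instead of a quadratic program. A minimal NumPy sketch of that system on toy data (this is not LS-SVMlab code; the regularisation constant γ and the RBF kernel bandwidth are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy two-class data with labels in {-1, +1}.
X = np.vstack([rng.normal(-1.0, 0.6, (30, 2)), rng.normal(1.0, 0.6, (30, 2))])
y = np.r_[-np.ones(30), np.ones(30)]
n = len(y)

gamma = 10.0  # regularisation constant (illustrative value)

# RBF kernel matrix with unit bandwidth (illustrative choice).
K = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=2))
Omega = np.outer(y, y) * K

# LS-SVM dual: a single linear (KKT) system in (b, alpha):
#   [ 0    y^T           ] [ b     ]   [ 0 ]
#   [ y    Omega + I/γ   ] [ alpha ] = [ 1 ]
A = np.zeros((n + 1, n + 1))
A[0, 1:] = y
A[1:, 0] = y
A[1:, 1:] = Omega + np.eye(n) / gamma
rhs = np.r_[0.0, np.ones(n)]
sol = np.linalg.solve(A, rhs)
b, alpha = sol[0], sol[1:]

def predict(Xnew):
    """Sign of the LS-SVM decision function at the new points."""
    Knew = np.exp(-np.sum((Xnew[:, None] - X[None]) ** 2, axis=2))
    return np.sign(Knew @ (alpha * y) + b)

acc = np.mean(predict(X) == y)
print(f"training accuracy: {acc:.2f}")
```

Replacing the hinge loss of the standard SVM with a squared error loss and equality constraints is what turns the optimization into this single solve; the price is that the solution is no longer sparse in alpha.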

