Big Chemical Encyclopedia


Discriminant function analysis classification functions

Discriminant analysis (DA) performs classification of samples with an a priori hypothesis. This hypothesis is based on a previously determined TCA or other CA protocols. DA is also called "discriminant function analysis", and its natural extension is MDA (multiple discriminant analysis), which is sometimes named "discriminant factor analysis" or CDA (canonical discriminant analysis). Among these types of analysis, linear discriminant analysis (LDA) has been widely used to reinforce differences among sample classes. Other classification methods are QDA (quadratic discriminant analysis) (Frank and Friedman, 1989), an extension of LDA, and RDA (regularized discriminant analysis), a compromise between LDA and QDA that works better with varied class distributions and with high-dimensional data (Friedman, 1989). [Pg.94]
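A minimal sketch of the LDA/QDA contrast described above, on invented two-class data (scikit-learn and all parameter values here are illustrative assumptions, not from the cited works; the RDA compromise is only approximated via QDA's reg_param, which shrinks each class covariance toward the identity rather than toward the pooled estimate of Friedman's full scheme):

```python
# Illustrative comparison of LDA, QDA, and a regularized QDA on synthetic data.
import numpy as np
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis)

rng = np.random.default_rng(0)
# Two classes with different covariance structures, the case where QDA can help.
X1 = rng.normal([0, 0], [1.0, 1.0], size=(100, 2))
X2 = rng.normal([2, 2], [0.3, 2.0], size=(100, 2))
X = np.vstack([X1, X2])
y = np.array([0] * 100 + [1] * 100)

lda = LinearDiscriminantAnalysis().fit(X, y)               # pooled covariance
qda = QuadraticDiscriminantAnalysis().fit(X, y)            # class-specific covariances
rda = QuadraticDiscriminantAnalysis(reg_param=0.5).fit(X, y)  # shrunken, RDA-like

for name, model in [("LDA", lda), ("QDA", qda), ("RDA-like", rda)]:
    print(name, round(model.score(X, y), 3))
```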

The principle of multivariate analysis of variance and discriminant analysis (MVDA) consists of testing the differences between a priori classes (MANOVA) and achieving their maximum separation by modeling (MDA). The variance between the classes is maximized and the variance within the classes is minimized by simultaneous consideration of all observed features. The classification of new objects into the a priori classes, i.e. the reclassification of the learning data set, takes place according to the values of the discriminant functions. These discriminant functions are linear combinations of the optimum set of the original features for class separation. The mathematical fundamentals of MVDA are explained in Section 5.6. [Pg.332]
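The maximization just described can be sketched as a generalized eigenproblem on the between-class and within-class scatter matrices (a hedged illustration with made-up data, not the MVDA implementation referenced in Section 5.6):

```python
# Find directions w maximizing the ratio w' Sb w / w' Sw w of
# between-class to within-class scatter (Fisher/canonical discriminants).
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
classes = [rng.normal(m, 0.5, size=(30, 3))
           for m in ([0, 0, 0], [1, 1, 0], [0, 1, 1])]
X = np.vstack(classes)

mean_all = X.mean(axis=0)
Sw = np.zeros((3, 3))
Sb = np.zeros((3, 3))
for Xk in classes:
    mk = Xk.mean(axis=0)
    Sw += (Xk - mk).T @ (Xk - mk)        # within-class scatter
    d = (mk - mean_all)[:, None]
    Sb += len(Xk) * (d @ d.T)            # between-class scatter

# Generalized eigenproblem Sb w = lambda Sw w; the largest eigenvalues
# give the discriminant functions (linear combinations of the features).
evals, evecs = eigh(Sb, Sw)              # eigenvalues in ascending order
W = evecs[:, ::-1][:, :2]                # at most n_classes - 1 = 2 functions
scores = X @ W                           # values of the discriminant functions
print(scores.shape)
```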

The discriminant function analysis also yields classification functions for each variable (Fe, Pb, or Ni) within each group (TC, MB, TB, NMDB, KVDB, NMSB, and KVSB) and a constant for each group. Once known, the classification functions can be used to classify each of the original sherds into one of the seven possible groups. The classification matrix, obtained by treating data from the 32 original sherds with the classification functions, is given as Table VI. [Pg.138]
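A hedged sketch of how such per-group classification functions operate (the group names, variables, and values here are invented stand-ins, not the sherd data): each group g gets a score c_g + sum_j b_gj * x_j, and an object is assigned to the group with the largest score.

```python
# Fisher-style linear classification functions from group means and the
# pooled within-group covariance: b_g = S^-1 m_g, c_g = -0.5 * b_g' m_g.
import numpy as np

rng = np.random.default_rng(2)
groups = {"A": rng.normal([1.0, 0.2, 0.5], 0.1, size=(20, 3)),
          "B": rng.normal([0.4, 0.9, 0.3], 0.1, size=(20, 3))}

X = np.vstack(list(groups.values()))
Sw = sum((Xg - Xg.mean(0)).T @ (Xg - Xg.mean(0)) for Xg in groups.values())
Sw /= (X.shape[0] - len(groups))         # pooled within-group covariance
Sinv = np.linalg.inv(Sw)

funcs = {}
for name, Xg in groups.items():
    m = Xg.mean(0)
    b = Sinv @ m                         # coefficients, one per variable
    funcs[name] = (b, -0.5 * b @ m)      # (coefficients, additive constant)

def classify(x):
    """Assign x to the group whose classification function scores highest."""
    return max(funcs, key=lambda g: funcs[g][0] @ x + funcs[g][1])

print(classify(np.array([1.0, 0.2, 0.5])))   # a point near group A's mean
```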

Cluster analysis. In this section, the application of the simple multiscale approach to cluster analysis is demonstrated. The masking method will also be used to localise important features. There are several possible cluster analysis algorithms; however, only discriminant function analysis (DFA) will be used here. Before discussing the results from the simple multiscale analysis, this section will first present DFA and how it can be applied to both unsupervised and supervised classification, followed by how the cluster properties S are measured at each resolution level. [Pg.391]

Discriminant Function Analysis was used as the classifier, with 5 features (5 sensor conductance values) and 63 observations collected within a 22-month period. The F-ratio of intergroup/intragroup variances was chosen as the classification performance criterion. [Pg.130]
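The F-ratio criterion mentioned above can be illustrated on toy one-dimensional data (all values invented): well-separated groups give a large ratio of between-group to within-group mean squares, overlapping groups a small one.

```python
# One-way F-ratio: between-group mean square over within-group mean square.
import numpy as np

def f_ratio(groups):
    """F-ratio for a list of 1-D sample arrays, one array per group."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = np.mean(np.concatenate(groups))
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((g - np.mean(g)) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

rng = np.random.default_rng(3)
well_separated = [rng.normal(m, 0.2, 50) for m in (0.0, 1.0, 2.0)]
overlapping = [rng.normal(m, 2.0, 50) for m in (0.0, 0.1, 0.2)]
print(f_ratio(well_separated), f_ratio(overlapping))
```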

PLS (Partial Least Squares) regression was used for quantification and classification of aristeromycin and neplanocin A (Figure 4). Matlab was used for PCA (Principal Components Analysis, according to the NIPALS algorithm) to identify correlations amongst the variables from the 882 wavenumbers and to reduce the number of inputs for Discriminant Function Analysis (DFA) (the first 15 PCA scores were used) (Figure 5). [Pg.188]
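A sketch of the PCA-then-DFA workflow described above, with scikit-learn's SVD-based PCA standing in for NIPALS and LDA standing in for DFA; the synthetic "spectra" and all sizes here are illustrative assumptions, not the 882-wavenumber data.

```python
# Reduce many collinear spectral variables to 15 PCA scores, then classify.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
n_wavenumbers = 200                       # stand-in for the 882 wavenumbers
y = np.repeat([0, 1], 30)                 # two compound classes
base = rng.normal(size=n_wavenumbers)     # shared spectral baseline
X = np.vstack([base + 0.5 * yi + rng.normal(0, 0.3, n_wavenumbers)
               for yi in y])              # class 1 gets a uniform offset

model = make_pipeline(PCA(n_components=15), LinearDiscriminantAnalysis())
model.fit(X, y)
print(round(model.score(X, y), 2))
```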

Another classification method is discriminant function analysis, which, instead of using regression as the mathematical framework, is based on the same principle as MANOVA. Whereas MANOVA tests whether a number of groups differ significantly with respect to a number of dependent variables, discriminant function analysis tests whether a linear combination of predictor... [Pg.384]

Discriminant function analysis (DFA) identifies a small set of elements, from which we create four equations known as classification functions. The DFA statistical method generates four equations (classification functions), one for each of our four Pakistani segments. Each classification function comprises an additive constant and four weights or coefficients, one coefficient for each element. [Pg.534]

Fig. 8.11. Representation of classification of 88 German wines by discriminant analysis (1st vs 2nd discriminant function): o Bad Dürkheim (Rhineland-Palatinate)...

Discriminant Analysis (DA) is a multivariate statistical method that generates a set of classification functions that can be used to predict into which of two or more categories an observation is most likely to fall, based on a certain combination of input variables. DA may be more effective than regression for relating groundwater age to major ion hydrochemistry and well construction because it can account for complex, non-continuous relationships between age and each individual variable used in the algorithm while inherently coping with uncertainty in the age values used for... [Pg.76]

Replicate samples were also excluded to avoid artificially inflating confidence in the classification functions. The subtraction of these seven replicate samples reduced the data set to 60 samples. In the interest of space and clarity, observations and elements excluded from DFA have been omitted from Tables III-V, but may be obtained from the primary author. After limiting the data set to TMs and REEs common to hematite and without missing values, a subcomposition of 11 elements remained. This data set of 11 elements (Fe, Sc, Ti, V, Cr, Mn, Co, Sb, La, Sm, and Th) was transformed into 10 log-ratios for the remaining 60 non-replicate samples for discriminant analysis. The TM and REE model included all 10 log-ratios, while the TM model excluded the log-ratios of Sb, La, Sm, and Th. Comparison of these two models... [Pg.468]
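The 11-elements-to-10-log-ratios step can be sketched with an additive log-ratio (alr) transform (the element list mirrors the subcomposition above, but the divisor choice and all concentration values are invented; the original authors' exact transform may differ):

```python
# Additive log-ratio transform: p compositional parts -> p - 1 log-ratios
# relative to one chosen divisor part.
import numpy as np

elements = ["Fe", "Sc", "Ti", "V", "Cr", "Mn", "Co", "Sb", "La", "Sm", "Th"]
rng = np.random.default_rng(5)
comp = rng.uniform(0.1, 10.0, size=(60, len(elements)))   # 60 samples (invented)

def alr(x, divisor_index=0):
    """Log-ratios of every part relative to the divisor part (here Fe)."""
    ratios = np.delete(x, divisor_index, axis=1) / x[:, [divisor_index]]
    return np.log(ratios)

log_ratios = alr(comp)
print(log_ratios.shape)   # 11 elements -> 10 log-ratios per sample
```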

Linear discriminant analysis (LDA) is also a probabilistic classifier in the mold of Bayes algorithms, but it is closely related to both regression and PCA techniques. A discriminant function is simply a function of the observed vector of variables that leads to a classification rule. The likelihood ratio (above), for example, is an optimal discriminant for the two-class case. Hence, the classification rule can be stated as... [Pg.196]
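The likelihood-ratio remark can be made concrete for two univariate Gaussian classes (a hedged illustration with invented parameters, not the rule stated in the truncated passage): assign to class 1 when p1(x)/p0(x) exceeds the prior ratio P(0)/P(1).

```python
# Likelihood-ratio discriminant for two Gaussian classes with equal variance.
from math import exp, sqrt, pi

def gauss_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def classify(x, mu0=0.0, mu1=2.0, sigma=1.0, p0=0.5, p1=0.5):
    lr = gauss_pdf(x, mu1, sigma) / gauss_pdf(x, mu0, sigma)  # likelihood ratio
    return 1 if lr > p0 / p1 else 0

# With equal priors and variances the boundary is the midpoint (mu0+mu1)/2 = 1.
print(classify(0.3), classify(1.7))
```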

As in factor analysis, the discriminant feature space may have a lower dimension, n_df, than the original feature space. With respect to the classification into a certain number of classes, the following number of discriminant functions is necessary ... [Pg.187]
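In the standard treatment of discriminant analysis the number of discriminant functions is at most min(k - 1, p) for k classes and p original features; the following is a trivial check of that conventional result, not a quote of the elided formula:

```python
# Maximum number of discriminant functions for k classes and p features.
def n_discriminant_functions(n_classes, n_features):
    return min(n_classes - 1, n_features)

print(n_discriminant_functions(7, 3))   # e.g. 7 sherd groups, 3 elements -> 3
print(n_discriminant_functions(3, 19))  # e.g. 3 vintages, 19 variables -> 2
```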

The result from cluster analysis presented in Fig. 9-2 is subjected to MVDA (for mathematical fundamentals see Section 5.6 or [AHRENS and LAUTER, 1981]). The principle of MVDA is the separation of predicted classes of objects (sampling points). By simultaneous consideration of all the features observed (heavy metal content), the variance of the discriminant functions is maximized between the classes and minimized within them. The classification of new objects into the a priori classes, or the reclassification of the learning data set, is carried out using the values of the discriminant functions. These values represent linear combinations of the optimum separation set of the original features. The result of the reclassification is presented as follows ... [Pg.323]

Modeling by using the structure contained within a data set is a problem-oriented process, and as such it is fundamentally opposed to notions of a static mode of group formation. In the recent archaeometric literature it was proposed that, as a convenient method of communication between researchers, the classification functions derived from discriminant analysis could be transmitted rather than the actual data (11). Such a proposal seems to be insensitive to problem orientation, ceramic processes, or statistical influence. [Pg.87]

This supervised classification method, which is the most used, assumes a multivariate normal distribution for the variables in each population, (X1, ..., Xp) ~ N(mu_i, Sigma_i), and calculates the classification functions minimising the probability of incorrect classification of the observations in the training group (a Bayesian-type rule). If multivariate normality is accepted together with equality of the k covariance matrices, (X1, ..., Xp) ~ N(mu_i, Sigma), Linear Discriminant Analysis (LDA) calculates... [Pg.701]

Equations (25) are linear with respect to x, and this classification technique is referred to as linear discriminant analysis, with the discriminant function obtained by least squares analysis, analogous to multiple regression analysis. [Pg.134]
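The regression analogy can be sketched by coding a two-class problem as +/-1 targets and fitting ordinary least squares (illustrative data; this reproduces the least-squares view of the discriminant, not the exact Equations (25)):

```python
# Two-class linear discriminant via ordinary least squares on +/-1 targets;
# the sign of the fitted value classifies.
import numpy as np

rng = np.random.default_rng(6)
X0 = rng.normal([0, 0], 0.5, size=(40, 2))
X1 = rng.normal([2, 2], 0.5, size=(40, 2))
X = np.vstack([X0, X1])
y = np.concatenate([-np.ones(40), np.ones(40)])

A = np.column_stack([np.ones(len(X)), X])       # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # least-squares coefficients

pred = np.sign(A @ coef)
print((pred == y).mean())                       # training accuracy
```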

The linear discriminant function is the most commonly used classification technique, and it is available in all the most popular statistical software packages. It should be borne in mind, however, that it is only a simplification of the Bayes classifier: it assumes that the variates are drawn from a multivariate normal distribution and that the groups have similar covariance matrices. If these conditions do not hold, the linear discriminant function should be used with care and the results subjected to careful analysis. [Pg.138]

Using discriminant analysis, the following non-linear classification functions were generated in Sa space, also allowing discrimination between the active and inactive sets. However, the parabolic functions in Sa were no more statistically significant than the functions in SR, and classification was still only 73 percent correct, missing two of the active set. (19) [Pg.184]

The adaptive least squares (ALS) method [396, 585-588] is a modification of discriminant analysis which separates several activity classes (e.g. data ordered by a rating score) by a single discriminant function. The method has been compared with ordinary regression analysis, linear discriminant analysis, and other multivariate statistical approaches; in most cases the ALS approach was found to be superior in categorizing any number of classes of ordered data. ORMUCS (ordered multicategorial classification using simplex technique) [589] is an ALS-related approach which... [Pg.100]

Using several modifications, discriminant analysis and related statistical classification methods have been widely and successfully applied in the QSAR field by several workers [75-89, 91-104]. In our work we have always preferred to use discriminant analysis (or another adequate classification method) for data of low or unknown precision rather than to force such data into a regression model. A discriminant function (in multi-class problems, the first discriminant function) can often be interpreted analogously to a Hansch equation in mechanistic terms. [Pg.69]

Figure 5 Classification of 88 white wine samples of three varieties from five producers according to vintage by linear discriminant analysis - plot in the coordinates of the two main discriminant functions (DF2 versus DF1) composed of 19 original variables (concentrations of volatile, aroma-creating compounds). Explanation of symbols: O denotes the 1996 samples, x the 1997 samples, + the 1998 samples. Probability ellipses express the 95% probability level. (Reproduced with permission from Petka J, Mocak J, Farkas P, Balla B, and Kovac M (2001) Classification of Slovak varietal white wines by volatile compounds. Journal of the Science of Food and Agriculture 81: 1533-1539; John Wiley & Sons Ltd.)

