
Pattern recognition, supervised

There are a large number of methods for supervised pattern recognition, mostly aimed at classification. Multivariate statisticians have developed many discriminant functions, some of direct relevance to chemists. A classical application is the detection of forgery of banknotes. Can physical measurements such as the width and height of a series of banknotes be used to identify forgeries? Often one measurement is not enough, so several parameters are required before an adequate mathematical model is available. [Pg.184]

Similar problems occur in chemistry. Consider using a chemical method such as IR spectroscopy to determine whether a sample of brain tissue is cancerous or not. A method can be set up in which the spectra of two groups, cancerous and noncancerous tissue, are recorded; some form of mathematical model is then built from these spectra; finally, the diagnosis of an unknown sample can be predicted. [Pg.184]
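
By way of illustration, the following is a minimal sketch of this two-group workflow, assuming scikit-learn is available; the "spectra" are synthetic stand-ins for real IR measurements, not data from the study described above.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical training spectra: 25 samples x 20 wavelengths per group,
# with a small offset standing in for a genuine spectral difference.
cancerous = rng.normal(0.0, 1.0, size=(25, 20)) + 0.8
noncancerous = rng.normal(0.0, 1.0, size=(25, 20))
X_train = np.vstack([cancerous, noncancerous])
y_train = np.array(["cancerous"] * 25 + ["noncancerous"] * 25)

# "Some form of mathematical model": here a linear discriminant function.
model = LinearDiscriminantAnalysis().fit(X_train, y_train)

# Predict the diagnosis of an unknown sample.
unknown = rng.normal(0.0, 1.0, size=(1, 20)) + 0.8
print(model.predict(unknown))
```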

Section 4.5 describes a variety of such techniques and their applications. [Pg.184]


Mathematically, this means that one needs to assign portions of an 8-dimensional space to the three classes. A new sample is then assigned to the class which occupies the portion of space in which the sample is located. Supervised pattern recognition is distinct from unsupervised pattern recognition. In the latter, one essentially applies clustering methods (Chapter 30) to classify objects into classes that are not known beforehand. In supervised pattern recognition, one knows the classes and has to decide in which of those an object should be classified. [Pg.207]

Supervised pattern recognition techniques essentially consist of the following steps. [Pg.207]

This is the simplest possible type of neuron, used here for didactic purposes and not because it is the configuration to be recommended. Let us suppose that for this isolated neuron w₁ = 1, w₂ = 2 and θ = 1. The line in Fig. 33.20 then gives the values of x₁ and x₂ for which E = θ. All combinations of x₁ and x₂ on and above the line yield E ≥ θ and therefore lead to an output y₁ = 1 (i.e. the object belongs to class K); all combinations below it lead to y₁ = 0. The procedure described here is equivalent to a method called the linear learning machine, which was one of the first supervised pattern recognition methods to be applied in chemometrics. It is further explained, including the training phase, in Chapter 44. [Pg.234]
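
As an illustration, a minimal sketch of this isolated threshold neuron (weights w₁ = 1, w₂ = 2, threshold θ = 1) follows; the two test points are chosen arbitrarily, one on each side of the decision line.

```python
w1, w2 = 1.0, 2.0   # weights of the isolated neuron
theta = 1.0         # threshold

def neuron(x1, x2):
    """Return y1 = 1 (class K) when the weighted sum E reaches the threshold theta."""
    E = w1 * x1 + w2 * x2
    return 1 if E >= theta else 0

print(neuron(1.0, 1.0))   # E = 3.0, on/above the line E = theta -> output 1 (class K)
print(neuron(0.1, 0.1))   # E = 0.3, below the line -> output 0
```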

Neurons are not used alone, but in networks in which they constitute layers. In Fig. 33.21 a two-layer network is shown. In the first layer two neurons are each linked to two inputs, x₁ and x₂. The upper one is the one we already described; the lower one has w₁ = 2, w₂ = 1 and also θ = 1. It is easy to understand that for this neuron the output y₂ is 1 on and above line b in Fig. 33.22a and 0 below it. The outputs of the two neurons now serve as inputs to a third neuron, constituting a second layer. Both have weight 0.5, and θ for this neuron is 0.75. The output y of this neuron is 1 if E = 0.5 y₁ + 0.5 y₂ > 0.75 and 0 otherwise. Since y₁ and y₂ have 0 and 1 as possible values, the condition E > 0.75 is fulfilled only when both are equal to 1, i.e. in the dashed area of Fig. 33.22b. The boundary obtained is no longer straight, but consists of two pieces. This network is only a simple demonstration network. Real networks have many more nodes, and transfer functions are usually non-linear; it will be intuitively clear that boundaries of a very complex nature can be developed. How to do this, and applications of supervised pattern recognition, are described in detail in Chapter 44, but it should be stated here that excellent results can be obtained. [Pg.234]
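
A minimal sketch of this two-layer demonstration network follows, wiring together the two first-layer neurons described above and the second-layer neuron with weights 0.5 and threshold 0.75; the test points are arbitrary.

```python
def step(E, theta):
    """Threshold transfer function: 1 when the weighted sum reaches theta, else 0."""
    return 1 if E >= theta else 0

def network(x1, x2):
    y1 = step(1.0 * x1 + 2.0 * x2, 1.0)      # first-layer neuron a (boundary: line a)
    y2 = step(2.0 * x1 + 1.0 * x2, 1.0)      # first-layer neuron b (boundary: line b)
    return step(0.5 * y1 + 0.5 * y2, 0.75)   # second layer: 1 only when y1 = y2 = 1

print(network(1.0, 1.0))   # above both lines -> inside the dashed area -> 1
print(network(0.6, 0.1))   # above line b only -> 0
```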

Most supervised pattern recognition procedures permit stepwise selection, i.e. selecting first the most important feature, then the second most important, and so on. One way to do this is by prediction using e.g. cross-validation (see next section): we first select the variable that best classifies objects of known classification that are not part of the training set, then the variable that most improves the classification already obtained with the first selected variable, etc. For the linear discriminant analysis of the EU/HYPER classification of Section 33.2.1, the result is that with all 5 variables, or with 4, a selectivity of 91.4% is obtained, and with 3 or 2 variables 88.6% [2]. Selectivity is used here as a measure of classification success. It is applied in the sense of Chapter... [Pg.236]
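
As an illustration of stepwise selection driven by cross-validated prediction, the following is a minimal sketch assuming scikit-learn is available; the five-variable data set is synthetic and only stands in for the EU/HYPER example, so the numbers it prints are not those quoted above.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(70, 5))                    # 70 objects, 5 variables (synthetic)
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # class driven mainly by variables 0 and 2

lda = LinearDiscriminantAnalysis()

# Forward selection: add, one at a time, the variable that most improves the
# cross-validated prediction of objects left out of the training set.
selector = SequentialFeatureSelector(lda, n_features_to_select=2,
                                     direction="forward", cv=5).fit(X, y)
selected = selector.get_support()
print("selected variables:", np.flatnonzero(selected))

# Classification success (plain accuracy here) with the selected variables only.
score = cross_val_score(lda, X[:, selected], y, cv=5).mean()
print("cross-validated success rate: %.1f%%" % (100 * score))
```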

D. Coomans and D.L. Massart, Alternative K-nearest neighbour rules in supervised pattern recognition. Part 2. Probabilistic classification on the basis of the kNN method modified for direct density estimation. Anal. Chim. Acta, 138 (1982) 153-165. [Pg.240]

J.D.F. Habbema, Some useful extensions of the standard model for probabilistic supervised pattern recognition. Anal. Chim. Acta, 150 (1983) 1-10. [Pg.240]

Classical supervised pattern recognition methods include k-nearest neighbor (KNN) and soft independent modeling of class analogies (SIMCA). Both... [Pg.112]

In contrast to unsupervised methods, supervised pattern-recognition methods (Section 4.3) use class membership information in the calculations. The goal of these methods is to construct models using analytical measurements to predict class membership of future samples. Class location and sometimes shape are used in the calibration step to construct the models. In prediction, these models are applied to the analytical measurements of unknown samples to predict class membership. [Pg.36]

Supervised pattern recognition methods are used for predicting the class of unknown samples given a training set of samples with known class membership. Two such methods are discussed in Section 4.3, KNN and SIMCA, ... [Pg.95]
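
A minimal KNN sketch along these lines, assuming scikit-learn is available and using synthetic measurements in place of real analytical data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(2)
X_train = np.vstack([rng.normal(0.0, 1.0, size=(15, 3)),    # class A measurements
                     rng.normal(3.0, 1.0, size=(15, 3))])   # class B measurements
y_train = np.array(["A"] * 15 + ["B"] * 15)

knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

unknown = np.array([[2.8, 3.1, 2.9]])
print(knn.predict(unknown))          # predicted class membership of the unknown
print(knn.predict_proba(unknown))    # vote fractions among its 3 nearest neighbours
```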

Pattern Recognition A process of examining the relationships between samples and/or variables in a data set. Unsupervised pattern-recognition tools are used to determine if there are groupings of similar samples in a data set. Supervised pattern-recognition tools are used to classify unknown samples as more likely of type A or type B. [Pg.187]

The goal of unsupervised techniques is to identify and display natural groupings in the data without imposing any prior class membership. Even when the ultimate goal of the project is to develop a supervised pattern recognition model, we recommend the use of unsupervised techniques to provide an initial view of the data. [Pg.239]
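
As an illustration of such an initial unsupervised view, the following is a minimal sketch of a PCA score plot, assuming scikit-learn and matplotlib are available and using synthetic data; no class membership information enters the calculation.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 1.0, size=(20, 10)),    # synthetic data with two
               rng.normal(2.0, 1.0, size=(20, 10))])   # (unlabelled) groupings

# Score plot on the first two principal components; no class labels are used.
scores = PCA(n_components=2).fit_transform(X)
plt.scatter(scores[:, 0], scores[:, 1])
plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.title("Initial unsupervised view of the data")
plt.show()
```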

Current methods for supervised pattern recognition are numerous. Typical linear methods are linear discriminant analysis (LDA) based on distance calculation, soft independent modeling of class analogy (SIMCA), which emphasizes similarities within a class, and PLS discriminant analysis (PLS-DA), which performs regression between spectra and class memberships. More advanced methods are based on nonlinear techniques, such as neural networks. Parametric versus nonparametric computations is a further distinction. In parametric techniques such as LDA, statistical parameters of normal sample distribution are used in the decision rules. Such restrictions do not influence nonparametric methods such as SIMCA, which perform more efficiently on NIR data collections. [Pg.398]
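
A minimal sketch of the PLS-DA idea described above, assuming scikit-learn is available: regression between synthetic "spectra" and a dummy-coded class-membership variable, followed by thresholding the predicted value. The 0.5 decision threshold is a common but not universal choice.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0.0, 1.0, size=(20, 100)) + 0.5,   # class 1 "spectra"
               rng.normal(0.0, 1.0, size=(20, 100))])        # class 2 "spectra"
y = np.array([1.0] * 20 + [0.0] * 20)                        # dummy-coded class membership

plsda = PLSRegression(n_components=2).fit(X, y)

unknown = rng.normal(0.0, 1.0, size=(1, 100)) + 0.5
y_hat = plsda.predict(unknown)[0, 0]    # continuous predicted membership value
print("class 1" if y_hat > 0.5 else "class 2")
```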

The choice of reference spectra is critical when applying supervised pattern recognition methods. The first solution is to use pure compound spectra as references. The drawback is that mixture spectra in data cubes often differ from the reference spectra, so applying the model may give wrong results. The second solution, suitable in a few studies, is to select image pixels where only one compound is present in order to obtain the calibration sets. [Pg.419]
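
A minimal sketch of the second approach, assuming plain NumPy and a hypothetical data cube; the pixel coordinates marked as "pure" are invented for illustration and would in practice come from chemical knowledge or a preliminary exploration of the image.

```python
import numpy as np

rng = np.random.default_rng(5)
cube = rng.random((64, 64, 200))    # hypothetical data cube: 64 x 64 pixels, 200 wavelengths

# Pixel coordinates believed to contain only compound A or only compound B.
pure_A = [(5, 7), (6, 7), (5, 8)]
pure_B = [(50, 40), (51, 40), (50, 41)]

X_cal = np.array([cube[r, c, :] for r, c in pure_A + pure_B])   # calibration spectra
y_cal = np.array(["A"] * len(pure_A) + ["B"] * len(pure_B))     # class labels
print(X_cal.shape, y_cal)
```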

Roggo, Y., Duponchel, L., and Huvenne, J.-P. (2003), Comparison of supervised pattern recognition methods with McNemar's statistical test: Application to qualitative analysis of sugar beet by near-infrared spectroscopy, Anal. Chim. Acta, 477, 187-200. [Pg.430]

M. Alvarez, I. M. Moreno, A. Jos, A. M. Camean and A. G. Gonzalez, Differentiation of two Andalusian DO fino wines according to their metal content from ICP-OES by using supervised pattern recognition methods, Microchem. J., 87(1), 2007, 72-76. [Pg.281]

In order to classify the wine samples into the four classes mentioned, a supervised pattern recognition method (LDA) was applied. The results obtained gave 100% correct classification for three classes (Barbera Oltrepo, Barbera Piemonte and Barbera Alba), and only one Barbera Asti sample was not correctly classified (cross-validation error rate 1.89%). [Pg.769]
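
As an illustration of how such a cross-validation error rate can be computed, the following is a minimal sketch assuming scikit-learn is available; the four-class data are synthetic stand-ins for the Barbera samples, so the printed error rate will not match the 1.89% reported above.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(6)
X = np.vstack([rng.normal(k, 1.0, size=(13, 6)) for k in range(4)])   # 4 classes, 6 variables
y = np.repeat(["Oltrepo", "Piemonte", "Alba", "Asti"], 13)

# Leave-one-out cross-validation: each sample is predicted by an LDA model
# trained on all the others; the error rate is the fraction misclassified.
y_pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print("cross-validation error rate: %.2f%%" % (100 * np.mean(y_pred != y)))
```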

Derde, M.P. and Massart, D.L. (1986) Supervised pattern recognition: The ideal method? Anal. Chim. Acta 191, 1-16. [Pg.385]

Questions of type (2.1) may be answered by analysis of variance or by discriminant analysis. All these methods may be found under the name supervised learning or supervised pattern recognition methods. In the sense of question (2.1.3) one may speak of supervised classification or even better of re-classification methods. In situations of type (2.2) methods from the large family of regression methods are appropriate. [Pg.16]

PARVUS Elsevier Scientific Software, P. O. Box 211, NL-1000AE Amsterdam, The Netherlands Dfl 1325. Package for supervised pattern recognition and handling of multivariate data (ref. 19). [Pg.63]

CONTENTS 1. Chemometrics and the Analytical Process. 2. Precision and Accuracy. 3. Evaluation of Precision and Accuracy. Comparison of Two Procedures. 4. Evaluation of Sources of Variation in Data. Analysis of Variance. 5. Calibration. 6. Reliability and Drift. 7. Sensitivity and Limit of Detection. 8. Selectivity and Specificity. 9. Information. 10. Costs. 11. The Time Constant. 12. Signals and Data. 13. Regression Methods. 14. Correlation Methods. 15. Signal Processing. 16. Response Surfaces and Models. 17. Exploration of Response Surfaces. 18. Optimization of Analytical Chemical Methods. 19. Optimization of Chromatographic Methods. 20. The Multivariate Approach. 21. Principal Components and Factor Analysis. 22. Clustering Techniques. 23. Supervised Pattern Recognition. 24. Decisions in the Analytical Laboratory. [Pg.215]

