Big Chemical Encyclopedia


Pattern classification

Consider a group of linear classifiers (hyperplanes) defined by a set of pairs (w, b) that satisfy the following inequalities for any pattern x_i in the training set: w · x_i + b >= +1 if x_i belongs to class +1, and w · x_i + b <= -1 if x_i belongs to class -1. [Pg.304]
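These margin inequalities can be checked numerically. A minimal sketch with a hypothetical 2-D training set (the patterns, labels, w, and b below are invented for illustration, not taken from the source):

```python
import numpy as np

# Hypothetical 2-D training set: two class +1 patterns, two class -1 patterns.
X = np.array([[2.0, 2.0], [3.0, 3.0],    # y = +1
              [0.0, 0.0], [-1.0, 0.5]])  # y = -1
y = np.array([1, 1, -1, -1])

# A candidate hyperplane (w, b), chosen by hand for this example.
w = np.array([1.0, 1.0])
b = -3.0

# The two inequalities combine into y_i * (w . x_i + b) >= 1 for every pattern.
margins = y * (X @ w + b)
separates = bool(np.all(margins >= 1.0))
print(separates)  # True: this (w, b) satisfies the inequalities
```

Any (w, b) passing this check is one member of the group of admissible linear classifiers the excerpt describes.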

In general, for each linearly separable training set one can find an infinite number of hyperplanes that discriminate between the two classes of patterns. Although all of these linear classifiers perfectly separate the learning patterns, they are not identical: their prediction capabilities differ. A hyperplane situated in the proximity of the border +1 patterns will predict as -1 any new +1 pattern that lies close to the separation hyperplane but in the -1 region (w · x + b < 0). Conversely, a hyperplane situated in the proximity of the border -1 patterns will predict as +1 any new -1 pattern that lies close to the separation hyperplane but in the +1 region (w · x + b > 0). It is clear that such classifiers have little prediction success, which led to the idea [Pg.305]
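The effect is easy to reproduce with a sketch (all numbers below are invented): two hyperplanes both separate the same training patterns, but the one hugging the +1 border misclassifies a new +1 pattern that falls slightly on its -1 side, while a hyperplane placed midway between the classes classifies it correctly:

```python
import numpy as np

def predict(w, b, x):
    """Assign class +1 or -1 by the sign of the decision function w . x + b."""
    return 1 if np.dot(w, x) + b > 0 else -1

# Hypothetical setup: +1 training patterns sit near x1 = 2, -1 patterns near x1 = 0.
w = np.array([1.0, 0.0])
b_hugging = -1.9   # hyperplane passing just below the +1 border patterns
b_midway = -1.0    # hyperplane equidistant from the two classes

# A new +1 pattern slightly on the -1 side of the hugging hyperplane:
x_new = np.array([1.8, 0.0])
print(predict(w, b_hugging, x_new))  # -1: the hugging hyperplane misclassifies it
print(predict(w, b_midway, x_new))   # +1: the midway hyperplane gets it right
```

This is the intuition behind preferring the maximum-margin hyperplane among all separating hyperplanes.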


R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, Wiley-Interscience, New York, 1973. [Pg.431]

The model induced via the decision tree is not a black box: it provides explicit, interpretable rules for solving the pattern classification problem, and the most relevant variables are clearly identified. For example, for the data in Table I, the values of the temperature are not necessary for determining good or bad quality, as is clearly indicated by the decision tree in Fig. 22. [Pg.263]
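Table I itself is not reproduced here, so the following sketch induces a one-split rule from hypothetical (temperature, pressure, quality) records with the same character: quality depends only on pressure, and the induced rule states that explicitly:

```python
# Hypothetical process records standing in for Table I: (temperature, pressure, quality).
DATA = [
    (50, 1.0, "bad"), (60, 1.1, "bad"), (70, 2.0, "good"), (80, 2.1, "good"),
    (55, 0.9, "bad"), (65, 2.2, "good"), (75, 1.2, "bad"), (85, 2.3, "good"),
]
FEATURES = ["temperature", "pressure"]

def misclassified(feature, threshold):
    """Count errors of the rule: 'good' if feature > threshold, else 'bad'."""
    return sum(("good" if row[feature] > threshold else "bad") != row[2]
               for row in DATA)

# Exhaustively try each feature and each midpoint between observed values,
# keeping the split with the fewest errors (a one-level decision tree).
best = None
for f in range(len(FEATURES)):
    values = sorted({row[f] for row in DATA})
    for lo, hi in zip(values, values[1:]):
        t = (lo + hi) / 2
        err = misclassified(f, t)
        if best is None or err < best[0]:
            best = (err, f, t)

err, f, t = best
print(f"rule: quality = good if {FEATURES[f]} > {t} ({err} errors)")
```

The induced rule is readable as-is, and the irrelevant variable (temperature) simply never appears in it, mirroring the excerpt's point about Fig. 22.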

J.R.M. Smits, L.W. Breedveld, M.W.J. Derksen, G. Kateman, H.W. Balfoort, J. Snoek and J.W. Hofstraat, Pattern classification with artificial neural networks: classification of algae, based upon flow cytometry data. Anal. Chim. Acta, 258 (1992) 11-25. [Pg.696]

Ball, G. H and Hall, D. J., Isodata, a novel method of data analysis and pattern classification, NTIS Report AD699616 (1965). [Pg.98]

Cover TM, Hart PE (1967) Nearest neighbor pattern classification. IEEE Trans Inf Theory IT-13:21-27. [Pg.283]

Kowalski, B. R., Jurs, P. C., Isenhour, T. L., Reilley, C. N. Anal. Chem. 41, 1969b, 690-700. Computerized learning machines applied to chemical problems. Multicategory pattern classification by least squares. [Pg.41]

Park, B., Jeong, S.K., Lee, W.S., Seong, J.K., Paik, Y.K. (2004) A simple pattern classification method for alcohol-responsive proteins that are differentially expressed in mouse brain. Proteomics 4, 3369-3375. [Pg.166]

Sybrandt LB, Perone SP (1972) Computerized pattern classification of strongly overlapped peaks in stationary electrode polarography. Anal Chem 44 2331-2339. [Pg.149]

All of the pattern classification methods listed group items by similarity. The measurement of similarity differs from method to method, so different methods can yield different results on the same data. Table 12 describes several similarity metrics and why they produce different results for the same data set. [Pg.542]
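Table 12 is not reproduced here, but the effect is easy to demonstrate with invented measurement vectors: cosine similarity and Euclidean distance disagree about which pair of vectors is most similar, so any method built on one metric will group the data differently from a method built on the other:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two measurement vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical vectors: b is a scaled copy of a (same profile, larger magnitude),
# while c has a different profile but a magnitude similar to a's.
a = [1.0, 2.0, 3.0]
b = [2.0, 4.0, 6.0]
c = [3.0, 2.0, 1.0]

# Cosine similarity groups a with b; Euclidean distance groups a with c.
print(cosine_similarity(a, b), cosine_similarity(a, c))  # 1.0 vs ~0.71
print(euclidean(a, b), euclidean(a, c))                  # ~3.74 vs ~2.83
```

Which behavior is "right" depends on the application: for spectra measured at different concentrations the profile (cosine) view is often what is wanted, while for absolute measurements the distance view is.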

R.O. Duda, P.E. Hart, and D.G. Stork, Pattern Classification - 2nd Edition, John Wiley and Sons, New York (2001). [Pg.130]

Duda, R.O. Hart, P.E. "Pattern Classification and Scene Analysis", Wiley, NY, 1973. [Pg.33]

Dasarathy, B. (1991). Nearest Neighbor Pattern Classification Techniques. IEEE Computer Society Press, Los Alamitos, CA. [Pg.112]

Czerminski R, Yasri A, Hartsough D. Use of support vector machine in pattern classification: application to QSAR studies. Quant Struct-Act Relat 2001;20:227-240. [Pg.237]








