Big Chemical Encyclopedia


K Nearest neighbours classification

D. Coomans and D.L. Massart, Alternative K-nearest neighbour rules in supervised pattern recognition. Part 1: K-nearest neighbour classification by using alternative voting rules. Anal. Chim. Acta, 136, 15-27 (1982). [Pg.486]

The preservation of the local distances can be evaluated by using the K-nearest neighbour classification (Chapter 3) and comparing the results for the d-dimensional patterns and for the 2-dimensional representations. [Pg.97]

K-nearest neighbour classification, linear regression, simplex optimization. [Pg.153]

TABLE 13. K-nearest neighbour classification of 13 chemical classes from binary encoded infrared spectra tested with the leave-one-out method. The Tanimoto distance was used because it gave slightly better results than the Hamming distance. P... [Pg.161]

The K nearest neighbour criterion can also be used for classification. Find the distance of the object in question 3 from the nine objects in the table above. Which are the three closest objects, and does this confirm the conclusions in question 3? [Pg.257]

B.K. Alsberg, R. Goodacre, J.J. Rowland and D.B. Kell, Classification of pyrolysis mass spectra by fuzzy multivariate rule induction: comparison with regression, K-nearest neighbour, neural and decision-tree methods. Analytica Chimica Acta, 348(1-3) (1997), 389-407. [Pg.408]

Alsberg, B. K., Goodacre, R., Rowland, J. J. and Kell, D. B. (1997) Classification of pyrolysis mass spectra by fuzzy multivariate rule induction: comparison with regression, K-nearest neighbour, neural and decision-tree methods. Analytica Chimica Acta, in press. [Pg.369]

Table 1 Classification results on various publicly available test problems from the UCI database using a probabilistic k-nearest-neighbours classifier.
Selection of the supervised classification technique or the combination of techniques suitable for accomplishing the classification task. Popular supervised classifiers are Multi-Layer Perceptron Artificial Neural Networks (MLP-ANN), Support Vector Machines (SVM), k-Nearest Neighbours (k-NN), combinations of genetic algorithms (GA) for feature selection with Linear Discriminant Analysis (LDA), Decision Trees and Radial Basis Function (RBF) classifiers. [Pg.214]

The number of occurrences of a certain substructure in the hitlist is compared with the corresponding number for the library, and a probability is derived for the presence of that substructure in the unknown. This classification method is a variant of the well-known k-nearest neighbour classification. Each mass spectrum is considered as a point in a multidimensional space; the neighbours nearest to the spectrum of the unknown correspond to the most similar reference spectra in library search. If the majority of the k neighbours (k is typically between 1 and 10) contain a certain substructure, then this substructure is predicted to be present in the unknown. A drawback of this approach is the high computational effort necessary for classifying an unknown, because a full library search is required. The performance has been described by Stein (1995, see Further reading section) as sufficient to recommend it for routine use as a first step in structure elucidation. [Pg.241]
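This substructure-prediction scheme can be sketched roughly as follows. The data, the distance measure (Euclidean), and the function name are illustrative assumptions; a real system would run a full library search over reference mass spectra, as the text notes.

```python
import math
from collections import Counter

def knn_substructure_predict(unknown, library, k=5):
    """Predict substructures of an unknown spectrum by majority vote
    among its k nearest reference spectra (a sketch, not Stein's system).

    unknown : list of floats (spectrum as a point in feature space)
    library : list of (spectrum, substructure_set) pairs
    """
    # Euclidean distance between the unknown and a reference spectrum
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # The k nearest neighbours correspond to the most similar reference spectra
    neighbours = sorted(library, key=lambda entry: dist(unknown, entry[0]))[:k]

    # Count how many of the k neighbours contain each substructure
    votes = Counter()
    for _, subs in neighbours:
        votes.update(subs)

    # A substructure is predicted present if the majority of neighbours has it
    return {s for s, n in votes.items() if n > k / 2}
```

Note that the cost is dominated by computing the distance to every library entry, which is the "high computational effort" the text mentions.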

The most popular classification methods are Linear Discriminant Analysis (LDA), Quadratic Discriminant Analysis (QDA), Regularized Discriminant Analysis (RDA), K-Nearest Neighbours (KNN), classification tree methods (such as CART), Soft-Independent Modeling of Class Analogy (SIMCA), potential function classifiers (PFC), Nearest Mean Classifier (NMC) and Weighted Nearest Mean Classifier (WNMC). Moreover, several classification methods can be found among the artificial neural networks. [Pg.60]

The choice of value for k is somewhat empirical and, for overlapping classes, k = 3 or 5 have been proposed to provide good classification. In general, however, k = 1 is the most widely used case and is referred to as the 1-NN method or, simply, the nearest-neighbour method. [Pg.140]

Each pattern is characterized in the usual way (Chapter 1.2) by a set of d components (features, measurements) and can be considered as a point in a d-dimensional space. An additional component as described in Chapter 1.3 is not necessary for the KNN method. Classification of an unknown pattern x is made by examining some pattern points with known class membership which are closest to x. In order to find the nearest neighbours of the unknown it is necessary to compute the distances between x and all other pattern points of the available data set. The number of neighbours considered for classification is usually denoted by K. If only one neighbour is used for the classification (K=1, the "1NN method"), the class membership of the first (nearest) neighbour gives the class membership of the unknown. If more than one neighbour is used, a voting scheme or some other procedure is applied to determine the class of the unknown. [Pg.62]
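The procedure described above can be sketched in a few lines. This is a toy illustration, not code from the original source; Euclidean distance and simple majority voting are assumed.

```python
import math
from collections import Counter

def knn_classify(x, patterns, labels, k=1):
    """Classify pattern x from its k nearest neighbours.

    patterns : list of d-dimensional points with known class membership
    labels   : class label of each pattern
    For k=1 (the 1NN method) the class of the nearest neighbour is returned;
    for k>1 a simple majority vote decides.
    """
    # Distances between x and all pattern points of the available data set
    dists = [math.sqrt(sum((a - b) ** 2 for a, b in zip(x, p)))
             for p in patterns]
    # Indices of the k closest pattern points
    nearest = sorted(range(len(patterns)), key=lambda i: dists[i])[:k]
    # Majority vote among the k neighbours (k=1 reduces to the 1NN rule)
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]
```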

In most chemical applications of the KNN method, only the first (nearest) neighbour (K=1) was used for classification. For K>1 a simple voting ("one neighbour, one vote") may be applied. The contributions of the neighbours to the voting can also be weighted by the distances (or the squared distances) between the unknown and the neighbours. [Pg.64]
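Distance-weighted voting can be sketched as below. Inverse-distance weights are one common choice, assumed here for illustration; the text also mentions weighting by squared distances.

```python
import math
from collections import defaultdict

def knn_classify_weighted(x, patterns, labels, k=3, eps=1e-12):
    """K-NN classification where each neighbour's vote is weighted by the
    inverse of its distance to the unknown ("closer counts for more")."""
    dists = [math.sqrt(sum((a - b) ** 2 for a, b in zip(x, p)))
             for p in patterns]
    nearest = sorted(range(len(patterns)), key=lambda i: dists[i])[:k]
    # Accumulate inverse-distance weights per class
    score = defaultdict(float)
    for i in nearest:
        score[labels[i]] += 1.0 / (dists[i] + eps)  # eps guards against zero distance
    return max(score, key=score.get)
```

With this weighting, a single very close neighbour can outvote two distant neighbours of another class, which a simple "one neighbour, one vote" scheme cannot do.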

FIGURE 19 Representation of classification error in cross-validation as a function of the number of nearest neighbours, for the selection of the optimal value of k. [Pg.223]
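Selecting k by cross-validation, as in Figure 19, can be sketched as follows; leave-one-out validation is used here (as in Table 13), and the data set is invented for illustration.

```python
import math
from collections import Counter

def knn_classify(x, patterns, labels, k):
    """Majority-vote K-NN classification with Euclidean distance."""
    dists = [math.sqrt(sum((a - b) ** 2 for a, b in zip(x, p)))
             for p in patterns]
    nearest = sorted(range(len(patterns)), key=lambda i: dists[i])[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

def loo_error(patterns, labels, k):
    """Leave-one-out error rate: classify each pattern from all the others."""
    wrong = 0
    for i in range(len(patterns)):
        rest_p = patterns[:i] + patterns[i + 1:]
        rest_l = labels[:i] + labels[i + 1:]
        if knn_classify(patterns[i], rest_p, rest_l, k) != labels[i]:
            wrong += 1
    return wrong / len(patterns)

def best_k(patterns, labels, candidates=(1, 3, 5)):
    """Pick the k with the lowest cross-validation error (cf. Figure 19)."""
    return min(candidates, key=lambda k: loo_error(patterns, labels, k))
```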

