Big Chemical Encyclopedia


Classification trees

Worth, A.P., Cronin, M.T.D. The use of discriminant analysis, logistic regression and classification tree analysis in the development of classification models for human health effects. J. Mol. Struct. (Theochem) 2003, 622, 97-111. [Pg.492]

D., Vander Heyden, Y. Classification tree models for the prediction of blood-brain barrier passage of drugs. J. Chem. Inf. Model. 2006, 46, 1410-1419. [Pg.107]

Yeh and Spiegelman [24]. Very good results were also obtained by using simple neural networks of the type described in Section 33.2.9 to derive a decision rule at each branching of the tree [25]. Classification trees have been used relatively rarely in chemometrics, but it seems that, in general [26], their performance is comparable to that of the best pattern recognition methods. [Pg.228]

In Section 4.8.3.3, we already mentioned regression trees, which are very similar to classification trees. The main difference is that the response y-variable now represents the class membership of the training data. The task is again to partition the... [Pg.231]

FIGURE 5.16 Full classification tree for the data example in Figure 5.15 in the left panel, and the resulting classification lines in the right panel. The dashed lines will not be used when the tree is pruned to its optimal complexity. [Pg.234]

In summary, classification trees are a simple but powerful technique for group separation. [Pg.235]
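To make this concrete, the following minimal sketch (not part of the original text) fits a classification tree to a synthetic two-group data set; it assumes Python with scikit-learn, and the synthetic data only stand in for real measurements.

```python
# Minimal sketch: fitting a classification tree for two-group separation.
# Assumes Python with scikit-learn; the synthetic data stand in for a real
# chemometric data set with two measured variables and two groups.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Two groups described by two variables (e.g. two descriptors)
X, y = make_classification(n_samples=200, n_features=2, n_informative=2,
                           n_redundant=0, n_clusters_per_class=1,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

tree = DecisionTreeClassifier(random_state=0)
tree.fit(X_train, y_train)

print("training error:", 1 - tree.score(X_train, y_train))
print("test error:    ", 1 - tree.score(X_test, y_test))
```

The tree partitions the variable space with axis-parallel cuts, which is the kind of partitioning illustrated by the classification lines in Figure 5.16.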

The main limitation of classification trees is their instability: small changes in the data can result in a completely different tree. This is due to the hierarchical structure of the binary decisions, where a slightly different split near the top can lead to completely different splits further down. A procedure called bagging can reduce this instability by averaging many trees (Hastie et al. 2001). [Pg.235]
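A hedged sketch of the bagging idea, assuming Python with scikit-learn and a synthetic stand-in data set: many trees are grown on bootstrap samples of the data and their predictions are combined, which dampens the effect of small data changes.

```python
# Sketch of bagging classification trees to reduce their instability.
# Assumes Python with scikit-learn and a synthetic stand-in data set.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=10, n_informative=5,
                           random_state=1)

single_tree = DecisionTreeClassifier(random_state=1)
bagged_trees = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                                 random_state=1)

# Cross-validated accuracy: the averaged (bagged) trees are typically
# more stable and often more accurate than a single tree.
print("single tree :", cross_val_score(single_tree, X, y, cv=10).mean())
print("bagged trees:", cross_val_score(bagged_trees, X, y, cv=10).mean())
```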

The same evaluation scheme as described above for k-NN is used for classification trees (Section 5.4). The parameter to be optimized is the tree complexity. Figure 5.25... [Pg.250]

FIGURE 5.25 Classification trees for the glass data with six glass types. The optimal parameter for the tree complexity is 0.02. The test error for this parameter choice is 0.35. [Pg.251]
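One possible way to carry out such a complexity optimization is sketched below. It assumes Python with scikit-learn, expresses the tree complexity through the cost-complexity pruning parameter ccp_alpha, and uses synthetic six-class data only as a stand-in for the glass data, so the numerical results will differ from those quoted in Figure 5.25.

```python
# Sketch: choosing the tree complexity by cross-validation
# (cost-complexity pruning). Assumes Python with scikit-learn;
# the synthetic data stand in for the six-class glass data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=9, n_informative=6,
                           n_classes=6, n_clusters_per_class=1,
                           random_state=2)

# Candidate complexity parameters from the pruning path of a full tree
path = DecisionTreeClassifier(random_state=2).cost_complexity_pruning_path(X, y)

errors = []
for alpha in path.ccp_alphas:
    alpha = max(alpha, 0.0)  # guard against tiny negative round-off values
    tree = DecisionTreeClassifier(ccp_alpha=alpha, random_state=2)
    cv_error = 1 - cross_val_score(tree, X, y, cv=10).mean()
    errors.append(cv_error)

best = int(np.argmin(errors))
print("optimal complexity parameter:", path.ccp_alphas[best])
print("cross-validated error:       ", errors[best])
```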

FIGURE 5.28 Comparison of the test errors for the glass data using different classification methods. One hundred replications of the evaluation procedure (described in the text) are performed for the optimal parameter choices (if the method depends on the choice of a parameter). The methods are LDA, LR, Gaussian mixture models (Mix), k-NN classification, classification trees (Tree), ANN, and SVMs. [Pg.253]

Two groups of objects can be separated by a decision surface (defined by a discriminant variable). Methods using a decision plane, and thus a linear discriminant variable (corresponding to a linear latent variable as described in Section 2.6), are LDA, PLS, and LR (Section 5.2.3). Only if linear classification methods give insufficient prediction performance should nonlinear methods be applied, such as classification trees (CART, Section 5.4), SVMs (Section 5.6), or ANNs (Section 5.5). [Pg.261]
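The sketch below illustrates this recommendation; it assumes Python with scikit-learn and uses a toy data set with a nonlinear class boundary (the two-moons data), so it is only an illustration of how linear and nonlinear performance could be compared by cross-validation.

```python
# Sketch: compare a linear classifier (LDA) with a nonlinear one
# (classification tree) by cross-validation. Assumes Python with
# scikit-learn; the two-moons data are a toy nonlinear example.
from sklearn.datasets import make_moons
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=300, noise=0.25, random_state=3)

for name, clf in [("LDA (linear)", LinearDiscriminantAnalysis()),
                  ("tree (nonlinear)", DecisionTreeClassifier(max_depth=4,
                                                              random_state=3))]:
    acc = cross_val_score(clf, X, y, cv=10).mean()
    print(f"{name:18s} cross-validated accuracy: {acc:.2f}")
```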

Classification trees are used to predict membership of cases or objects in the classes of a categorical dependent variable from their measurements on one or more predictor variables. Lewis et al. (334) used the concept of classification trees to design a decision tree for human P450 substrates. The intention was to predict which CYP isozyme will interact with which substrates, based on physicochemical parameters. The resulting classifiers are the volume, the... [Pg.497]

Gerberick, G.F., Vassallo, J.D., Foertsch, L.M., Price, B.B., Chaney, J.G. and Lepoittevin, J.P. (2007) Quantification of chemical peptide reactivity for screening contact allergens: a classification tree model approach. Toxicological Sciences, 97, 417-427. [Pg.467]

Huang, Y., Cai, J., Ji, L. and Li, Y. (2004) Classifying G-protein coupled receptors with bagging classification tree. Comput. Biol. Chem., 28, 275-280. [Pg.54]

Dudoit et al. also studied some more complex methods such as classification trees and aggregated classification trees. These methods did not appear to perform any better than diagonal linear discriminant analysis or nearest neighbor classification. Ben-Dor et al. (7) also compared several methods on several public datasets and found that nearest neighbor classification generally performed as well as or better than more complex methods. [Pg.331]

Gordon et al. (48) recently performed a pathway-based pharmacogenomic study on rectal cancer treated with chemoradiation in which they evaluated 21 polymorphisms in 18 genes involved in the critical pathways of cancer progression (drug metabolism, tumor microenvironment, cell cycle control, and DNA repair). They applied the CART analysis and found that a classification tree with four genes (IL-8, ICAM-1, TGF-β, and... [Pg.361]

Figure 6.26 (a) TON and (b) TOF classification tree structures for a data set of 412 Pd-catalyzed Heck reactions described by a total of 74 descriptors. The black and white bars represent positive and negative experiments, respectively. In the case of TON, the most relevant splitting... [Pg.264]

Figure 18.1 Classification tree for distinguishing between corrosive and non-corrosive chemicals on the basis of pH measurements. [Pg.404]
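Such a tree boils down to a small set of threshold rules on pH. The sketch below shows how one could be trained and printed; it assumes Python with scikit-learn, and the pH values, class labels, and resulting cut-offs are purely illustrative assumptions, not the thresholds of Figure 18.1.

```python
# Sketch: a tiny classification tree separating "corrosive" from
# "non-corrosive" chemicals from pH alone. The pH values, labels and
# learned cut-offs are illustrative assumptions, not data from Figure 18.1.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical training data: very low or very high pH -> corrosive (1)
ph = np.array([0.5, 1.0, 1.8, 3.0, 5.0, 7.0, 9.0,
               10.5, 12.0, 13.5]).reshape(-1, 1)
corrosive = np.array([1, 1, 1, 0, 0, 0, 0, 0, 1, 1])

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(ph, corrosive)

# Print the learned threshold rules on pH
print(export_text(tree, feature_names=["pH"]))
```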

