
Regression tree

Breiman, L., Friedman, J.H., Olshen, R.A. and Stone, C.J., 1984. Classification and Regression Trees. Wadsworth and Brooks. [Pg.301]

Breiman, L., et al., Classification and Regression Trees. Wadsworth, Belmont, CA, 1984. Clausing, D., Total Quality Development. ASME Press, New York, 1993. [Pg.154]

Bastl, W., and Felkel, L., Disturbance analysis systems. In Human Diagnosis of System Failures (Rasmussen and Rouse, eds.), NATO Symp., Denmark, Plenum, New York, 1980. Breiman, L., Friedman, J. H., Olshen, R. A., and Stone, C. J., Classification and Regression Trees. Wadsworth, Belmont, CA (1984). [Pg.268]

Deconinck, E., Hancock, T., Coomans, D., Massart, D. L., Vander Heyden, Y. Classification of drugs in absorption classes using the classification and regression trees (CART) methodology. [Pg.107]

Additionally, Breiman et al. [23] developed a methodology known as classification and regression trees (CART), in which the data set is split repeatedly and a binary tree is grown. The way the tree is built leads to the selection of split boundaries parallel to certain variable axes. With highly correlated data this is not necessarily the best solution, and non-linear methods or methods based on latent variables have been proposed to perform the splitting. A combination of PLS (as a feature reduction method; see Sections 33.2.8 and 33.3) and CART was described by... [Pg.227]
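As a minimal illustration, and not part of the cited source, the following sketch fits a CART-style regression tree on synthetic data using scikit-learn (an assumed tool choice) and prints the learned splits; each split thresholds a single variable, which is why the resulting partition boundaries are parallel to the variable axes.

# Minimal sketch (assumed tooling: scikit-learn; synthetic data for illustration only).
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 2))                          # two predictor variables
y = np.sin(X[:, 0]) + 0.1 * X[:, 1] + rng.normal(0, 0.1, 200)

# Each binary split thresholds one variable, so the induced partition
# boundaries are parallel to the variable axes.
tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=10).fit(X, y)
print(export_text(tree, feature_names=["x1", "x2"]))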

C.H. Yeh and C.H. Spiegelman, Partial least squares and classification and regression trees. Chemom. and Intell. Lab. Systems, 22 (1994) 17-23. [Pg.240]

PDE1, PDE5; SPECS (88 K); CART regression trees based on 2-point pharmacophores; 7 hits/19 tested [71]

Classification and regression trees: input partition; adaptive shape, piecewise constant; minimum output prediction error

A further point for the choice of an appropriate method is whether the final model needs to be interpreted. In this case, and especially in the case of many original regressor variables, interpretation is generally easier if only a few regressor variables are used in the final model. This can be achieved by variable selection (Section 4.5), Lasso regression (Section 4.8.2), or regression trees (Section 4.8.3.3). PCR (Section 4.6) or PLS (Section 4.7) can also lead to interpretable components if they summarize the information of thematically related x-variables. [Pg.203]
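To illustrate this point only, and not as something taken from the cited text, the sketch below (assuming scikit-learn and synthetic data) shows how Lasso regression drives most coefficients to exactly zero, so that only a few regressor variables remain in the final model to interpret.

# Minimal sketch (assumed tooling: scikit-learn; synthetic data).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))                              # ten candidate regressors
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(0, 0.5, 100)     # only two are relevant

model = Lasso(alpha=0.1).fit(X, y)
kept = [i for i, c in enumerate(model.coef_) if abs(c) > 1e-6]
print("regressor variables retained:", kept)                # typically [0, 3]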

In Section 4.8.3.3 we already mentioned regression trees, which are very similar to classification trees. The main difference is that the response y-variable now represents the class membership of the training data. The task is again to partition the...
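Purely as an illustrative sketch, not from the cited text, and assuming scikit-learn with synthetic data: the recursive partitioning step is the same in both cases, and only the response changes from a continuous y to a class membership.

# Minimal sketch (assumed tooling: scikit-learn; synthetic data).
import numpy as np
from sklearn.tree import DecisionTreeRegressor, DecisionTreeClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 3))
y_continuous = X[:, 0] ** 2 + rng.normal(0, 0.2, 150)       # regression tree response
y_class = (X[:, 0] > 0).astype(int)                         # classification tree response

reg_tree = DecisionTreeRegressor(max_depth=2).fit(X, y_continuous)
cls_tree = DecisionTreeClassifier(max_depth=2).fit(X, y_class)
print(reg_tree.predict(X[:3]), cls_tree.predict(X[:3]))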

Garzotto M, Beer TM, Hudson RG, et al. Improved detection of prostate cancer using classification and regression tree analysis. J Clin Oncol 2005;23:4322-4329. [Pg.369]

Canonical Correlation Analysis, Principal Component Regression, Classification and Regression Trees (CART), Linear Learning Machine, Neural Networks, Adaptive Least Squares, Genetic Programming, Logistic Regression [Pg.168]

Examples of mathematical methods include nominal range sensitivity analysis (Cullen & Frey, 1999) and differential sensitivity analysis (Hwang et al., 1997; Isukapalli et al., 2000). Examples of statistical sensitivity analysis methods include sample (Pearson) and rank (Spearman) correlation analysis (Edwards, 1976), sample and rank regression analysis (Iman & Conover, 1979), analysis of variance (Neter et al., 1996), classification and regression tree (Breiman et al., 1984), response surface method (Khuri & Cornell, 1987), Fourier amplitude sensitivity test (FAST) (Saltelli et al., 2000), mutual information index (Jelinek, 1970) and Sobol's indices (Sobol, 1993). Examples of graphical sensitivity analysis methods include scatter plots (Kleijnen & Helton, 1999) and conditional sensitivity analysis (Frey et al., 2003). Further discussion of these methods is provided in Frey & Patil (2002) and Frey et al. (2003, 2004). [Pg.59]

Breiman L, Friedman JH, Stone CJ, Olshen RA (1984) Classification and regression trees. Belmont, CA, Chapman & Hall/CRC Press. [Pg.85]


See other pages where Regression tree is mentioned: [Pg.720]    [Pg.85]    [Pg.96]    [Pg.444]    [Pg.5]    [Pg.41]    [Pg.456]    [Pg.462]    [Pg.484]    [Pg.156]    [Pg.184]    [Pg.184]    [Pg.185]    [Pg.185]    [Pg.203]    [Pg.232]    [Pg.261]    [Pg.432]    [Pg.277]    [Pg.334]    [Pg.213]    [Pg.361]    [Pg.402]    [Pg.413]    [Pg.5]    [Pg.41]   