Big Chemical Encyclopedia


Stepwise selection

In forward selection, the first variable entered into the model is the one with the largest correlation with the dependent variable. Once a variable has been selected, it is evaluated against an entry criterion; the most common criteria are Mallows' Cp and Akaike's information criterion (AIC). If the first variable meets the criterion for inclusion, forward selection continues, i.e. the statistics for the variables not yet in the equation are used to select the next one. The procedure stops when no remaining variable meets the entry criterion. [Pg.324]
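As a concrete illustration (not from the text), the entry loop described above can be sketched in Python with AIC as the entry criterion. The greedy first step automatically picks the variable with the largest absolute correlation with y, since that variable yields the single-variable model with the smallest residual sum of squares. Function names are invented for illustration.

```python
import numpy as np

def aic(y, X):
    """AIC of an OLS fit with intercept: n*log(RSS/n) + 2*(number of parameters)."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])   # design matrix with intercept
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = float(np.sum((y - Xd @ beta) ** 2))
    return n * np.log(rss / n) + 2 * Xd.shape[1]

def forward_select(X, y):
    """Enter, one at a time, the variable that most lowers AIC;
    stop when no remaining variable meets the entry criterion."""
    selected = []
    best = aic(y, X[:, []])                 # intercept-only model
    remaining = list(range(X.shape[1]))
    while remaining:
        score, j = min((aic(y, X[:, selected + [j]]), j) for j in remaining)
        if score >= best:                   # no candidate improves the criterion
            break
        best = score
        selected.append(j)
        remaining.remove(j)
    return selected
```

On synthetic data where y depends on only two of five candidates, the procedure enters the strongest correlate first and stops once the noise columns fail the criterion.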

Backward elimination starts with all the variables in the equation and then removes them sequentially. This approach cannot be applied in ill-posed settings, but it can be combined with forward selection (so-called stepwise selection). The difference between forward and stepwise selection is that in stepwise selection, after a variable has been entered, all previously entered variables are examined to check whether any of them should be removed according to the removal criterion. This testing of the least useful variable currently in the equation is carried out at every stage of the stepwise procedure. A variable that could have been the best entry candidate at an... [Pg.324]
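The stepwise variant can be sketched as follows (a hypothetical illustration, again using AIC for both entry and removal; p-value thresholds would work equally well): after every entry step, every variable already in the model is re-examined and dropped if its removal improves the criterion.

```python
import numpy as np

def aic(y, X):
    """AIC of an OLS fit with intercept."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = float(np.sum((y - Xd @ beta) ** 2))
    return n * np.log(rss / n) + 2 * Xd.shape[1]

def stepwise_select(X, y):
    """Forward selection with a backward check after every entry."""
    selected = []
    best = aic(y, X[:, []])
    improved = True
    while improved:
        improved = False
        # forward step: try to enter the best remaining candidate
        candidates = [j for j in range(X.shape[1]) if j not in selected]
        if candidates:
            score, j = min((aic(y, X[:, selected + [j]]), j) for j in candidates)
            if score < best:
                best, improved = score, True
                selected.append(j)
        # backward step: re-examine every variable already entered
        for j in list(selected):
            rest = [k for k in selected if k != j]
            score = aic(y, X[:, rest])
            if score < best:
                best, improved = score, True
                selected.remove(j)
    return selected
```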

According to the results of the study by Derksen and Keselman [13] on several automatic variable selection methods, in a typical case 20 to 74 percent of the selected variables are noise variables. The number of noise variables selected varies with the number of candidate predictors and with the degree of collinearity among the true predictors (owing to the well-known problem of variance inflation when variables are correlated, any model containing correlated variables is unstable). Screening out noise variables while retaining true predictors seems to be a possible solution to the chance-correlation problem in stepwise MLR. [Pg.325]
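The chance-correlation problem is easy to reproduce. In the invented example below, y is pure noise, independent of all fifty candidate predictors, yet the best-looking candidate still shows a substantial sample correlation — exactly the kind of variable a naive entry criterion would admit.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 30, 50                      # few samples, many pure-noise candidates
X = rng.normal(size=(n, p))
y = rng.normal(size=n)             # y is independent of every column of X

# largest absolute sample correlation between y and any candidate predictor
r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
max_abs_r = float(np.abs(r).max())
print(f"max |r| over {p} pure-noise predictors: {max_abs_r:.2f}")
```

With n = 30 and p = 50 the maximum typically lands around 0.4-0.5, even though the population correlation of every predictor is exactly zero.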


Most supervised pattern recognition procedures permit stepwise selection, i.e. selecting first the most important feature, then the second most important, and so on. One way to do this is by prediction using, e.g., cross-validation (see next section): we first select the variable that best classifies objects of known classification that are not part of the training set, then the variable that most improves the classification already obtained with the first selected variable, etc. The results for the linear discriminant analysis of the EU/HYPER classification of Section 33.2.1 are that with all 5 or with 4 variables a selectivity of 91.4% is obtained, and with 3 or 2 variables 88.6% [2], as a measure of classification success. Selectivity is used here in the sense of Chapter... [Pg.236]
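The cross-validation-driven selection just described might be sketched as follows (a minimal illustration with invented names; a nearest-class-mean classifier stands in for LDA to keep the example self-contained):

```python
import numpy as np

def cv_accuracy(X, y, k=5, seed=0):
    """k-fold cross-validated accuracy of a nearest-class-mean classifier."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    correct = 0
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        classes = np.unique(y[train])
        means = np.array([X[train][y[train] == c].mean(axis=0) for c in classes])
        # squared distance of each test object to each class mean
        dist = ((X[fold][:, None, :] - means[None]) ** 2).sum(axis=2)
        pred = classes[dist.argmin(axis=1)]
        correct += (pred == y[fold]).sum()
    return correct / len(y)

def forward_feature_select(X, y, n_keep):
    """Pick first the feature that best classifies held-out objects,
    then the feature that most improves that classification, etc."""
    selected = []
    for _ in range(n_keep):
        remaining = [j for j in range(X.shape[1]) if j not in selected]
        score, j = max((cv_accuracy(X[:, selected + [j]], y), j)
                       for j in remaining)
        selected.append(j)
    return selected
```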

The schematic outline of stepwise selective activation is shown in Scheme 1.9: a glycosyl donor, bearing a reactive LGa, is coupled with a glycosyl acceptor,... [Pg.38]

A one-pot technique, combining two or more glycosylation steps based on the activation of one donor over another, has also been developed [182,189-191]. This one-pot technique is essentially a variation of the simple stepwise selective activation strategy, which further improves the efficiency of the synthesis by avoiding the need to isolate (and purify) the intermediates. [Pg.39]

The above paragraph describes the forward option of the interval methods, where one starts with no variables selected and sequentially adds intervals of variables until the stop criterion is reached. Alternatively, one could operate the interval methods in reverse mode, where one starts with all available x variables and sequentially removes intervals of variables until the stop criterion is reached. Being stepwise selection methods, the interval methods have the potential to select local rather than global optima, and they require careful selection of the interval size (number of variables per interval), based on prior knowledge of the spectroscopy, to balance computation time against performance improvement. However, these methods are rather straightforward, relatively simple to implement, and efficient. [Pg.423]
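A hedged sketch of the forward interval mode (names invented; real interval methods such as iPLS use PLS models rather than the plain OLS/leave-one-out criterion used here):

```python
import numpy as np

def press(X, y):
    """Leave-one-out prediction error (PRESS) of an OLS fit, via the hat matrix."""
    Xd = np.column_stack([np.ones(len(y)), X])
    H = Xd @ np.linalg.pinv(Xd)
    resid = y - H @ y
    return float(np.sum((resid / (1.0 - np.diag(H))) ** 2))

def forward_interval_select(X, y, width):
    """Greedily add whole intervals of adjacent variables while the
    leave-one-out error keeps improving (the stop criterion)."""
    p = X.shape[1]
    intervals = [list(range(i, min(i + width, p))) for i in range(0, p, width)]
    chosen, cols = [], []
    best = press(X[:, []], y)               # intercept-only baseline
    while len(chosen) < len(intervals):
        score, k = min((press(X[:, cols + iv], y), k)
                       for k, iv in enumerate(intervals) if k not in chosen)
        if score >= best:                   # stop criterion reached
            break
        best = score
        chosen.append(k)
        cols += intervals[k]
    return [intervals[k] for k in chosen]
```

The reverse mode would start from all intervals and remove them one at a time under the same criterion.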

In the same way that linear discriminant analysis is the most-used classification method, stepwise selection by LDA (SLDA) is the selection method with the greatest number of applications in food chemistry. [Pg.134]

This conversion of readily available 2-furyl alcohols into unsaturated pyranosuloses proved a very effective route to racemic monosaccharides, through stepwise, selective functionalization of the enone grouping in 325. The shortest synthesis of a natural compound following this scheme involves palladium-catalyzed hydrogenation of the aldosulose (325, R = Me) obtained from 1-(2-furyl)ethanol, resulting [209] in cinerulose A, the sugar component of the antibiotic cinerubin. [Pg.65]

Ferrero L, Cameron B, Crouzet J: Analysis of gyrA and grlA mutations in stepwise-selected ciprofloxacin-resistant mutants of Staphylococcus aureus. Antimicrob Agents Chemother 1995;39:1554. [PMID: 7492103] [Pg.1041]

Stepwise Selective Amine and Amide Alkylation (Fig. 14) [44]: A first alkylation step is performed by suspending (78) in a 2 M solution of a suitable alkyl halide in DMF at 50° for 24-48 h. After thorough washing with DMF (3×), CH2Cl2 (3×), and THF (3×), intermediate (79) (usually formed with >85% purity) is subjected to the final alkylation. The reaction flask is sealed with a fresh rubber septum and flushed with nitrogen, followed by cooling to 0°. In a separate flame-dried 25-ml round-bottom flask, 12 equiv. (with respect to 79) of 5-phenylmethyl-2-oxazolidinone is added. To the reaction flask freshly distilled THF is added (the appropriate volume to provide a 0.2 M solution of the 5-phenylmethyl-2-oxazolidinone). The resulting clear solution is then cooled to −78° and 1.6 M n-butyl... [Pg.467]

Lin (1993) suggested using stepwise variable selection, and Wu (1993) suggested forward selection or all (estimable) subsets selection. Lin (1993) gave an illustrative analysis by stepwise selection of the data in Table 6. He found that this identified factors 15, 12, 19, 4, and 10 as the active factors, when their main effects are entered into the model in this order. Wang (1995) analyzed the other half of the Williams experiment and identified only one of the five factors that Lin had identified as nonnegligible, namely factor 4. [Pg.181]

A stepwise selection procedure is performed to search for QSPR/QSAR models after the preliminary exclusion of constant and near-constant variables. Pair-correlation cutoff selection of variables is then performed to avoid highly correlated descriptor variables within the model. [Pg.75]

Therefore, the selection of the best subset model can be performed by forward stepwise selection, starting from the variable with the lowest p-value (the current model); next, each of the variables not yet included in the current model is added to it in turn, producing a set of candidate models with corresponding p-values. The candidate model with the lowest p-value is selected, and the process is repeated on the new current model. [Pg.472]
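A sketch of this p-value-driven loop (invented names; to stay free of a statistics library, the p-value comparison is replaced by the equivalent comparison of partial F statistics — the candidate with the smallest p-value is the one with the largest partial F — and the threshold F-to-enter = 4 is a common textbook default, an assumption of this sketch):

```python
import numpy as np

def rss(y, X):
    """Residual sum of squares of an OLS fit with intercept."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return float(np.sum((y - Xd @ beta) ** 2))

def forward_by_partial_F(X, y, f_enter=4.0):
    """Enter, at each step, the candidate with the largest partial F
    (i.e. the smallest p-value); stop when none exceeds f_enter."""
    n, p = X.shape
    selected = []
    while len(selected) < p:
        rss_cur = rss(y, X[:, selected])
        best_f, best_j = -np.inf, None
        for j in range(p):
            if j in selected:
                continue
            rss_new = rss(y, X[:, selected + [j]])
            df = n - len(selected) - 2        # residual df of the larger model
            F = (rss_cur - rss_new) / (rss_new / df)
            if F > best_f:
                best_f, best_j = F, j
        if best_f < f_enter:                  # no candidate passes F-to-enter
            break
        selected.append(best_j)
    return selected
```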

Chapter 10 presents forward and stepwise selection of x_i variables, as well as backward elimination, in terms of statistical software. [Pg.512]



© 2024 chempedia.info