
Feature reduction

Additionally, Breiman et al. [23] developed a methodology known as classification and regression trees (CART), in which the data set is split repeatedly and a binary tree is grown. The way the tree is built leads to the selection of boundaries parallel to certain variable axes. With highly correlated data this is not necessarily the best solution, and non-linear methods or methods based on latent variables have been proposed to perform the splitting. A combination of PLS (as a feature reduction method — see Sections 33.2.8 and 33.3) and CART was described by... [Pg.227]
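
A minimal sketch of this PLS-plus-CART idea, assuming scikit-learn and synthetic data (an illustration of the combination, not the exact algorithm of the cited work): the correlated variables are first compressed into a few PLS latent variables, and the CART tree is then grown on those scores rather than on the raw variable axes.

```python
# Sketch: PLS feature reduction followed by a CART tree (illustrative pairing,
# not the exact PLS-CART algorithm referenced in the text).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 3))                     # hidden structure
X = latent @ rng.normal(size=(3, 50)) \
    + 0.1 * rng.normal(size=(100, 50))                 # 50 highly correlated variables
y = (latent[:, 0] > 0).astype(int)                     # synthetic binary classes

# Step 1: compress the correlated variables into a few PLS latent variables.
pls = PLSRegression(n_components=3)
scores = pls.fit_transform(X, y)[0]                    # X-scores, shape (100, 3)

# Step 2: grow a CART tree on the scores, so its splits are no longer
# forced to be parallel to the original variable axes.
tree = DecisionTreeClassifier(max_depth=3).fit(scores, y)
print(tree.score(scores, y))
```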

One can — and sometimes must — reduce the number of features. One way is to combine the original variables into a smaller number of latent variables such as principal components or PLS functions. This is called feature reduction. [Pg.236]
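
As a concrete illustration of this definition, the following sketch (assuming scikit-learn; the sizes are arbitrary) combines 100 original variables into 5 principal components:

```python
# Sketch: feature reduction by combining the original variables into
# a small number of latent variables (here, principal components).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 100))       # 60 objects described by 100 variables

pca = PCA(n_components=5)
T = pca.fit_transform(X)             # scores: 60 objects x 5 latent variables
print(T.shape, pca.explained_variance_ratio_.sum())
```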

W. Wu, B. Walczak, W. Penninckx and D.L. Massart, Feature reduction by Fourier transform in pattern recognition of NIR data. Anal. Chim. Acta, 331 (1996) 75-83. [Pg.573]
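
The idea behind this approach can be sketched as follows, assuming NumPy and synthetic spectra (a simplified illustration, not the authors' exact procedure): each spectrum is Fourier-transformed and only the first few low-frequency coefficients, which carry most of the smooth spectral information, are kept as features.

```python
# Sketch: Fourier-transform feature reduction of spectra (simplified
# illustration of the idea, not the cited paper's exact procedure).
import numpy as np

def fourier_features(spectra, n_coeff=20):
    """Keep the first n_coeff low-frequency Fourier coefficients per spectrum."""
    coeffs = np.fft.rfft(spectra, axis=1)[:, :n_coeff]
    # Split into real and imaginary parts to obtain real-valued features.
    return np.hstack([coeffs.real, coeffs.imag])

spectra = np.random.default_rng(2).normal(size=(30, 700))  # 30 spectra, 700 wavelengths
features = fourier_features(spectra)
print(features.shape)   # (30, 40): 700 variables reduced to 40
```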

PCA is a useful tool not only for visualizing information; it can also be employed as a strategy for feature reduction and noise reduction, as indicated in Section VI.C. [Pg.82]
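
A minimal sketch of the noise-reduction use of PCA, assuming scikit-learn and synthetic low-rank data: the data are reconstructed from a few components, so the variance left in the discarded components (mostly noise) is filtered out.

```python
# Sketch: PCA as a noise-reduction tool -- reconstruct the data from a few
# components so the variance in the discarded components is filtered out.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
signal = np.outer(rng.normal(size=80), rng.normal(size=40))  # rank-1 structure
X = signal + 0.1 * rng.normal(size=(80, 40))                 # structure + noise

pca = PCA(n_components=2).fit(X)
X_denoised = pca.inverse_transform(pca.transform(X))
# The reconstruction is closer to the noise-free signal than the raw data:
print(np.abs(X_denoised - signal).mean() < np.abs(X - signal).mean())
```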

The results of the feature reduction process and classification are extensively tabulated... [Pg.84]

In all examples discussed up to now the radical cation of C60 is involved in the reaction mechanism. However, owing to their electronic features, reduction of the fullerenes to the radical anions should be much easier to perform. For example, a useful method to synthesize 1-substituted 1,2-dihydro[60]fullerenes is the irradiation of C60 with ketene silyl acetals (KAs), first reported by Nakamura et al. [216]. Interestingly, when unstrained KAs are used, this reaction did not yield the expected [2+2]-cycloaddition product by either the thermal pathway, as observed with highly strained ketene silyl acetals [217], or the photochemical pathway. In a typical reaction, C60 was irradiated for 10 h at 5°C with a high-pressure mercury lamp (Pyrex filter) in a degassed toluene solution with an excess amount of the KA in the presence of water (Scheme 11). Some examples of the addition of KAs are summarized in Table 11. [Pg.685]

Several statistics for multivariate tests are known from the literature [AHRENS and LAUTER, 1981; FAHRMEIR and HAMERLE, 1984]; the user of statistical packages may find several of them implemented and must rely on their performing correctly. Other tests for the separation of groups are used to determine the most discriminating results in discriminant analysis with feature reduction. [Pg.184]

FIGURE 6.27 Mechanism featuring reduction of an excited state (or, more likely, an intermediate derived from it) by X- in parallel with the self-decay of the intermediate. Both of these reactions must take place prior to the observed X2-/X- reaction. Kinetic trace for the decay of Br2- at 360 nm following a 266 nm laser flash of a solution containing 3.8 mM (H2O)5CrONO2 2+ and 20 mM Br- in 0.16 M HClO4. [Pg.257]

Scheme 7.5 Organocatalytic cascade featuring reductive amination with TRIP.
Similar to PCA, LDA is a feature reduction method. For this purpose, a 1-dimensional space, that is, a line, on which the objects will be projected from... [Pg.304]
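
A minimal sketch of this use of LDA, assuming scikit-learn and two synthetic classes (with G = 2 classes there is at most one discriminant direction, i.e. a line):

```python
# Sketch: LDA as a feature-reduction method -- with two classes the objects
# are projected onto a single discriminant direction (a line).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, size=(50, 10)),
               rng.normal(1, 1, size=(50, 10))])   # two 10-dimensional classes
y = np.repeat([0, 1], 50)

lda = LinearDiscriminantAnalysis(n_components=1)    # at most G - 1 = 1 component
z = lda.fit_transform(X, y)                         # objects projected onto a line
print(z.shape)                                      # (100, 1)
```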

We have already encountered some reduction methods en route to indole ring construction. Several classical indole name reactions feature reduction as the crucial step in indolization, and these are covered now. [Pg.323]

The mechanism of Cp2TiCl2-catalysed (Cp = cyclopentadienyl) reductive cross-coupling of enones with CH2=CHCN to form 1,6-difunctionalized ketonitriles (Scheme 18) features reduction of a Ti(II) complex by zinc to give the active dimeric Ti(III) catalyst (shown as [Cp2Ti(III)Cl]). ... [Pg.162]

Wang J, Chiang WC, Hsu YL, and Yang YTC. ECG arrhythmia classification using a probabilistic neural network with a feature reduction method. Neurocomputing 2013 116 38-45. [Pg.200]

Neural Modeling, Seventh International Conference, IWANN 2003, Maó, Spain, June 3-6, 2003, Proceedings, Vol. 2687, J. Mira and J. R. Álvarez, Eds., Springer-Verlag, New York, 2003, pp. 798-805. Feature Reduction Using Support Vector Machines for Binary Gas... [Pg.329]

Adequate data pre-processing. The accuracy of the learned function depends strongly on how the input patterns are represented. Data pre-processing aims to increase the robustness of classification, e.g. by normalisation, and also includes the steps of feature selection and feature reduction, which reduce the dimensionality of the classification problem.
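
A minimal sketch of such a pre-processing chain, assuming scikit-learn (the particular normalisation, selection, and reduction steps are illustrative choices, not prescribed by the text):

```python
# Sketch: pre-processing pipeline -- normalisation, then feature selection
# and feature reduction, then the classifier that learns the function.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

pipe = Pipeline([
    ("scale",  StandardScaler()),                   # normalisation
    ("select", SelectKBest(f_classif, k=50)),       # feature selection
    ("reduce", PCA(n_components=10)),               # feature reduction
    ("clf",    LogisticRegression(max_iter=1000)),  # the learned function
])

rng = np.random.default_rng(5)
X, y = rng.normal(size=(200, 300)), rng.integers(0, 2, size=200)
pipe.fit(X, y)
print(pipe.score(X, y))
```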

Pyle BH, Broadaway SC, McFeters GA (1992) Efficacy of copper and silver ions with iodine in the inactivation of Pseudomonas cepacia. J Appl Bacteriol 72:71-79
Raychaudhuri S, Sutphin PD, Chang JT, Altman RB (2001) Basic microarray analysis: grouping and feature reduction. Trends Biotechnol 19:189-193
Reese ET (1957) Biological degradation of cellulose derivatives. Ind Engin Chem 49:89-93... [Pg.338]

These techniques are known under the general name of feature reduction. Besides enabling easier storage and management of the data, feature-reduction procedures can be crucial for the implementation of optimum inversion algorithms. [Pg.1158]

Many methods have been developed to tackle the high dimensionality of hyperspectral data (Serpico and Bruzzone 1994). In summary, feature-reduction methods can be divided into two classes: feature-selection algorithms, which suitably select a suboptimal subset of the original set of features while discarding the remaining ones, and feature extraction by data transformation, which projects the original data space onto a lower-dimensional feature subspace that preserves most of the information, such as nonlinear principal component analysis (NLPCA; Licciardi and Del Frate 2011). [Pg.1158]
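
The two classes of methods can be contrasted in a short sketch, assuming scikit-learn and synthetic data (ordinary linear PCA stands in here for the nonlinear NLPCA mentioned in the text):

```python
# Sketch contrasting the two classes of feature-reduction methods
# (linear PCA substitutes for the nonlinear NLPCA cited in the text).
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 200))            # e.g. 200 hyperspectral bands
y = rng.integers(0, 3, size=100)

# Class 1: feature selection -- keep a subset of the original bands.
X_sel = SelectKBest(f_classif, k=20).fit_transform(X, y)

# Class 2: feature extraction -- project onto a lower-dimensional subspace.
X_ext = PCA(n_components=20).fit_transform(X)

print(X_sel.shape, X_ext.shape)            # both (100, 20)
```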

As the rank of the between-groups scatter matrix can be at most G - 1, as is evident from Equation (20), this represents the maximum number of canonical variates which can be computed, consistently with what was already discussed in the case of two classes, where only a single latent variable can be extracted. It must be stressed here that, whatever the number of categories involved, LDA requires inversion of the pooled (within-groups) covariance matrix S. In order for this matrix to be invertible, the total number of training samples should be at least equal to the number of variables; otherwise its determinant is zero and no inverse exists. Some authors indicate an even larger ratio of the number of samples to the number of variables (at least 3) to obtain a meaningful solution. Therefore, these conditions pose a strict limitation on the kinds of problems where LDA can be applied, or suggest the need for some form of variable selection/feature reduction prior to the classification analysis. [Pg.198]
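
The two conditions discussed above can be checked numerically. The following sketch, assuming NumPy and the standard LDA definitions (the helper name is illustrative), reports the maximum number of canonical variates (G - 1), whether the pooled within-groups covariance matrix is invertible, and whether the stricter samples-to-variables ratio of at least 3 is met:

```python
# Sketch: numerical check of the LDA feasibility conditions discussed above.
import numpy as np

def lda_feasibility(X, y, ratio=3):
    n, p = X.shape
    groups = np.unique(y)
    G = len(groups)
    max_canonical_variates = G - 1              # rank of between-groups matrix
    # Pooled (within-groups) covariance matrix S.
    S = sum(np.cov(X[y == g].T) * (np.sum(y == g) - 1) for g in groups) / (n - G)
    invertible = np.linalg.matrix_rank(S) == p  # needs enough samples per variable
    comfortable = n >= ratio * p                # the stricter n/p >= 3 guideline
    return max_canonical_variates, invertible, comfortable

rng = np.random.default_rng(7)
X = rng.normal(size=(120, 10))                  # 120 samples, 10 variables
y = rng.integers(0, 3, size=120)                # G = 3 classes
print(lda_feasibility(X, y))                    # (2, True, True)
```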

The synthesis of yohimbane (120) and alloyohimbane (82) was improved by using a modified sequence which featured reductive photocyclization as the key step (Scheme 3.58) (63,69). Condensation of harmalane (336) with benzoyl chloride afforded enamide 348, which upon irradiation in the presence of sodium borohydride afforded a mixture of functionalized yohimbanes 349, 350, and 351. The overall yields and product distribution were found to depend on the solvent employed, with the maximum yield of 350 (98%) being obtained in a mixture of MeCN and MeOH. Hydrogenation of 349... [Pg.267]



Discriminant feature reduction

Feature selection and reduction

Feature space reduction methods

Multiple feature reduction
