Big Chemical Encyclopedia


Support-vector machines

The notion of support vector machines (SVMs) was conceived by V. Vapnik in the mid-1990s [51, 313, 314]. Initially, SVMs were designed to solve binary classification problems. [Pg.234]

If the two classes are not linearly separable, two additional strategies within the SVM can be used. First, data points on the wrong side of the margin are allowed, but their influence is minimized. Second, by means of a basis expansion (see Subsection 6.1.3), data points can be mapped into a space of higher dimension, where they may be separated more easily. [Pg.235]
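The second strategy can be illustrated with a minimal sketch (not from the source): the hypothetical mapping x → (x, x²) turns a one-dimensional problem that no single threshold can solve into a linearly separable two-dimensional one. The data and the threshold below are invented for illustration.

```python
# Sketch of mapping data into a higher dimension to make it linearly separable.
# Class +1 sits at the extremes, class -1 in the middle: no single threshold
# on x separates them, but after x -> (x, x^2) the line x2 = 1 does.

def phi(x):
    """Map a 1-D point into a 2-D feature space."""
    return (x, x * x)

points = [(-2.0, +1), (-0.5, -1), (0.5, -1), (2.0, +1)]

def classify(x, threshold=1.0):
    # linear decision in feature space: compare the second coordinate x^2
    _, x2 = phi(x)
    return +1 if x2 > threshold else -1

predictions = [classify(x) for x, _ in points]
print(predictions)  # [1, -1, -1, 1], matching the true labels
```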

These strategies can be formulated and solved as a quadratic optimization problem with linear inequalities as additional constraints (see e.g. [117]). In the present work we use the implementation LIBSVM [43], which is also available in the R package e1071 [207]. For a binary classification, the predicting function is of the form [Pg.235]

Linear kernel
Polynomial kernel of degree d
Radial basis function (RBF) kernel
Sigmoid kernel  [Pg.236]
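The four kernel families listed above can be sketched as plain functions of two input vectors. The parameter names (gamma, coef0, d) follow common SVM conventions and are assumptions, not taken from the source.

```python
import math

def dot(x, z):
    return sum(xi * zi for xi, zi in zip(x, z))

def linear_kernel(x, z):
    return dot(x, z)

def polynomial_kernel(x, z, d=3, gamma=1.0, coef0=1.0):
    return (gamma * dot(x, z) + coef0) ** d

def rbf_kernel(x, z, gamma=0.5):
    # Gaussian radial basis function of the squared Euclidean distance
    sq_dist = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
    return math.exp(-gamma * sq_dist)

def sigmoid_kernel(x, z, gamma=0.5, coef0=0.0):
    return math.tanh(gamma * dot(x, z) + coef0)

x, z = [1.0, 2.0], [2.0, 0.0]
print(linear_kernel(x, z))   # 2.0
print(rbf_kernel(x, x))      # 1.0 (identical points)
```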

In the recent past, SVMs have been increasingly used to solve problems in computational chemistry. In a comparison of SVMs and ANNs for the classification of pharmaceutically active and inactive compounds, SVMs consistently yielded smaller classification errors [41]. For the classification of mass spectra (see Subsection 8.5.2), SVMs with a radial kernel proved to be the best predicting functions. [Pg.236]

A polynomial function is applied to return the inner products of the descriptor vectors, and the separating hyperplane is defined as [Pg.17]

The most recent advance in machine-learning modeling to garner widespread application by fields outside of artificial intelligence itself is the support vector machine (SVM). SVMs were first developed by Vapnik in 1992.  [Pg.368]

SVMs are an outgrowth of kernel methods. In such methods, the data are transformed with a kernel equation (such as a radial basis function), and it is in this mathematical space that the model is built. Care is taken in the construction of the kernel that it has a sufficiently high dimensionality that the data become linearly separable within it. A critical subset of the transformed data points, the "support vectors", are then used to specify a hyperplane called a large-margin discriminator that effectively serves as a linear model within this non-linear space. An introductory exploration of SVMs is provided by Cristianini and Shawe-Taylor, and a thorough examination of their mathematical basis is presented by Scholkopf and Smola.  [Pg.368]

SVMs have been shown to have performance superior to ANNs in non-linear modeling studies such as optical character recognition. In our hands, initial applications of SVMs to the prediction of ADMET properties from molecular structure indicate that their overall error performance is similar to that of ANNs, but that the predicted values have a smoother distribution and a slightly less-compressed range. [Pg.369]

To learn nonlinear relations with a linear machine, a set of nonlinear features is selected and the data are rewritten in a new representation. This is achieved by applying a fixed nonlinear mapping of the data to a feature space where the linear machine can be used. The set of hypotheses [Pg.66]

Linear learning machines can be expressed in a dual representation, which allows the hypotheses to be written as a linear combination of the training points (xi), so that the decision rule can be evaluated using only inner products between the test point (x) and the training points  [Pg.67]
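The dual decision rule described above can be sketched in a few lines: the classifier is a weighted sum of inner products between the test point x and the training points xi. The multipliers alpha and offset b below are invented values that a trained SVM would normally supply; they are not computed here.

```python
# Sketch of the dual-form decision function f(x) = sum_i alpha_i * y_i * <x_i, x> + b.
# Only inner products with the training points are needed.

def dot(x, z):
    return sum(a * b for a, b in zip(x, z))

def decision(x, train_x, train_y, alpha, b):
    return sum(a * y * dot(xi, x)
               for a, y, xi in zip(alpha, train_y, train_x)) + b

train_x = [[0.0, 1.0], [1.0, 0.0]]
train_y = [+1, -1]
alpha   = [1.0, 1.0]   # assumed multipliers, for illustration only
b       = 0.0

print(decision([0.0, 2.0], train_x, train_y, alpha, b))  # 2.0 -> class +1
```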

Kernels are used to compute the inner product φ(xi)·φ(x) directly in the feature space as a function of the input points, merging the two steps of the nonlinear learning machine. [Pg.68]
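This "kernel trick" can be verified directly for a small case. For two-dimensional inputs, the homogeneous polynomial kernel of degree 2, K(x, z) = (x·z)², equals the inner product of the explicit feature map φ(x) = (x1², x2², √2·x1·x2), so the feature space never has to be constructed:

```python
import math

def dot(x, z):
    return sum(a * b for a, b in zip(x, z))

def kernel(x, z):
    # degree-2 homogeneous polynomial kernel, evaluated in input space
    return dot(x, z) ** 2

def phi(x):
    # the explicit degree-2 feature map for 2-D inputs
    x1, x2 = x
    return (x1 * x1, x2 * x2, math.sqrt(2) * x1 * x2)

x, z = (1.0, 2.0), (3.0, 1.0)
print(kernel(x, z))          # 25.0
print(dot(phi(x), phi(z)))   # same value, computed explicitly in feature space
```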

In both the dual solution and the decision function, only the inner products in the attribute space and the kernel function based on the attributes appear, but not the elements of the very high-dimensional feature space. The constraints in the dual solution imply that only the data points closest to the hyperplane, the so-called support vectors (SVs), are involved in the expressions for the weights w. Data points that are not SVs have no influence, and slight variations in them (for example, caused by noise) will not affect the solution; this provides quantitative leverage against noise in the data that may prevent linear separation in feature space [42]. Imposing the requirement that the kernel satisfies Mercer's conditions (the kernel matrix K(xi, xj) must be positive semi-definite) [Pg.68]

Consequently, the optimization in Eq. 3.79 is convex and has a unique solution that can be found efficiently, ruling out the problem of local minima encountered in training neural networks [42]. [Pg.68]


N. Cristianini, J. Shawe-Taylor, Support Vector Machines. Cambridge University Press, Cambridge, UK, 2000. [Pg.224]

Jorissen RN, Gilson MK. Virtual screening of molecular databases using a Support Vector Machine. J Chem Inf Model 2005 45 549-61. [Pg.208]

Zernov VV, Balakin KV, Ivaschenko AA, Savchuk NP, Pletnev IV. Drug discovery using support vector machines. The case studies of drug-likeness, agrochemical-likeness, and enzyme inhibition predictions. J Chem Inf Comput Sci 2003 43(6) 2048-56. [Pg.318]

Cristianini N, Shawe-Taylor J. An introduction to support vector machines and other kernel-based learning methods. Cambridge, UK Cambridge University Press, 2000. [Pg.349]

PM-CSVM positive majority consensus support vector machines... [Pg.86]

Ivanciuc, O. Applications of support vector machines in chemistry. In Reviews in Computational Chemistry, Lipkowitz,... [Pg.108]

A variety of other QSAR-type models for the prediction of plasma protein binding have also been published recently, including neural networks/support vector machines [64], 4-D fingerprints [65], and TOPS-MODE descriptors [66]. [Pg.461]

Ovidiu Ivanciuc, Applications of Support Vector Machines in Chemistry. [Pg.450]

Tobita, M., Nishikawa, T. and Nagashima, R. (2005) A discriminant model constructed by the support vector machine method for HERG potassium channel inhibitors. Bioorganic & Medicinal Chemistry Letters, 15, 2886-2890. [Pg.125]

Cristianini, N., Shawe-Taylor, J. An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge University Press, Cambridge, UK, 2000. Crawford, L. R., Morrison, J. D. Anal. Chem. 40, 1968, 1469-1474. Computer methods in analytical mass spectrometry. Empirical identification of molecular class. [Pg.261]

Meyer, D., Leisch, F., Hornik, K. Neurocomputing 55, 2003, 169-186. The support vector machine under test. [Pg.262]

Steinwart, I., Christmann, A. Support Vector Machines. Springer, New York, 2008. [Pg.263]

Thissen, U., Pepers, M., Ustun, B., Melssen, W. J., Buydens, L. C. M. Chemom. Intell. Lab. Syst. 73, 2004, 169-179. Comparing support vector machines to PLS for spectral regression applications. [Pg.263]

Xu, Y., Zomer, S., Brereton, R. G. Crit. Rev. Anal. Chem. 34, 2006, 177-188. Support vector machines: a recent method for classification in chemometrics. [Pg.263]

Self-organizing map Singular value decomposition Support vector machine... [Pg.309]

Huanxiang L, Xiaojun Y, Ruisheng Zh, Mancang L, Zhide H, Botao F (2005) Accurate quantitative structure-property relationship model to predict the solubility of C60 in various solvents based on a novel approach using a least-squares support vector machine. J. Phys. Chem. B 109:20565-20571. [Pg.349]

Support Vector Machines (SVMs) generate either linear or nonlinear classifiers, depending on the so-called kernel [149]. The kernel performs a transformation of the data into an arbitrarily high-dimensional feature space, where a linear classifier corresponds to a nonlinear classifier in the original space in which the input data live. SVMs are a fairly recent machine-learning method that has received a lot of attention because of its superiority on a number of hard problems [150]. [Pg.75]









Active learning support vector machines

Applications of Support Vector Machines in Chemistry

Least squares support vector machine

Library for Support Vector Machines (LibSVM)

Linear classification support vector machine classifiers

Linear support vector machines

Nonlinear classification, support vector machine classifiers

Nonlinear support vector machines

One-class support vector machines

Pattern Classification with Linear Support Vector Machines

Statistical learning support vector machine

Supervised learning support vector machines

Supervised support vector machines

Support Vector Machine Data Processing Method for Problems of Small Sample Size

Support Vector Machines for Classification

Support vector machine algorithm

Support vector machine modeling

Support vector machines (SVMs)

Support vector machines based

Support vector machines based applications

Support vector machines linear classifiers

Support vector machines nonlinear classifiers

Support vector machines relationships

Support vector machines, classification and

Support vectors

Supported vector machine


© 2024 chempedia.info