Big Chemical Encyclopedia


Support vector regression

In this section we first introduce the definition of the ε-insensitive loss function, then show that the same quadratic optimization technique that was used in Section 2.3 for constructing approximations to indicator functions also provides an approximation to real-valued functions, covering both the linear and the nonlinear case. [Pg.44]

In support vector regression [132], our goal is to find a function f(x) that has at most ε deviation from the actually obtained targets for all the training data, and at the same time is as flat as possible. In other words, we do not care about errors as long as they are less than ε, but will not accept any deviation larger than this. [Pg.44]

Definition 2.3 The (linear) ε-insensitive loss function L(x, y, f) is defined by L(x, y, f) = max(0, |y − f(x)| − ε). [Pg.44]
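As a concrete illustration, the standard ε-insensitive loss L(x, y, f) = max(0, |y − f(x)| − ε) can be sketched in a few lines of Python; the NumPy implementation and the sample values below are our own illustrative choices, not from the text:

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Per-sample epsilon-insensitive loss: deviations inside the
    eps-tube cost nothing; larger deviations are penalized linearly."""
    return np.maximum(0.0, np.abs(y_true - y_pred) - eps)

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.05, 2.5, 3.0])
losses = eps_insensitive_loss(y_true, y_pred, eps=0.1)
# deviations 0.05 and 0.0 fall inside the tube (zero loss);
# the deviation 0.5 exceeds eps by 0.4
```

This makes the "we do not care about errors smaller than ε" behavior explicit: the first and third samples incur no loss at all.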

We begin by describing the case of linear functions f, taking the form f(x) = ⟨w, x⟩ + b, with w ∈ X and b ∈ ℝ (2.56). [Pg.45]

Flatness in the case of (2.56) means that one seeks a small w [4]. One way to ensure this is to minimize the norm, i.e. ‖w‖² = ⟨w, w⟩. We can write this problem as a convex optimization problem. [Pg.45]
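A minimal sketch of this flatness-versus-error trade-off, assuming scikit-learn's LinearSVR (which solves an equivalent convex formulation: a small ‖w‖ plus C times the ε-insensitive training error) and synthetic data of our own choosing:

```python
import numpy as np
from sklearn.svm import LinearSVR

# Synthetic linear data (illustrative): y = 2x + 0.5 plus small noise
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = 2.0 * X[:, 0] + 0.5 + rng.normal(0.0, 0.05, size=200)

# LinearSVR trades flatness (small ||w||) against deviations beyond eps
model = LinearSVR(epsilon=0.1, C=10.0, max_iter=10_000)
model.fit(X, y)
w, b = model.coef_[0], model.intercept_[0]
# w and b should land close to the generating slope 2.0 and intercept 0.5
```

Because the noise stays mostly inside the ε = 0.1 tube, the fitted w is essentially the flattest line that still tracks the data.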


Support Vector Machine (SVM) is a classification and regression method developed by Vapnik [30]. In support vector regression (SVR), the input variables are first mapped into a higher-dimensional feature space by the use of a kernel function, and then a linear model is constructed in this feature space. The kernel functions often used in SVM include the linear, polynomial, radial basis function (RBF), and sigmoid functions. The generalization performance of SVM depends on the selection of several internal parameters of the algorithm (C and ε), the type of kernel, and the parameters of the kernel [31]. [Pg.325]
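The effect of the kernel choice described above can be sketched with scikit-learn's SVR; the synthetic sine-shaped target and the fixed C, ε, and gamma values are our own illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic nonlinear target (illustrative): y = sin(2x) + noise
rng = np.random.default_rng(1)
X = rng.uniform(-3.0, 3.0, size=(300, 1))
y = np.sin(2.0 * X[:, 0]) + rng.normal(0.0, 0.1, size=300)

# Same C and epsilon for each kernel; gamma is used by poly/rbf/sigmoid
scores = {}
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    model = SVR(kernel=kernel, C=10.0, epsilon=0.05, gamma=1.0)
    scores[kernel] = model.fit(X, y).score(X, y)  # R^2 on training data
# The RBF kernel tracks the sine target far better than the linear
# kernel, illustrating why kernel selection matters for generalization
```

On this data the linear kernel cannot follow the oscillation at all, while the RBF kernel fits it closely, which is exactly the sensitivity to kernel choice the passage points out.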

EPSVR Support Vector Regression and Meta server consensus http://sysbio.unl.edu/EPSVR/ Liang et al. (40) ... [Pg.134]

Liang S, Zheng D, Standley DM et al (2010) EPSVR and EPMeta: prediction of antigenic epitopes using support vector regression and multiple server results. BMC Bioinformatics 11:381 [Pg.137]

M. Song, C. M. Breneman, J. Bi, N. Sukumar, K. P. Bennett, S. Cramer and N. Tugcu, Prediction of protein retention times in anion-exchange chromatography systems using support vector regression, J. Chem. Inf. Comput. Sci., 2002, 42, 1347-1357. [Pg.324]

EM Jordaan and GF Smits. Estimation of the regularization parameter for support vector regression. In Proc. World Conf. Computational Intelligence, pages 2785-2791, Honolulu, Hawaii, 2002. [Pg.286]

Ivanciuc, O. (2005) Support vector regression quantitative structure-activity relationships (QSAR) for benzodiazepine receptor ligands. Internet Electron. J. Mol. Des., 4, 181-193. [Pg.1075]

Yang, S., Lu, W., Chen, N. and Hu, Q.-N. (2005) Support vector regression based QSPR for the prediction of some physico-chemical properties of alkyl benzenes. J. Mol. Struct. (Theochem), 719, 119-127. [Pg.1204]

Support vector regression, Gaussian process, random forest. N = 110 literature compounds and N = 550 in-house compounds. 3-fold cross-validation; RMSE 0.6 in cross-validation. ChemAxon, MOE, VolSurf descriptors. Ensemble of models. [44] [Pg.316]

Clarke, S., Griebsch, J., and Simpson, T. Analysis of support vector regression for approximation of complex engineering analyses. Journal of Mechanical Design, 127(6) 1077-1087, 2005. [Pg.211]

The obtained data set has been used to determine, for each indicator, the function f which minimizes the mean squared error under different given regression methods: polynomial regression and support vector regression (SVR). [Pg.213]

To sum up, polynomial regression showed that it is possible to estimate the start-monitoring indicators, with different accuracies. The support vector regression results are considered next. [Pg.214]

Support vector regression (SVR) is a non-parametric method (Cristianini 2000, Vapnik 1998). The idea is to map the data into a high-dimensional space with a function, so that a linear regression can be found in this transformed space. The problem formulation is ... [Pg.214]
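The mapping idea can be sketched with an explicit feature map; the degree-2 polynomial map below is our own simplified, hypothetical stand-in for the implicit kernel mapping SVR actually uses:

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map (a simplified stand-in for the
    implicit kernel map): x -> (1, x, x^2)."""
    return np.column_stack([np.ones_like(x), x, x ** 2])

# Quadratic target (illustrative): y = 1 - 0.5x + 0.25x^2 + noise
rng = np.random.default_rng(2)
x = rng.uniform(-2.0, 2.0, size=100)
y = 1.0 - 0.5 * x + 0.25 * x ** 2 + rng.normal(0.0, 0.02, size=100)

Z = phi(x)  # data "spread" into a 3-dimensional feature space
w, *_ = np.linalg.lstsq(Z, y, rcond=None)  # linear regression there
# w should recover roughly (1.0, -0.5, 0.25): the fit is linear in the
# feature space but quadratic in the original space
```

Kernel SVR performs the same trick without ever building Z explicitly, which is what makes very high-dimensional (even infinite-dimensional) feature spaces tractable.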

Fig. 3.26 Regression palette from component window showing epsilon support vector regression (SVR)
Fig. 3.27 Result of epsilon support vector regression (SVR) for selected examples
Nalbantov G, Groenen PJF, Bioch JC (2005) Support vector regression basics. 13(1):1-19 [Pg.193]

Leong MK, Chen YM, Chen TH (2009) Prediction of human cytochrome P450 2B6-substrate interactions using hierarchical support vector regression approach. J Comput Chem 30 1899-1909... [Pg.695]

In addition to the set of adjustable coefficients and b contained in Eq. (13.7), the CMF method also requires calculation of a certain number of adjustable parameters. Among them are the parameter ν for the support vector regression method ν-SVR and the ridge parameter γ for KRR. Their values should be optimized with the aim of improving the predictive capability of the constructed model. In addition, for each molecular field one can adjust the values of up to two parameters: an attenuation factor (which is related to the width of the Gaussian function) and hj (a mixing coefficient, which has the meaning of the relative contribution of the molecular field of the jth type). [Pg.438]

Regression: Support Vector Regression (SVR) [5], Kernel Ridge Regression (KRR) [10], Kernel Partial Least Squares (KPLS) [28], Gaussian Processes for Regression (GP-R) [11]. Applications: QSAR/QSPR. [Pg.454]

Similar to SVM classification, the selection of model parameters is an important requirement for regression too. In the case of support vector regression (SVR), we have to tune not only the cost parameter C > 0 and the parameters specific to each kernel, as in classification, but also the penalty parameter ε or ν, depending on the type... [Pg.353]
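Such tuning is typically done by cross-validated grid search; a sketch with scikit-learn's GridSearchCV, where the grid values and the synthetic data are our own illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Synthetic target (illustrative): y = sin(x) + noise
rng = np.random.default_rng(3)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0.0, 0.1, size=200)

# Tune C, epsilon, and the RBF kernel width gamma jointly by 3-fold CV
grid = {"C": [0.1, 1.0, 10.0], "epsilon": [0.01, 0.1], "gamma": [0.1, 1.0]}
search = GridSearchCV(SVR(kernel="rbf"), grid, cv=3)
search.fit(X, y)
best = search.best_params_  # the winning (C, epsilon, gamma) combination
```

The same pattern extends to ν-SVR (tune ν instead of ε) or to other kernels by adding their parameters to the grid.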

I.A. Naguib and H.W. Darwish, Support vector regression and artificial neural network models for stability indicating analysis of mebeverine hydrochloride and sulpiride mixtures in pharmaceutical preparation: a comparative study, Spectrochim. Acta A, 86, 515-526, 2012. [Pg.362]

A. J. Smola and B. Scholkopf, A tutorial on support vector regression, Stat. Comput., 2004, 14(3), 199-222. [Pg.410]

H. Drucker, C. J. C. Burges, L. Kaufman, A. J. Smola and V. Vapnik, Support vector regression machines, in Advances in Neural Information Processing Systems 9, 1997 (NIPS 1996), pp. 155-161. [Pg.410]

Reasons for errors and outliers in prediction models are summarized with respect to cross-validation methods, such as leave-one-out. Furthermore, some case studies are discussed which make use of support vector regression, an emerging technique in QSAR. [Pg.112]



© 2024 chempedia.info