Big Chemical Encyclopedia


Support vector classification

Note that the minimization is taken with respect to both weight vector w and bias b. The solution to the constrained optimization problem of [Pg.34]

The corresponding dual is found by differentiating with respect to w and b, imposing stationarity. [Pg.35]

Note that the primal (2.21) and the corresponding dual (2.26) arise from the same objective function but with different constraints, and the solution is found either by minimizing the primal or by maximizing the dual. To construct the optimal hyperplane one has to find the coefficients α that maximize the function W(α), subject to constraints (2.25) and positivity of the α, with solution w given by (2.24), i.e., [Pg.36]
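For reference, the standard hard-margin primal and dual forms, which (2.21) and (2.26) presumably correspond to, can be sketched as follows (notation assumed here, not taken from the excerpt):

```latex
% Primal: minimize the norm of w subject to correct classification
\min_{w,\,b}\ \tfrac{1}{2}\,\|w\|^{2}
\quad \text{s.t.} \quad y_i\,(w \cdot x_i + b) \ge 1,\qquad i = 1,\dots,l

% Dual: maximize W(\alpha) over the Lagrange multipliers
\max_{\alpha}\ W(\alpha) = \sum_{i=1}^{l} \alpha_i
  - \tfrac{1}{2} \sum_{i,j=1}^{l} \alpha_i \alpha_j\, y_i y_j\,(x_i \cdot x_j)
\quad \text{s.t.} \quad \alpha_i \ge 0,\quad \sum_{i=1}^{l} \alpha_i y_i = 0

% Recovering the primal solution from the dual, cf. (2.24):
w = \sum_{i=1}^{l} \alpha_i\, y_i\, x_i
```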

As an immediate application, note that, while w is explicitly determined by (2.24), the bias b is not, although it is implicitly determined. However, b is easily found by using the Karush-Kuhn-Tucker (KKT) complementarity condition, described below. [Pg.36]

The Kuhn-Tucker theorem plays a central role in giving conditions for an optimum solution to a general constrained optimization problem. For the primal problem mentioned above, these conditions may be stated [Pg.36]
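The recovery of w from the dual coefficients via (2.24), and of b from the KKT complementarity condition, can be sketched in numpy for a two-point toy problem where the dual has a closed-form solution (the data and the algebraic shortcut are illustrative, not from the source):

```python
import numpy as np

# Toy linearly separable problem: one point per class.
X = np.array([[1.0, 1.0], [-1.0, -1.0]])
y = np.array([1.0, -1.0])

# Dual: maximize W(a) = sum(a) - 0.5 * sum_ij a_i a_j y_i y_j <x_i, x_j>
# subject to a_i >= 0 and sum_i a_i y_i = 0.
# With these two symmetric points the equality constraint forces
# a_1 = a_2 = a, and stationarity dW/da = 0 gives a in closed form.
K = X @ X.T
a = 2.0 / (K[0, 0] - 2.0 * K[0, 1] + K[1, 1])
alpha = np.array([a, a])

# Primal weight vector from the dual solution: w = sum_i a_i y_i x_i
w = (alpha * y) @ X

# Bias from the KKT complementarity condition: for any support vector
# x_s (alpha_s > 0), y_s (w . x_s + b) = 1, hence b = y_s - w . x_s.
s = np.argmax(alpha > 0)
b = y[s] - w @ X[s]

print(w, b)  # maximum-margin hyperplane for this toy set
```

Both support vectors end up exactly on the margin, y_i (w . x_i + b) = 1, which is the complementarity condition being satisfied with equality.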


Newer data analysis methods overcome the difficulties that small sample-to-variables ratios create for traditional statistical methods. These new methods fall into two major categories: (1) support-vector classification and regression methods, and (2) feature selection and construction techniques. The former are effectively determined by only a small portion of the training data (sample), while the latter select only a small subset of variables so that the available sample is sufficient for traditional and newer classification techniques. [Pg.418]
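The second category can be illustrated with a minimal numpy sketch: a filter-style selection that ranks variables by class-mean difference in a small-sample, many-variable setting (the data, the ranking rule, and the nearest-centroid classifier are illustrative stand-ins, not the techniques used in the cited work):

```python
import numpy as np

rng = np.random.default_rng(0)

# Small-sample, many-variable setting: 10 samples, 50 variables,
# only the first two variables carry class information.
n, p = 10, 50
y = np.array([0] * 5 + [1] * 5)
X = rng.normal(size=(n, p))
X[y == 1, :2] += 6.0  # make variables 0 and 1 informative

# Filter-style feature selection: rank variables by the absolute
# difference of class means and keep the top k.
diff = np.abs(X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0))
k = 2
selected = np.argsort(diff)[-k:]

# With only k variables left, even a basic nearest-centroid rule
# has enough samples per variable to separate the classes.
Xs = X[:, selected]
c0, c1 = Xs[y == 0].mean(axis=0), Xs[y == 1].mean(axis=0)
pred = (np.linalg.norm(Xs - c1, axis=1)
        < np.linalg.norm(Xs - c0, axis=1)).astype(int)
print(sorted(selected.tolist()), (pred == y).mean())
```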

A. Kulkarni, V. K. Jayaraman, and B. D. Kulkarni. Support vector classification with parameter tuning assisted by agent-based systems. Comput. Chem. Eng., 28:311-318, 2004. [Pg.288]

Lu, W., Dong, N., Náray-Szabó, G. (2006) Predicting anti-HIV-1 activities of HEPT-analog compounds by using support vector classification. QSAR Comb. Sci., 24, 1021-1025. [Pg.1109]

Fig. 6.2 Scheme of a neural network with one hidden layer and bias neurons. Fig. 6.3 Support vector classification, where the classes can be separated. [Pg.235]
In practice, a non-linear model is often required for adequate data fitting. In the same manner as the non-linear support vector classification approach, a non-linear mapping can be used to map the data into a high-dimensional feature space where linear regression can be used (see Fig. 2.10). As noted in the previous subsection, the complete SVM can be described in terms of dot products between the data. The nonlinear SVR solution, using an ε-insensitive loss function (2.54), is given by solving the problem ... [Pg.50]
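The two ingredients of this formulation can be sketched in numpy: the ε-insensitive loss, which is zero inside a tube of width ε around the target, and the kernel expansion that the SVR prediction takes in the feature space. This is a sketch of the functional forms only; solving the SVR quadratic program for the dual coefficients is not shown:

```python
import numpy as np

def eps_insensitive(residual, eps=0.1):
    """Epsilon-insensitive loss: zero inside the tube |r| <= eps,
    linear outside it."""
    return np.maximum(0.0, np.abs(residual) - eps)

def rbf(a, b, gamma=1.0):
    """Gaussian (RBF) kernel between two points."""
    return np.exp(-gamma * np.sum((a - b) ** 2, axis=-1))

# The SVR prediction is a kernel expansion over the training points:
#   f(x) = sum_i (alpha_i - alpha_i*) k(x_i, x) + b,
# with coef_i = alpha_i - alpha_i* found by solving the SVR QP.
def svr_predict(x, X_train, coef, b, gamma=1.0):
    return sum(c * rbf(xi, x, gamma) for c, xi in zip(coef, X_train)) + b

# Residuals inside the tube contribute no loss at all.
print(eps_insensitive(np.array([0.05, -0.05, 0.3]), eps=0.1))
```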

The data listed in Table 6.1 are treated by support vector classification. It has been found that the published phase diagram of the CsBr-CaBr2 system cannot be classified correctly (The published phase... [Pg.111]

The formability of intermetallic compounds can be investigated by SVM using the atomic parameters suitable for metallic systems, i.e., Miedema's electronegativity (φ), the metallic radius (R), the number of valence electrons (Z) of the free atom, and functions of these parameters. For example, Table 6.3 lists the data on the formability of ternary intermetallic compounds and the related atomic parameters of known Mg-containing ternary alloy systems. By support vector classification with a Gaussian kernel, the rate of correctness of classification is 100%, and the rate of correctness of prediction in LOO cross-validation is 94.9%. [Pg.119]
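The LOO (leave-one-out) cross-validation that yields the "rate of correctness of prediction" can be sketched as follows. To keep the example self-contained in numpy, a simple Gaussian-kernel (Parzen-style) classifier stands in for the chapter's SVC; the LOO mechanics are the same either way:

```python
import numpy as np

def gaussian_kernel(a, b, gamma=0.5):
    return np.exp(-gamma * np.sum((a - b) ** 2))

def parzen_classify(x, X, y, gamma=0.5):
    """Assign x to the class with the larger summed kernel similarity.
    A simple Gaussian-kernel stand-in for the chapter's SVC."""
    score = {}
    for cls in np.unique(y):
        score[cls] = sum(gaussian_kernel(x, xi, gamma) for xi in X[y == cls])
    return max(score, key=score.get)

def loo_rate(X, y, gamma=0.5):
    """Leave-one-out CV: each sample is predicted by a model that never
    saw it; the fraction of hits is the rate of correctness of prediction."""
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        hits += parzen_classify(X[i], X[mask], y[mask], gamma) == y[i]
    return hits / len(y)

# Two well-separated 1D clusters: every held-out point is still
# predicted correctly from the remaining five.
X = np.array([[0.0], [0.2], [0.4], [5.0], [5.2], [5.4]])
y = np.array([0, 0, 0, 1, 1, 1])
print(loo_rate(X, y))
```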

Forty-one known 1:1-type intermediate compounds between rare earth and nontransition metals have been used as the training set. By the support vector classification method, using either a second-degree polynomial kernel or a Gaussian kernel, the separation of these two categories of compounds is rather good. By the LOO cross-validation method with support vector classification, the rate of correctness of computerized prediction is more than 92%. [Pg.134]

Using support vector classification with Gaussian kernel, the data set listed in Table 6.10 can be classified with clear-cut boundaries in the feature space. The LOO cross-validation test gives the rate of correctness of prediction equal to 98.9%. [Pg.141]

By support vector classification with a linear kernel, the following criterion can be obtained for the separation of the 100 structure-forming region from the 002 structure-forming region ... [Pg.179]

The rate of correctness in the LOO cross-validation test is 95.4%. By support vector classification, selecting the mutually connected good sample points located far away from the optimal plane of separation, we can find optimal regions that avoid the formation of mixed orientation. For example, the following region can be used as an optimal region ... [Pg.181]
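The selection of points far from the optimal separating plane can be sketched in numpy. Given a linear criterion w . x + b = 0 from a linear-kernel SVC, the signed distance of each sample from the plane is (w . x + b)/||w||, and points beyond a chosen margin delimit the safe region (the coefficients, samples, and margin below are made up for illustration):

```python
import numpy as np

# Hypothetical linear criterion w . x + b = 0 from a linear-kernel SVC
# (coefficients are illustrative, not the chapter's fitted values).
w = np.array([1.0, -2.0])
b = 0.5

X = np.array([[3.0, 0.0], [0.2, 0.1], [-4.0, 1.0], [0.0, 0.3]])

# Signed distance of each sample from the optimal separating plane.
dist = (X @ w + b) / np.linalg.norm(w)

# Keep only samples lying well away from the boundary: these are the
# "good sample points located far from the optimal plane" used to
# delimit an optimal processing region.
margin = 1.0
far = np.abs(dist) > margin
print(dist.round(3), far)
```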

By using support vector classification, it can be shown that the criterion to avoid the formation of an amorphous structure can be expressed... [Pg.181]

The separation of the 002 orientation-forming region and the mixed-orientation region is not clear, but by using support vector classification an optimal region to assure 002 orientation structure formation can be... [Pg.181]

As an effective way to overcome the problem of overfitting, the support vector machine (SVM) [61, 132], a newly developed method, has been introduced into the field of SAR. The paper of Burbidge et al. reported the pioneering work in this field: the prediction ability of support vector classification (SVC) was significantly better than that of artificial neural networks and decision trees in SAR computations predicting the inhibition of dihydrofolate reductase by pyrimidines [16]. In this chapter, more SAR work using both support vector classification (SVC) and support vector regression (SVR) methods will be described. [Pg.189]

By support vector classification using a linear kernel, the porcelain of the Hangzhou palace can be separated clearly from the porcelain samples from Ningbo, Shangyu and Shaoxing, but cannot be separated from that of Cixi; it therefore appears that the porcelain samples found in the Hangzhou palace were produced at Cixi. [Pg.241]

Table 13.7 illustrates the comparison of different methods. It can be seen that the rate of correctness of support vector classification is better than those of the two other methods. [Pg.270]

Fig. 14.2 Result of outlier elimination by support vector classification.
Fig. 14.3 Strategy for searching the optimal zone by support vector classification.
LIBSVM 2.5 includes such functions as scale, train, and predict. It supplies support vector classification and regression, including C-SVC, ν-SVC, one-class SVM, ε-SVR and ν-SVR. It also provides four kinds of kernel functions. As far as the multi-class problem is concerned, LIBSVM supplies methods such as one-to-one, one-to-rest and SVMDAG. The FAQ on the web gives more detailed information about how to use LIBSVM. [Pg.315]
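A typical LIBSVM command-line session, using its scale/train/predict tools, looks roughly like the following (file names are placeholders; flag values are illustrative):

```shell
# Scale features to a common range (recommended before training).
svm-scale -l -1 -u 1 train.txt > train.scaled

# Train a C-SVC (-s 0) with a Gaussian/RBF kernel (-t 2);
# -c and -g are the penalty and kernel-width parameters.
svm-train -s 0 -t 2 -c 10 -g 0.5 train.scaled model.txt

# Other -s values select the other formulations LIBSVM supplies:
#   1 = nu-SVC, 2 = one-class SVM, 3 = epsilon-SVR, 4 = nu-SVR
svm-predict test.scaled model.txt predictions.txt
```

For classification, svm-predict also reports the accuracy on the test file when its true labels are present.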

Processing the data set using support vector classification (SVC)... [Pg.316]


See other pages where Support vector classification is mentioned: [Pg.140], [Pg.392], [Pg.2], [Pg.34], [Pg.115], [Pg.128], [Pg.134], [Pg.268], [Pg.340]
See also in source: [Pg.2], [Pg.34]








© 2024 chempedia.info