Big Chemical Encyclopedia


Kernel regressions

A first evaluation of the data can be done with nonparametric statistical estimation techniques such as the Nadaraya-Watson kernel regression estimate [2]. These techniques have the advantage of requiring few assumptions, but they offer little possibility of interpreting the outcome and are unreliable when extrapolating. The fact that these techniques do not require a lot of assumptions makes them... [Pg.72]
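The Nadaraya-Watson estimate mentioned above predicts at each query point a locally weighted average of the observed responses. A minimal NumPy sketch (the Gaussian kernel, bandwidth value, and test function are illustrative choices, not from the source):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=0.5):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    Each prediction is a weighted average of the training responses;
    the bandwidth controls how quickly the weights decay with distance.
    """
    # Pairwise squared distances between query and training points
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    weights = np.exp(-d2 / (2.0 * bandwidth ** 2))
    # Locally weighted average of y_train at each query point
    return (weights @ y_train) / weights.sum(axis=1)

# Noisy samples from a smooth function
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 100))
y = np.sin(x) + 0.1 * rng.standard_normal(100)
y_hat = nadaraya_watson(x, y, x, bandwidth=0.3)
```

Note that the estimate is purely local: outside the range of the training data the weights are dominated by the nearest boundary points, which is why such estimators extrapolate poorly.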

FIGURE 4.28 Visualization of kernel regression with two Gaussian kernels. The point sizes reflect the influence on the regression model. The point sizes in the left plot are for the solid kernel function, those in the right plot are for the dashed kernel function. [Pg.184]

Cedeno, W. and Agrafiotis, D.K. (2003) Using particle swarms for the development of QSAR models based on k-nearest neighbor and kernel regression. J. Comput. Aided Mol. Des., 17, 255-263. [Pg.1006]

It has been found that the C parameter has no effect on polynomial kernel regression for either the first or the second order, as shown in Fig. 5. [Pg.214]

Ouyang Z, Clyde MA, Wolpert RL (2008) Bayesian kernel regression and classification, Bayesian model selection and objective methods. Gainesville, NC... [Pg.193]

A modified Auto Associative Kernel Regression method for robust signal reconstruction in Nuclear Power Plant components... [Pg.917]

Baraldi, P., Zio, E., Mangili, F., Gola, G., Nystad, B.H., 2013b. Ensemble of Kernel Regression Models for Assessing the Health State of Choke Valves in Offshore Oil Platforms. International Journal of Computational Intelligence Systems, on-line. [Pg.944]

Rosipal R, Trejo LJ. Kernel partial least squares regression in reproducing kernel Hilbert space. J Mach Learn Res 2001;2:97-123. [Pg.465]

Bennett KP, Embrechts MJ. An optimization perspective on kernel partial least squares regression. In: Suykens JAK, Horvath G, Basu S, Micchelli J, Vandewalle J, editors. Advances in learning theory: methods, models and applications. Amsterdam: IOS Press, 2003. p. 227-50. [Pg.465]

There are various procedures to estimate the unknown regression parameters and the parameters of the kernel functions. One approach is to estimate the prototypes m_j and scale parameters s_j separately by clustering methods, and then to estimate the regression parameters; however, this approach does not incorporate information from the y-variable. Another approach is to use optimization techniques to minimize the RSS of the residuals y_i - f(x_i) obtained via Equation 4.95, for i = 1, ..., n. [Pg.184]
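The first (two-step) approach above can be sketched as follows. This is a minimal illustration, not the book's procedure: prototypes are taken as quantiles of x as a crude stand-in for a clustering step, the scale parameters are fixed, and only the regression weights are then obtained by least squares, which minimizes the RSS of the residuals y_i - f(x_i):

```python
import numpy as np

# Step 1: choose prototypes m_j and scales s_j from x alone
# (here: quantiles and a common scale, a stand-in for clustering —
# note that y plays no role in this step).
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-3, 3, 200))
y = np.tanh(x) + 0.05 * rng.standard_normal(200)

m = np.quantile(x, np.linspace(0.05, 0.95, 8))  # prototypes m_j
s = np.full_like(m, 0.8)                         # scale parameters s_j

# Step 2: solve a linear least-squares problem for the regression
# weights, using Gaussian basis functions
# phi_j(x) = exp(-(x - m_j)^2 / (2 s_j^2)).
Phi = np.exp(-(x[:, None] - m[None, :]) ** 2 / (2 * s[None, :] ** 2))
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ w
rss = np.sum((y - y_hat) ** 2)
```

The alternative approach in the text would instead optimize m_j, s_j, and the weights jointly against the same RSS objective, at the cost of a nonlinear optimization problem.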

Like ANNs, SVMs can be useful in cases where the x-y relationships are highly nonlinear and poorly understood. Several parameters need to be optimized, including the severity of the cost penalty, the threshold fit error, and the nature of the nonlinear kernel. However, if one takes care to optimize these parameters by cross-validation (Section 12.4.3) or similar methods, the susceptibility to overfitting is not as great as for ANNs. Furthermore, SVMs are simpler to deploy than other nonlinear modeling alternatives (such as local regression, ANNs, and nonlinear variants of PLS), because the model can be expressed completely in terms of a relatively small number of support vectors. More details regarding SVMs can be found in several references [70-74]. [Pg.389]
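The cross-validated parameter selection described above can be sketched with scikit-learn; the data, grid values, and scoring here are illustrative assumptions, not from the source:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Synthetic nonlinear x-y data
rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, (150, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(150)

# Cross-validated search over the cost penalty C, the epsilon-insensitive
# fit threshold, and the RBF kernel width gamma.
grid = GridSearchCV(
    SVR(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10], "epsilon": [0.01, 0.1], "gamma": [0.5, 1, 2]},
    cv=5,
)
grid.fit(X, y)
model = grid.best_estimator_

# The fitted model is defined entirely by its support vectors,
# which is what makes deployment compact.
n_support = model.support_vectors_.shape[0]
```

Tuning all three parameters jointly, rather than one at a time, matters because C, epsilon, and the kernel width trade off against each other.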

Froehlich, H., Wegner, J.K., Sieker, F. and Zell, A. (2006) Kernel functions for attributed molecular graphs - a new similarity-based approach to ADME prediction in classification and regression. QSAR Combinatorial Science, 25, 317-326. [Pg.40]

Support Vector Machine (SVM) is a classification and regression method developed by Vapnik [30]. In support vector regression (SVR), the input variables are first mapped into a higher-dimensional feature space by means of a kernel function, and a linear model is then constructed in this feature space. The kernel functions most often used in SVM are the linear, polynomial, radial basis function (RBF), and sigmoid functions. The generalization performance of SVM depends on the selection of several internal parameters of the algorithm (C and ε), the type of kernel, and the parameters of the kernel [31]... [Pg.325]
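The four kernel families named above can be written out directly for a pair of sample vectors; the parameter values (gamma, degree, coef0) below are illustrative defaults, not prescribed by the source:

```python
import numpy as np

def linear_kernel(u, v):
    # k(u, v) = <u, v>
    return u @ v

def polynomial_kernel(u, v, degree=3, coef0=1.0):
    # k(u, v) = (<u, v> + coef0)^degree
    return (u @ v + coef0) ** degree

def rbf_kernel(u, v, gamma=0.5):
    # k(u, v) = exp(-gamma * ||u - v||^2)
    return np.exp(-gamma * np.sum((u - v) ** 2))

def sigmoid_kernel(u, v, gamma=0.1, coef0=0.0):
    # k(u, v) = tanh(gamma * <u, v> + coef0)
    return np.tanh(gamma * (u @ v) + coef0)

u = np.array([1.0, 2.0])
v = np.array([0.5, -1.0])
vals = [k(u, v) for k in (linear_kernel, polynomial_kernel,
                          rbf_kernel, sigmoid_kernel)]
```

Each kernel value is an inner product in some feature space, which is what lets the linear model in that space be trained without ever computing the mapping explicitly.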

We notice that the correlation function defined by Eq. (147) is stationary. Thus, it fits the Onsager principle [101], which establishes that the regression to equilibrium of an infinitely aged system is described by the unperturbed correlation function. The authors of Ref. 102 have successfully addressed this issue, using the following arguments. According to an earlier work [96], the GME of infinite age has the same time-convoluted structure as Eq. (59), with the memory kernel T(t) replaced by T_∞(t). They proved that the Laplace transform of T_∞ is... [Pg.429]

The development of new data analysis methods is also an important area of QSAR research. Several methods have been developed in recent years, including kernel partial least squares (K-PLS) [92], robust continuum regression [93], local lazy regression [94], fuzzy interval number k-nearest neighbor (FINkNN) [95], and the fast projection plane classifier (FPPC) [96]. These methods have been shown to be useful for the prediction of a wide variety of target properties, which include moisture, oil, protein and starch... [Pg.232]

Zhang, P., Lee, C., Verweij, H., Akbar, S.A., Hunter, G. and Dutta, P.K. (2007) High temperature sensor array for simultaneous determination of O2, CO, and CO2 with kernel ridge regression data analysis. Sens. Actuators B, 123 (2), 950-963. [Pg.476]
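Kernel ridge regression, used for the sensor-array analysis cited above, has a convenient closed-form solution: the dual coefficients are alpha = (K + λI)^{-1} y, and predictions are kernel-weighted sums of alpha. A minimal NumPy sketch with an RBF kernel (data, gamma, and λ are illustrative choices):

```python
import numpy as np

def kernel_ridge_fit_predict(X, y, X_new, gamma=1.0, lam=1e-2):
    """Closed-form kernel ridge regression with an RBF kernel:
    alpha = (K + lam*I)^{-1} y, and f(x) = sum_i alpha_i * k(x, x_i)."""
    def rbf(A, B):
        # Pairwise squared distances, then the RBF kernel matrix
        d2 = (np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :]
              - 2 * A @ B.T)
        return np.exp(-gamma * d2)

    K = rbf(X, X)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return rbf(X_new, X) @ alpha

# Synthetic two-input response (a stand-in for sensor-array signals)
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, (100, 2))
y = X[:, 0] ** 2 + np.sin(3 * X[:, 1]) + 0.05 * rng.standard_normal(100)
y_hat = kernel_ridge_fit_predict(X, y, X)
```

The regularization parameter λ plays the same overfitting-control role as C does in SVR, but the model uses all training points rather than a sparse set of support vectors.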

