Big Chemical Encyclopedia


Naive Bayes method

A machine-learning method was proposed by Klon et al. [104] as an alternative form of consensus scoring. The method proved unsuccessful for PKB but showed promise for the phosphatase PTP1B (protein tyrosine phosphatase 1B). In this approach, compounds were first docked into the receptor and scored by conventional means. The top-scoring compounds were then assumed to be active and used to build a naive Bayes classification model; all compounds were subsequently re-scored and ranked with this model. The method is heavily dependent upon predicting accurate binding... [Pg.47]
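A minimal sketch of this self-training rescoring idea, assuming binary structural fingerprints as the compound representation. The function name, the 20% active cutoff, and the Laplace smoothing are illustrative assumptions, not details from Klon et al.:

```python
import math

def nb_rescore(fingerprints, docking_scores, top_fraction=0.2):
    """Treat the top-scoring docked compounds as presumed actives, fit a
    Bernoulli naive Bayes model on their binary fingerprints, and return
    a log-odds score for every compound (higher = more active-like)."""
    n = len(fingerprints)
    n_bits = len(fingerprints[0])
    ranked = sorted(range(n), key=lambda i: docking_scores[i], reverse=True)
    actives = set(ranked[: max(1, int(top_fraction * n))])

    def bit_freqs(idx_set):
        # Per-bit "on" frequency with Laplace smoothing (assumed here).
        counts = [0] * n_bits
        for i in idx_set:
            for b in range(n_bits):
                counts[b] += fingerprints[i][b]
        total = len(idx_set)
        return [(c + 1) / (total + 2) for c in counts]

    p_act = bit_freqs(actives)
    p_inact = bit_freqs(set(range(n)) - actives)

    scores = []
    for fp in fingerprints:
        # Naive Bayes log-likelihood ratio: sum independent per-bit terms.
        s = 0.0
        for b in range(n_bits):
            if fp[b]:
                s += math.log(p_act[b] / p_inact[b])
            else:
                s += math.log((1 - p_act[b]) / (1 - p_inact[b]))
        scores.append(s)
    return scores
```

Re-ranking by these scores replaces the original docking ranking, which is why the approach inherits any errors in the assumed-active set.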

Clearly, the constant can be absorbed into the threshold value B, so that the function f0(C) = 1 is not necessary. We must stress that, in this form, the probabilistic approach has no tuned parameters at all. Some tuning of the naive Bayes classifier can be performed by selecting the set of molecular structure descriptors [or f(C)]. This is a valuable feature in contrast to QSAR methods, especially Artificial Neural Networks. [Pg.194]
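The remark about absorbing the constant can be made explicit. Assuming the usual log-likelihood-ratio form of the naive Bayes decision rule over descriptors f_i(C) (the notation here is an assumption, since the preceding equations are not reproduced on this page):

```latex
% Classify compound C as active when the descriptor log-likelihood
% ratio exceeds a threshold B; any additive constant c (e.g. from a
% trivial descriptor f_0(C) = 1) simply shifts the threshold.
\sum_i \ln \frac{P\bigl(f_i(C)\mid \text{active}\bigr)}
               {P\bigl(f_i(C)\mid \text{inactive}\bigr)} + c > B
\quad\Longleftrightarrow\quad
\sum_i \ln \frac{P\bigl(f_i(C)\mid \text{active}\bigr)}
               {P\bigl(f_i(C)\mid \text{inactive}\bigr)} > B - c \equiv B'
```

Because c is fixed by the data rather than fitted, dropping f_0(C) changes nothing about the decision boundary.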

A naive Bayes classifier is a simple probabilistic classifier based on Bayes' theorem with strong independence assumptions, and it is particularly suited to problems where the dimensionality of the input is high. The naive Bayes model assumes that, given a class y = j, the features X_i are independent. Despite its simplicity, the naive Bayes classifier is known to be a robust method even when the independence assumption does not hold (Michalski and Kaufman, 2001). [Pg.132]
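The independence assumption lets the class-conditional likelihood factorize as P(x | y = j) = prod_i P(x_i | y = j), so each feature is modeled separately. A minimal from-scratch sketch for categorical features (function names and the smoothing scheme are illustrative):

```python
import math
from collections import Counter

def fit_naive_bayes(X, y, alpha=1.0):
    """Estimate class priors P(y=j) and per-feature conditional
    counts for P(x_i = v | y = j), with Laplace smoothing alpha."""
    classes = sorted(set(y))
    n_features = len(X[0])
    priors = {j: y.count(j) / len(y) for j in classes}
    cond = {}  # (class, feature index) -> Counter of observed values
    for j in classes:
        rows = [x for x, label in zip(X, y) if label == j]
        for i in range(n_features):
            cond[(j, i)] = Counter(x[i] for x in rows)
    return classes, priors, cond, alpha

def predict(model, x):
    """P(y=j | x) is proportional to P(y=j) * prod_i P(x_i | y=j);
    return the class with the highest (log) posterior."""
    classes, priors, cond, alpha = model
    best, best_lp = None, -math.inf
    for j in classes:
        lp = math.log(priors[j])
        for i, v in enumerate(x):
            counts = cond[(j, i)]
            total = sum(counts.values())
            n_values = max(2, len(counts))  # crude support-size estimate
            lp += math.log((counts[v] + alpha) / (total + alpha * n_values))
        if lp > best_lp:
            best, best_lp = j, lp
    return best
```

Working in log space avoids numerical underflow when the product runs over many features, which is exactly the high-dimensional setting the text mentions.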

Prediction of Drug-Induced PT Toxicity and Injury Mechanisms with an hiPSC-Based Model and Machine Learning Methods

The weak points of the HPTC- and hESC-based models described previously (Sections 23.3.2.1 and 23.3.3.1) were the data analysis procedures. To improve result classification, the raw data obtained with three batches of HPTC and the IL6/IL8-based model (Li et al., 2013) were reanalyzed by machine learning (Su et al., 2014). Random forest (RF), support vector machine (SVM), k-NN, and naive Bayes classifiers were tested. The best results were obtained with the RF classifier: the mean values (three batches of HPTC) ranged between 0.99 and 1.00 for sensitivity, specificity, balanced accuracy, and ROC AUC (Su et al., 2014). Thus, excellent predictivity could be obtained by combining the IL6/IL8-based model with automated classification by machine learning. [Pg.378]
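The evaluation metrics quoted above are standard and easy to compute from a confusion matrix; a small sketch, with the label convention (1 = toxic, 0 = non-toxic) assumed for illustration:

```python
def binary_metrics(y_true, y_pred):
    """Sensitivity, specificity, and balanced accuracy for binary labels
    (1 = positive/toxic, 0 = negative/non-toxic, assumed convention)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    balanced_accuracy = (sensitivity + specificity) / 2
    return sensitivity, specificity, balanced_accuracy
```

Balanced accuracy is the appropriate summary here because toxicant and non-toxicant classes need not be equally represented; plain accuracy would be misleading on an imbalanced compound set.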

The Bayesian approach is one of the central parametric probabilistic classification methods; it is based on the consistent application of the classic Bayes equation for conditional probability [34] (also known as the naive Bayes classifier) to construct a decision rule; a modified algorithm is explained in references [105, 109, 121]. In this approach, the object of recognition is a chemical compound C, which can be specified by a set of probability features (c_1, ..., c_m) whose random values are distributed across all classes of objects. The features are interpreted as independent random variables of an m-dimensional random variable. The classification metric is the a posteriori probability that the object in question belongs to class k. Compound C is assigned to the class for which the probability of membership is highest. [Pg.384]
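A sketch of this a posteriori assignment rule, assuming (for illustration only) that each of the m features is Gaussian within each class; the source does not specify the per-class feature distributions:

```python
import math

def gaussian_posteriors(x, class_params, priors):
    """Posterior P(k | x) for each class k via Bayes' theorem, treating
    the m features of compound x as independent Gaussians per class.
    class_params maps class -> [(mu, sigma) per feature] (assumed form)."""
    def log_likelihood(features, params):
        ll = 0.0
        for xi, (mu, sigma) in zip(features, params):
            # log of the univariate normal density for one feature
            ll += -0.5 * math.log(2 * math.pi * sigma ** 2) \
                  - (xi - mu) ** 2 / (2 * sigma ** 2)
        return ll

    logs = {k: math.log(priors[k]) + log_likelihood(x, p)
            for k, p in class_params.items()}
    # Normalize in log space for numerical stability.
    mx = max(logs.values())
    unnorm = {k: math.exp(v - mx) for k, v in logs.items()}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}
```

Assigning C to the class with the largest returned posterior, e.g. `max(post, key=post.get)`, implements the decision rule described in the paragraph.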





© 2024 chempedia.info