
Problem classification

An example of a classification problem in which feature weighting and selection was important comes from forensic chemistry (qv). A classification method was needed to determine the paper grade and manufacturer of a paper scrap found at the scene of a crime. In this study, 119 sheets of paper (qv) representing 40 different paper grades and nine manufacturers were obtained (25). The objects were then the paper samples, and the variables consisted of... [Pg.424]

All manufacturers use their own classification numbers for their bits. This results in mass confusion about which bit to use in what formation and whose bit is better. The International Association of Drilling Contractors (IADC) has addressed this classification problem through the development of a unified system. But whose bit is better is left to trial-and-error experimentation by the individual operator. [Pg.769]

Inductive learning by decision trees is a popular machine learning technique, particularly for solving classification problems, and was developed by Quinlan (1986). A decision tree depicting the input/output mapping learned from the data in Table I is shown in Fig. 22. The input information consists of pressure, temperature, and color measurements of... [Pg.262]

The model induced via the decision tree is not a black box, and provides explicit and interpretable rules for solving the pattern classification problem. The most relevant variables are also clearly identified. For example, for the data in Table I, the value of the temperature is not needed to classify the quality as good or bad, as is clearly indicated by the decision tree in Fig. 22. [Pg.263]
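
To make the idea concrete, here is a minimal sketch in Python using scikit-learn. Table I and Fig. 22 are not reproduced above, so the pressure, temperature, and color values below are invented stand-ins; only the general approach (fit a tree, read off its rules and variable importances) follows the text.

```python
# Hypothetical sketch of the decision-tree approach described above.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Columns: pressure, temperature, color (all made-up values).
X = np.array([
    [1.0, 300.0, 0.2],
    [1.2, 310.0, 0.8],
    [0.9, 305.0, 0.9],
    [1.4, 320.0, 0.1],
    [1.1, 315.0, 0.7],
    [0.8, 295.0, 0.3],
])
y = ["good", "bad", "bad", "good", "bad", "good"]  # product quality labels

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# Unlike a black-box model, the induced tree yields explicit rules:
print(export_text(tree, feature_names=["pressure", "temperature", "color"]))

# Variables not needed for any split get importance 0, which is how an
# irrelevant variable (temperature in the text's example) is exposed.
print(tree.feature_importances_)
```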

This classification problem can then be solved better by developing more suitable boundaries. For instance, using so-called quadratic discriminant analysis (QDA) (Section 33.2.3) or density methods (Section 33.2.5) leads to the boundaries of Fig. 33.2 and Fig. 33.3, respectively [3,4]. Other procedures that develop irregular boundaries are the nearest neighbour methods (Section 33.2.4) and neural nets (Section 33.2.9). [Pg.209]
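
As a hedged illustration of these alternatives, the sketch below fits QDA and a nearest-neighbour classifier to synthetic two-dimensional data (the data behind Figs. 33.2 and 33.3 are not reproduced here); QDA yields a quadratic boundary, k-NN an irregular one.

```python
# Synthetic two-class problem where the class covariances differ, so a
# linear boundary is inadequate but a quadratic one is not.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
class_a = rng.normal([0, 0], [1.0, 0.3], size=(50, 2))
class_b = rng.normal([2, 2], [0.3, 1.0], size=(50, 2))
X = np.vstack([class_a, class_b])
y = np.array([0] * 50 + [1] * 50)

qda = QuadraticDiscriminantAnalysis().fit(X, y)      # quadratic boundary
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)  # irregular boundary

x_new = np.array([[1.0, 1.0]])
print("QDA:", qda.predict(x_new), " k-NN:", knn.predict(x_new))
```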

In Fig. 44.7 an example of a classification problem is shown that cannot be solved by the simple perceptron-like networks. It is known as the exclusive or (XOR) problem. No single boundary can be found that yields a correct class-... [Pg.659]

Fig. 44.8. (a) The structure of the neural network for solving the XOR classification problem of Fig. 44.7. (b) The two boundary lines as defined by the hidden units in the input space (x1, x2). (c) Representation of the objects in the space defined by the output values of the two hidden units (hu1, hu2) and the boundary line defined in this space by the output unit. The two objects of class A are at the same location. [Pg.661]
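
A minimal sketch of this setup, assuming scikit-learn and the 2-2-1 topology described above (the solver, activation, and seed are arbitrary choices, not from the text):

```python
import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.neural_network import MLPClassifier

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])  # XOR: no single line separates the classes

# A simple perceptron (one linear boundary) cannot exceed 75% accuracy.
p = Perceptron(max_iter=1000, tol=None, random_state=0).fit(X, y)
print("perceptron accuracy:", p.score(X, y))

# Two hidden units define two boundary lines in (x1, x2); the output
# unit then separates the classes in the (hu1, hu2) space. Such a tiny
# net can get stuck in a poor local minimum, so a few random seeds may
# need to be tried.
mlp = MLPClassifier(hidden_layer_sizes=(2,), activation="tanh",
                    solver="lbfgs", random_state=1, max_iter=2000).fit(X, y)
print("MLP accuracy:", mlp.score(X, y))
```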

In analytical chemistry, Artificial Neural Networks (ANN) are mostly used for calibration (see Sect. 6.5) and classification problems. Feedback networks, on the other hand, are useful for optimization problems, especially nets of the Hopfield type (Hopfield [1982]; Lee and Sheu [1990]). [Pg.146]
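
The following is a bare-bones Hopfield-type feedback net in plain NumPy, included only to illustrate the recall-as-energy-minimization idea; it is a generic textbook construction, not code from the cited works.

```python
import numpy as np

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])  # stored bipolar patterns

# Hebbian weights; zero diagonal (no self-feedback).
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0.0)

def recall(state, n_sweeps=10):
    """Asynchronous updates drive the state toward a stored pattern
    (a minimum of the network's energy function)."""
    state = state.copy()
    for _ in range(n_sweeps):
        for i in range(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# First stored pattern with its first element flipped.
noisy = np.array([-1, -1, 1, -1, 1, -1])
print(recall(noisy))  # should recover [1, -1, 1, -1, 1, -1]
```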

In this chapter, the mathematical formulation of the variable classification problem is stated and some structural properties are discussed in terms of graphical techniques. Different strategies are available for carrying out process-variable classification. Both graph-oriented approaches and matrix-based techniques are briefly analyzed in the context of their usefulness for performing variable categorization. The use of output set assignment procedures for variable classification is described and illustrated. [Pg.44]
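
As a rough illustration of output set assignment, the sketch below matches each equation to one unmeasured variable it contains via augmenting-path bipartite matching; unmeasured variables left unmatched cannot be determined from the model. The tiny equation set is invented, and this is only one simplified reading of the procedure described in the chapter.

```python
# Toy output set assignment via bipartite matching (Kuhn's algorithm).
def output_set_assignment(eq_vars):
    """eq_vars[e] = set of unmeasured variables appearing in equation e."""
    match = {}  # variable -> equation currently assigned to it

    def try_assign(e, seen):
        for v in eq_vars[e]:
            if v in seen:
                continue
            seen.add(v)
            # Assign v to e if v is free, or re-route v's current equation.
            if v not in match or try_assign(match[v], seen):
                match[v] = e
                return True
        return False

    for e in eq_vars:
        try_assign(e, set())
    return match

# Three balance equations over unmeasured streams (hypothetical).
equations = {"bal1": {"F1", "F2"}, "bal2": {"F2", "F3"}, "bal3": {"F3"}}
assigned = output_set_assignment(equations)
print("determinable variables:", sorted(assigned))
print("assignment:", assigned)
```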

In classification problems, it is thus known to which group the objects belong, and the working hypothesis is that the characteristics of the groups are described by the multivariate data structure of the groups' objects. The task for the statistical analysis is to summarize this multivariate structure appropriately in order to establish rules for correctly assigning new observations for which the group membership is not known. The rules used for classification should be as reliable as possible, such that the number of misclassified objects is as small as possible. [Pg.209]

One has to be careful with the use of the misclassification error as a performance measure. For example, assume a classification problem with two groups with prior probabilities p1 = 0.9 and p2 = 0.1, where the available data also reflect the prior probabilities, i.e., n1 ≈ np1 and n2 ≈ np2. A stupid classification rule that assigns all the objects to the first (more frequent) group would have a misclassification error of only about 10%. Thus it can be more advisable to additionally report the misclassification rates per group, which in this case are 0% for the first group but 100% for the second group, which clearly indicates that such a classifier is useless. [Pg.243]
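
The 90/10 example can be reproduced numerically; the sketch below (scikit-learn, simulated labels) contrasts the overall error of the always-predict-group-1 rule with its per-group error rates.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n = 1000
y_true = rng.choice([1, 2], size=n, p=[0.9, 0.1])  # priors p1=0.9, p2=0.1
y_pred = np.ones(n, dtype=int)                     # always predict group 1

overall_error = np.mean(y_pred != y_true)
cm = confusion_matrix(y_true, y_pred, labels=[1, 2])
per_group_error = 1 - cm.diagonal() / cm.sum(axis=1)

print(f"overall misclassification: {overall_error:.1%}")  # ~10%
print(f"group 1 error: {per_group_error[0]:.0%}")         # 0%
print(f"group 2 error: {per_group_error[1]:.0%}")         # 100%
```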

In this example, we apply D-PLS (PLS discriminant analysis, see Section 5.2.2) for the recognition of a chemical substructure from low-resolution mass spectral data. This type of classification problem stood at the beginning of the use of multivariate data analysis methods in chemistry (see Section 1.3). [Pg.254]
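
A minimal D-PLS sketch follows; the actual mass spectra and substructure labels are not available here, so random data with a planted class difference stand in. Class membership is coded as +1/-1, regressed with PLS, and predictions are thresholded at zero.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n, p = 60, 120                  # 60 spectra, 120 m/z intensity features (invented)
X = rng.random((n, p))
y = np.repeat([1, -1], n // 2)  # +1: substructure present, -1: absent
X[y == 1, :10] += 0.5           # plant a weak class difference

# PLS regression on the +/-1 coded class membership.
pls = PLSRegression(n_components=3).fit(X, y)
y_hat = pls.predict(X).ravel()
y_class = np.where(y_hat >= 0, 1, -1)  # threshold at 0 to classify
print("training accuracy:", np.mean(y_class == y))
```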

If it is known from the history of the samples illustrated in Figure 3 that the items represent distinctly different groups or classes, a classification problem can be formulated. These classes may result from samples being derived from different locations or of different species, etc. In this study... [Pg.205]

Classification. To illustrate the use of SIMCA in classification problems, we applied the method to the data for 23 samples of Aroclors and their mixtures (samples 1-23 in Appendix I). In this example, the Aroclor content of the three samples of transformer oil was unknown. Samples 1-4, 5-8, 9-12, and 13-16 were Aroclors 1242, 1248, 1254, and 1260, respectively. Samples 17-20 were 1:1:1:1 mixtures of the Aroclors. Application of SIMCA to these data generated a principal components score plot (Figure 12) that shows the transformer oil is similar, but not... [Pg.216]
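
For orientation, a stripped-down SIMCA-style classifier can be sketched as one PCA model per class plus a residual-distance rule, as below; the Aroclor data themselves are not reproduced, so random stand-in profiles are used.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
classes = {
    "Aroclor_1242_like": rng.normal(0.0, 1.0, (20, 8)),
    "Aroclor_1260_like": rng.normal(3.0, 1.0, (20, 8)),
}  # invented stand-ins for PCB congener profiles

# Fit a separate principal components model for each class.
models = {name: PCA(n_components=2).fit(Xc) for name, Xc in classes.items()}

def simca_assign(x):
    """Assign x to the class whose PCA model reconstructs it with the
    smallest residual distance."""
    best = None
    for name, pca in models.items():
        recon = pca.inverse_transform(pca.transform(x[None, :]))[0]
        resid = np.linalg.norm(x - recon)
        if best is None or resid < best[1]:
            best = (name, resid)
    return best

print(simca_assign(rng.normal(3.0, 1.0, 8)))  # should match the 1260-like class
```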

As in any classification problem, there is a tradeoff between the rate of recall, or proportion of correct substructures detected, and the reliability, or avoidance of false positive assertions. It is rather the exception than the rule for an observation to have a single, unequivocal explanation. When reasonable alternative interpretations are possible, a decision must be made about what to report. At one extreme, all possibilities could be asserted, ensuring 100% recall (i.e. no substructure which is actually present will fail to be detected) at the cost of a high rate of false positives. [Pg.352]
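
The trade-off can be made concrete with a score threshold: lowering it raises recall but also the false-positive rate. The scores below are simulated for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
present = rng.normal(0.7, 0.2, 100)  # scores when the substructure is present
absent = rng.normal(0.4, 0.2, 300)   # scores when it is absent

for thr in (0.1, 0.5, 0.9):
    recall = np.mean(present >= thr)    # fraction of true substructures asserted
    false_pos = np.mean(absent >= thr)  # fraction wrongly asserted
    print(f"threshold {thr}: recall {recall:.0%}, false positives {false_pos:.0%}")
```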

Although the SIMCA method is very versatile, and a properly optimized model can be very effective, one must keep in mind that this method does not use, or even calculate, between-class variability. This can be problematic in special cases where there is strong natural clustering of samples that is not relevant to the problem. In such cases, the inherent interclass distance can be rather low compared to the intraclass variation, thus rendering the classification problem very difficult. Furthermore, from a practical viewpoint, the SIMCA method requires that one must obtain sufficient calibration samples to fully represent each of the J classes. Also, the on-line deployment of a SIMCA model requires a fair amount of overhead, due to the relatively large number of parameters and somewhat complex data processing instructions required. However, there are several current software products that facilitate SIMCA deployment. [Pg.397]

The SIMCA method of pattern recognition is implemented in a comprehensive set of programs for classification, and we have discussed how it works in this regard. Classification problems represent only a few of the types of problems that can be solved with this approach. [Pg.249]









Approaches for Solving the Classification Problem

Classification of Optimization Problems

Classification of Sturm-Liouville Problems

Classification problems, use

Formal Classification of Selectivity Problems

Linearly separable classification problems

Optimal control problems classification

Problem Classification Category

Problems and Opportunities in Classification

Supervised classification problems

Synthesis problems, classification
