Hierarchic criteria

The probable route and structures included are identified using the following hierarchical criteria: 1) direct and shortest structural connection, 2) interconnection of faults, 3) average breakthrough time of tracer, which is directly correlated to tracer concentration, and 5) the location of major and minor feed zones within the respective wells. [Pg.122]

Within the actual framework of long-range softness DFT, the concepts of local and kernel electrophilicity were developed toward the local-to-global hierarchical criteria such as bilocal symmetry, asymptotic behavior, and integral local-to-global relationships. Accordingly, with (4.210) in Eq. (4.236), the associated electrophilicity kernel looks like (Putz & Chattaraj, 2013)... [Pg.229]

As oversimplified cases of the criterion to be used for the clustering of data sets, we may consider high-quality Kohonen maps, PCA plots, or hierarchical clustering. [Pg.208]

I. Bondarenko, H. Van Malderen, B. Treiger, P. Van Espen and R. Van Grieken, Hierarchical cluster analysis with stopping rules built on Akaike's information criterion for aerosol particle classification based on electron probe X-ray microanalysis, Chemom. Intell. Lab. Syst., 22 (1994) 87-95. [Pg.85]

The earliest works attempting to model different length scales of damage in composites were probably those of Halpin [235, 236] and Hahn and Tsai [237]. In these models, they tried to deal with polymer cracking, fiber breakage, interface debonding between the fiber and polymer matrix, and delamination between ply layers. Each of these failure modes was represented by a length-scale failure criterion formulated within a continuum. As such, this was an early form of a hierarchical multiscale method. Later, Halpin and Kardos [238] described the relations of the Halpin-Tsai equations with those of self-consistent methods and the micromechanics of Hill [29],... [Pg.106]

Eqn. (4.31b) has the advantage over eqn. (4.24) of being a continuous criterion rather than a combination of two separate ones used hierarchically. We have seen before that the use of P suffers from the insensitivity of this criterion to changes in the range of high P values (P = 1) and in the range of badly resolved peaks (P = 0). The use of eqn. (4.31b) will eliminate the first problem, but the latter problem will remain. [Pg.151]

A. Guenoche, P. Hansen and B. Jaumard, Efficient algorithms for divisive hierarchical clustering with the diameter criterion, J. Classif., 8 (1991) 5-30. [Pg.365]

A basic question, whether hierarchical or nonhierarchical cluster analysis is used, deals with the correct or best number of groups in a data set. The notion of best relates not only to a criterion value or a large break in a dendrogram, but to the research objectives as well. We cannot resist quoting from Everitt (48) what is probably the ultimate word regarding the number of groups... [Pg.71]

Thus, the singular perturbation generates a hierarchical structure consisting of fast and slow movements. Clearly, we must now ask under what criterion we can justify, based on the flow in Fig. 3, that the flow for a small and positive ε would be the one in Fig. 4. [Pg.345]

Third, establish the relative significance (weight) of each criterion. This is usually accomplished via a set of pairwise comparisons among the different criteria. In each pairwise comparison, two criteria on the same hierarchical level are directly compared, and the decision maker (in this case, the study team) establishes the importance of one criterion relative to the other. All unique pairs of criteria at each level of the hierarchy are compared in this way until every combination has been covered. AHP then translates the pairwise comparison results into a relative weight for each criterion. [Pg.376]
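To make the pairwise-comparison step concrete, here is a minimal Python sketch of the standard AHP prioritization calculation. The 3x3 reciprocal matrix and its Saaty-scale entries are illustrative assumptions, not values from the study team's analysis; the weights come from the principal eigenvector of the matrix, which is the usual AHP procedure.

```python
import numpy as np

# Hypothetical reciprocal pairwise-comparison matrix for three criteria on one
# hierarchy level (Saaty's 1-9 scale): A[i, j] is the importance of criterion i
# relative to criterion j, and A[j, i] = 1 / A[i, j].
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal right eigenvector of A, normalized to sum to 1, gives the
# relative weight of each criterion.
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()
print(weights)                      # roughly [0.65, 0.23, 0.12] for this matrix

# Consistency check: lambda_max close to n indicates consistent judgments.
lambda_max = eigvals.real.max()
consistency_index = (lambda_max - len(A)) / (len(A) - 1)
print(consistency_index)
```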

After this general preselection, it can be advantageous to apply further steps of hierarchical filtering. As mentioned above, this could involve the selection of functional groups inevitably required to anchor a ligand to the most prominent interaction sites. Subsequently, the information from the "hot spot" analysis, translated into a pharmacophore hypothesis, can be used as a matching criterion for a fast database screen. Such tools either involve fast tweak searching (355) or scan over precalculated conformers of the candidate molecules (356). The list of prospective... [Pg.316]

Clustering problems can have numerous formulations depending on the choices of data structure, similarity/distance measure, and internal clustering criterion. This section first describes a very general formulation, then details special cases that correspond to two popular classes of clustering algorithms: partitional and hierarchical. [Pg.135]

In this general formulation of the hierarchical clustering problem, the internal criterion J(t) is calculated recursively from all the subtrees u of t. The value e(u) is sometimes called the level of the subtree u in the dendrogram. In keeping with this interpretation, e is nonincreasing along paths from the root to the leaves. [Pg.138]
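As a sketch of the recursive structure only (the excerpt does not reproduce the chapter's per-subtree terms), a dendrogram can be represented as nested subtrees, each carrying its level e(u), with J(t) accumulated by recursing over them. The `cost` function below is a placeholder for whatever contribution the actual criterion assigns to each subtree.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Subtree:
    """A dendrogram node: leaves have level 0, internal nodes carry e(u)."""
    level: float = 0.0
    left: Optional["Subtree"] = None
    right: Optional["Subtree"] = None

def J(t: Subtree, cost: Callable[[Subtree], float]) -> float:
    # Recursive skeleton: the criterion of a tree is its own contribution plus
    # the criteria of its subtrees. `cost` stands in for the chapter's terms.
    if t.left is None and t.right is None:
        return 0.0
    return cost(t) + J(t.left, cost) + J(t.right, cost)

# A three-leaf dendrogram with merge levels 1.0 (root) and 0.4, so e is
# nonincreasing from root to leaves.
tree = Subtree(1.0, Subtree(0.4, Subtree(), Subtree()), Subtree())
print(J(tree, cost=lambda u: u.level))   # 1.4 with the placeholder cost e(u)
```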

Agglomerative methods, such as single link and complete link, are stepwise procedures. The formulation in (5)-(7) allows us to define the hierarchical clustering problem in terms of combinatorial optimization. To do this, however, we need an appropriate internal clustering criterion. The most obvious is squared error. [Pg.139]
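A minimal sketch of the squared-error criterion for a single partition, assuming Euclidean data and the usual centroid-based definition (the excerpt does not reproduce equations (5)-(7)):

```python
import numpy as np

def within_cluster_sse(X, labels):
    """Within-cluster sum of squared errors: squared Euclidean distances of
    each point to its cluster centroid, summed over all clusters."""
    return sum(
        float(((X[labels == k] - X[labels == k].mean(axis=0)) ** 2).sum())
        for k in np.unique(labels)
    )

# Two tight, well-separated clusters give a small value; lumping everything
# into one cluster gives a much larger one.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
print(within_cluster_sse(X, np.array([0, 0, 1, 1])))   # 0.01
print(within_cluster_sse(X, np.array([0, 0, 0, 0])))   # about 50
```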

We applied three hierarchical clustering methods to the same 32 data sets used in the previous section for partitional clustering: (1) simulated annealing with the criterion in equation (11); (2) simulated annealing with the criterion in equation (12); and (3) Ward's algorithm. Each method generated a separate dendrogram for each data set. [Pg.151]

These results clearly show the importance of the optimization criterion to clustering. The computationally simple Ward's method performs better than the simulated annealing approach with a simplistic criterion. However, a criterion that more correctly accounts for the hierarchy, by minimizing the sum of squared error at each level, performs much better. As with partitional clustering, the application of simulated annealing to hierarchical clustering requires careful selection of the internal clustering criterion. [Pg.151]
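A hedged sketch of the comparison's ingredients in Python: Ward's agglomerative clustering is available in SciPy, and one plausible reading of the "sum of squared error at each level" idea is to cut the resulting dendrogram into k = 2, 3, ... flat clusterings and add up the within-cluster squared error of each cut. The data, the range of k, and that reading of the criterion are assumptions for illustration, not the chapter's equations (11) and (12).

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))            # illustrative data, not one of the 32 sets

Z = linkage(X, method="ward")           # Ward's agglomerative clustering (dendrogram)

def sse(X, labels):
    # within-cluster sum of squared distances to the cluster centroids
    return sum(float(((X[labels == k] - X[labels == k].mean(axis=0)) ** 2).sum())
               for k in np.unique(labels))

# Accumulate the squared error of the flat clustering obtained at each level k.
hierarchical_criterion = sum(
    sse(X, fcluster(Z, t=k, criterion="maxclust")) for k in range(2, 11)
)
print(hierarchical_criterion)
```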

Lastly, we demonstrated the use of simulated annealing on examples from multi-sensor data fusion. These examples showed the effectiveness of simulated annealing in performing both hierarchical and partitional clustering. They also showed the importance of the internal criterion to the results obtained. [Pg.153]

Our results also demonstrated how simulated annealing can help choose the most appropriate internal criterion. In both the partitional and hierarchical cases, clustering performance was dramatically affected by this choice. In the partitional case, our testing results showed that Barker's criterion outperformed within-cluster distance. In fact, the worst Jaccard score for Barker's criterion was better than the average Jaccard score for within-cluster distance. [Pg.153]
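The excerpt does not spell out how the Jaccard scores were computed; a common form for comparing a recovered clustering with a reference partition is the pairwise co-membership Jaccard index, sketched below.

```python
from itertools import combinations

def jaccard_score(labels_a, labels_b):
    """Pairwise Jaccard index between two labelings of the same items:
    |pairs co-clustered in both| / |pairs co-clustered in either|."""
    same_a = {(i, j) for i, j in combinations(range(len(labels_a)), 2)
              if labels_a[i] == labels_a[j]}
    same_b = {(i, j) for i, j in combinations(range(len(labels_b)), 2)
              if labels_b[i] == labels_b[j]}
    union = same_a | same_b
    return len(same_a & same_b) / len(union) if union else 1.0

# e.g. comparing a recovered partition with a reference labeling
print(jaccard_score([0, 0, 1, 1, 2], [0, 0, 1, 2, 2]))   # 0.333...
```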

Applications. Reid et al. (43) performed hierarchical clustering on the micellar CE data described earlier, which resulted, as with PCA, in the distinction of opium samples from four different locations. The authors did not mention the similarity criterion used for the clustering. In the dendrogram (Fig. 13.7), a similarity value of 0.4 was set as the cutoff value to distinguish the different groups. [Pg.303]
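A minimal sketch of applying such a similarity cutoff with SciPy, assuming placeholder data, a correlation-based similarity (so a similarity of 0.4 corresponds to a distance of 1 - 0.4 = 0.6), and average linkage; the excerpt does not state which similarity criterion or linkage Reid et al. used.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
profiles = rng.normal(size=(12, 30))          # placeholder for sample profiles

# Correlation-based similarity turned into a distance (one common convention).
dist = pdist(profiles, metric="correlation")  # 1 - Pearson correlation
Z = linkage(dist, method="average")

# Cutting the dendrogram where similarity = 0.4, i.e. distance = 0.6,
# yields the flat groups read off the tree.
groups = fcluster(Z, t=0.6, criterion="distance")
print(groups)
```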





