
Decision-tree clustering

As we shall see in Sections 15.2.4 and 16.4.1, decision-tree clustering is a key component of a number of synthesis systems. In these, the decision tree is not seen so much as a clustering algorithm, but rather as a mapping or function from the discrete feature space to the acoustic space. As this is fully defined for every possible feature combination, it provides a general mechanism for generating acoustic representations from linguistic ones. [Pg.468]

Decision-tree clustering is used to merge parameters for low- and zero-occupancy... [Pg.472]

Kohonen networks, conceptual clustering, Principal Component Analysis (PCA), decision trees, Partial Least Squares (PLS), Multiple Linear Regression (MLR), counter-propagation networks, back-propagation networks, genetic algorithms (GA)... [Pg.442]

Monothetic divisive clustering has largely been ignored, although a closely related classification method has seen application and development. This method is recursive partitioning, a type of decision-tree method. [Pg.17]

The impurity of a cluster can be formulated in a number of ways, as we shall see in Section 15.1.9. Here even a simple measure will do, for example the ratio of word-1 to word-2. The stopping criterion usually involves specifying a minimum decrease in impurity. The decision tree gets round the problem of modelling all the feature combinations by effectively clustering certain feature combinations together. This is not always ideal, but it certainly does allow for more accurate modelling than naive Bayes, as the same feature can appear in different parts of the tree and have a different effect on the outcome. [Pg.89]
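
As a minimal sketch of the impurity idea described above, the following Python uses one simple two-class measure (the fraction of the minority class, a stand-in for the word-1/word-2 ratio mentioned in the text) and computes the decrease in impurity obtained by a candidate split. Function names and the exact measure are illustrative choices, not the formulation from Section 15.1.9.

```python
from collections import Counter

def impurity(labels):
    """Simple two-class impurity: fraction of the minority class.

    labels: class labels (e.g. "word-1" / "word-2") for the items
    currently in the cluster.  0.0 means the cluster is pure.
    """
    if not labels:
        return 0.0
    counts = Counter(labels)
    return 1.0 - max(counts.values()) / len(labels)

def impurity_decrease(parent, yes_side, no_side):
    """Weighted decrease in impurity obtained by splitting `parent`
    into `yes_side` and `no_side` (all three are lists of labels)."""
    n = len(parent)
    weighted = (len(yes_side) * impurity(yes_side) +
                len(no_side) * impurity(no_side)) / n
    return impurity(parent) - weighted
```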

We solve this by making use of some of the common properties of phones, and the most common way of doing this is to use the phones' distinctive features (see Section 7.4.3). In doing so, we are for instance positing that phones which share the same place of articulation may have more similar acoustic realisations than ones which don't. The most common way of performing this feature-based clustering is to use a decision tree; the clever thing about this is that, while we... [Pg.464]

Stopping criteria usually involve specifying a minimum decrease in impurity and requiring that clusters have a minimum occupancy (say, 10 data points). The process of examining and splitting clusters is shown in Figures 15.9, 15.10, and 15.11. A trained decision tree is shown in Figure 15.12. [Pg.467]
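
The greedy growing loop implied by these stopping criteria can be sketched as follows, reusing the impurity functions from the sketch above. The thresholds, the question set, and the data format are illustrative assumptions rather than values from the text.

```python
MIN_DECREASE = 0.01   # illustrative minimum decrease in impurity
MIN_OCCUPANCY = 10    # minimum items allowed in any resulting cluster

def best_split(items, questions, label_of):
    """Try every yes/no question and return the one giving the largest
    admissible decrease in impurity, or None if no such split exists.

    items     : data points in the current cluster
    questions : predicates item -> bool
    label_of  : function giving the class label used by the impurity measure
    """
    parent_labels = [label_of(x) for x in items]
    best = None
    for q in questions:
        yes = [x for x in items if q(x)]
        no = [x for x in items if not q(x)]
        if len(yes) < MIN_OCCUPANCY or len(no) < MIN_OCCUPANCY:
            continue  # would leave a low-occupancy cluster
        gain = impurity_decrease(parent_labels,
                                 [label_of(x) for x in yes],
                                 [label_of(x) for x in no])
        if gain >= MIN_DECREASE and (best is None or gain > best[0]):
            best = (gain, q, yes, no)
    return best

def grow(items, questions, label_of):
    """Recursively split clusters until the stopping criteria are met."""
    split = best_split(items, questions, label_of)
    if split is None:
        return {"leaf": items}      # stopping criteria met: keep as a cluster
    _, q, yes, no = split
    return {"question": q,
            "yes": grow(yes, questions, label_of),
            "no": grow(no, questions, label_of)}
```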

An important point about the decision tree grown in this way is that it provides a cluster for every feature combination, not just those encountered in the training data. To see this, consider the tree in Figure 15.12. One branch of this has the feature set... [Pg.467]
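
The following sketch shows why this coverage property holds: any feature bundle answers every yes/no question on some root-to-leaf path, so every combination, seen in training or not, ends at a leaf cluster. The node structure, feature names, and tree are illustrative, not those of Figure 15.12.

```python
class Node:
    """Internal nodes ask a yes/no question about the feature bundle;
    a leaf holds the cluster (acoustic parameters or units) it represents."""
    def __init__(self, question=None, yes=None, no=None, leaf=None):
        self.question, self.yes, self.no, self.leaf = question, yes, no, leaf

def find_cluster(node, features):
    """Descend from the root, answering each question from `features`.
    Every feature combination ends at some leaf."""
    while node.leaf is None:
        node = node.yes if node.question(features) else node.no
    return node.leaf

# Illustrative tree asking questions about hypothetical distinctive features.
tree = Node(question=lambda f: f["place"] == "bilabial",
            yes=Node(leaf="cluster-A"),
            no=Node(question=lambda f: f["voiced"],
                    yes=Node(leaf="cluster-B"),
                    no=Node(leaf="cluster-C")))

# A feature combination never seen in training still maps to a cluster.
print(find_cluster(tree, {"place": "velar", "voiced": True}))  # cluster-B
```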

We have some degree of choice as to how we actually use the decision tree to form the target function. Firstly, we could just accept the units in the chosen cluster as candidates and leave the process at that. An addition to this is to score each unit in terms of its distance from the cluster mean [54], in an attempt to reward typical units within the cluster. If the cluster size is small, however, these approaches may lead to only a small number of units being used as candidates. Alternatives include going back up the tree and accepting leaves which are... [Pg.506]
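
A sketch of these candidate-selection options follows, assuming each unit is represented by an acoustic feature vector. The distance measure, the minimum-candidate threshold, and the units_below() interface are illustrative assumptions, not details taken from the text.

```python
import numpy as np

def score_candidates(leaf_units):
    """Score every unit in the chosen cluster by its distance from the
    cluster mean, so that more typical units can be rewarded [54].
    Returns (unit index, distance) pairs, most typical first."""
    units = np.asarray(leaf_units, dtype=float)   # shape (n_units, dim)
    mean = units.mean(axis=0)
    dists = np.linalg.norm(units - mean, axis=1)
    return [(int(i), float(dists[i])) for i in np.argsort(dists)]

def candidates_with_backoff(path_to_leaf, min_candidates=20):
    """If the chosen leaf holds too few units, walk back up the tree and
    pool the units under progressively higher nodes until enough
    candidates are available.

    path_to_leaf: nodes from root to the chosen leaf; each is assumed to
    have a units_below() method returning all units under it (an
    illustrative interface)."""
    for node in reversed(path_to_leaf):
        units = node.units_below()
        if len(units) >= min_candidates:
            return score_candidates(units)
    return score_candidates(path_to_leaf[0].units_below())
```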

Cambridge University and IBM: The HMM system developed by Rob Donovan, initially at Cambridge University [140] and then at IBM, is notable as one of the systems independent from the ATR family. It was based on Cambridge University's HTK ASR system, and used decision trees to segment and cluster state-sized units [138], [150], [196]. Particularly interesting recent developments have concerned expressiveness and emotion in text-to-speech [151], [195]. [Pg.526]


