
Adaptive wavelet algorithm

If some stopping criterion has been reached, then the algorithm proceeds to Step 10, where the Lawton matrix condition is verified. Provided Conditions 1 and 2 of Section 4 hold, the Lawton matrix condition will fail to be satisfied only in exceptional degenerate cases; thus the Lawton matrix is verified after the adaptive wavelet has been found. Finally, the multivariate statistical procedure can be performed using the coefficients X(τ0). The optimizer used in the adaptive wavelet algorithm is the default unconstrained MATLAB optimizer [12]. [Pg.189]
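A minimal sketch of this flow, assuming hypothetical placeholder functions (criterion, lawton_condition_holds, run_discriminant_analysis) and using scipy's unconstrained minimizer as a stand-in for the default unconstrained MATLAB optimizer [12]:

```python
from scipy.optimize import minimize

def adaptive_wavelet(theta0, X_train, y_train,
                     criterion, lawton_condition_holds, run_discriminant_analysis):
    # Maximize the discriminant criterion over the filter parameters by
    # minimizing its negative with an unconstrained optimizer.
    result = minimize(lambda th: -criterion(th, X_train, y_train), theta0)

    # Step 10: once the optimizer's stopping criterion has been reached,
    # verify the Lawton matrix condition for the filters found (it fails
    # only in exceptional degenerate cases, so it is checked afterwards).
    if not lawton_condition_holds(result.x):
        raise RuntimeError("degenerate filters: Lawton matrix condition violated")

    # Finally, perform the multivariate statistical procedure using the
    # coefficients produced by the optimized filters.
    return run_discriminant_analysis(result.x, X_train, y_train)
```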

Before applying the adaptive wavelet algorithm, the m, q, j0 and τ0 values need to be specified. There is no empirical rule for determining these parameters, and more experimentation is required to find a suitable combination. We can, however, suggest some recommendations. [Pg.189]

Since m determines the number of bands in the DWT and the down-sampling factor, we choose m such that p/m^(j0+1) is an integer value. [Pg.190]

It is important to recall that m combines with q to determine the number of filter coefficients (Nf = m(q + 1)). The larger the value of Nf, the more parameters there are to optimize. For this reason another constraint is placed on m so that Nf does not become too large. We constrain q for similar reasons. In this book we consider setting Nf = 12 and Nf = 16. [Pg.190]
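As an illustration, a small helper (not from the book) can enumerate the (m, q, j0) settings that satisfy both constraints for p = 512. The search ranges, and the exact divisibility condition p/m^(j0+1) reconstructed above, are assumptions for the sketch:

```python
# Enumerate (m, q, j0) settings that satisfy the two constraints above for p = 512:
# Nf = m(q + 1) equals the chosen filter length, and p / m**(j0 + 1) is an integer.
p = 512
for Nf_target in (12, 16):
    for m in range(2, 9):                   # number of bands in the m-band DWT
        for q in range(1, 9):
            if m * (q + 1) != Nf_target:    # filter length Nf = m(q + 1)
                continue
            for j0 in range(1, 6):          # decomposition level
                if p % m ** (j0 + 1) == 0:  # down-sampling leaves integer band lengths
                    print(f"Nf={Nf_target}: m={m}, q={q}, j0={j0}")
```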


The Adaptive Wavelet Algorithm for Designing Task Specific Wavelets... [Pg.177]

Fig. 2 The adaptive wavelet algorithm for designing task specific wavelets.
The adaptive wavelet algorithm outlined in Section 6 can be used for a variety of situations, and its goal is reflected by the particular criterion which is to be optimized. In this chapter, we apply the filter coefficients produced from the adaptive wavelet algorithm to discriminant analysis. It was stated earlier that the dimensionality is reduced by selecting some band(j0, τ0) of wavelet coefficients from the discrete wavelet transform. It then follows that the criterion function will be based on these same coefficients, i.e. X(τ0). [Pg.191]
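Of the discriminant criteria compared later in this section (Wilks' Lambda, symmetric entropy and the CVQPM), Wilks' Lambda is the easiest to state. The following sketch (not the book's code) computes it for a matrix X of band coefficients and a vector y of class labels, both assumed names; smaller values indicate better class separation:

```python
import numpy as np

def wilks_lambda(X, y):
    """Wilks' Lambda = det(W) / det(T), where W is the pooled within-group
    scatter and T the total scatter of the coefficient matrix X (n x d).
    Computed via slogdet for numerical stability."""
    X = np.asarray(X, dtype=float)
    centred = X - X.mean(axis=0)
    T = centred.T @ centred                      # total scatter
    W = np.zeros_like(T)
    for g in np.unique(y):
        Xg = X[np.asarray(y) == g]
        cg = Xg - Xg.mean(axis=0)
        W += cg.T @ cg                           # within-group scatter, pooled
    _, logdet_w = np.linalg.slogdet(W)
    _, logdet_t = np.linalg.slogdet(T)
    return np.exp(logdet_w - logdet_t)
```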

Section 8 applies the adaptive wavelet algorithm to two sets of data in an attempt to further illustrate the mechanics behind the procedures. The first set of data is simulated, whilst the second considers real spectra of various kinds of minerals. The classifier that we use is Bayesian linear discriminant analysis [15]. [Pg.194]

Fig. 4 The CVQPM for the coefficients at initialization and termination of the adaptive wavelet algorithm. Optimization was based on (a) the coefficients X(0) and (b) the...
For the Wilks' Lambda criterion, optimization was based on band(3,3), while the entropy criterion was optimized over band(3,2). The CVQPM criterion was optimized over the scaling band(3,0). One feature we might expect from the adaptive wavelet algorithm is that, at termination, the band on which optimization was based would outperform the other bands, at least in... [Pg.197]

Table 3. The percentage of correctly classified spectra, using the coefficients X(τ) for τ = 0, ..., 3 at initialization and at termination of the adaptive wavelet algorithm. The discriminant criterion functions were Wilks' Lambda, symmetric entropy and the CVQPM.
There are several items regarding the adaptive wavelet algorithm which warrant further discussion. These items are now considered separately. [Pg.199]

Constrained optimization versus unconstrained optimization. In the adaptive wavelet algorithm, it was possible to avoid using constraints which ensured orthogonality. This is due to some clever algebraic factorizations of the wavelet matrix for which much credit is due to [6]. However, one constraint which we have not discussed in very much... [Pg.200]
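The factorization in [6] applies to m-band wavelet matrices. As a simpler illustration of the same idea (and not the factorization actually used in the chapter), the classical two-channel paraunitary lattice maps unconstrained rotation angles to filter coefficients that satisfy the orthogonality conditions automatically, so the optimizer only ever sees free parameters. A minimal sketch:

```python
import numpy as np

def lattice_to_filters(thetas):
    """Map unconstrained angles to an orthogonal two-channel filter bank via the
    classical paraunitary lattice. Each stage is a rotation times a delay, so the
    resulting filters satisfy the orthogonality conditions for any angle values."""
    c, s = np.cos(thetas[0]), np.sin(thetas[0])
    # Polyphase matrix E(z); each entry is an array of coefficients in powers of z^-1.
    E = [[np.array([c]), np.array([-s])],
         [np.array([s]), np.array([c])]]
    for th in thetas[1:]:
        row0 = [np.append(a, 0.0) for a in E[0]]        # first row unchanged (pad length)
        row1 = [np.insert(a, 0, 0.0) for a in E[1]]     # delay second row by z^-1
        c, s = np.cos(th), np.sin(th)
        E = [[c * row0[j] - s * row1[j] for j in range(2)],
             [s * row0[j] + c * row1[j] for j in range(2)]]
    # H0(z) = E00(z^2) + z^-1 E01(z^2): interleave the polyphase components.
    h0 = np.ravel(np.column_stack(E[0]))
    h1 = np.ravel(np.column_stack(E[1]))
    return h0, h1

h0, h1 = lattice_to_filters(np.array([0.3, -1.1, 0.7]))
print(np.isclose(np.sum(h0 * h0), 1.0),            # unit norm for any angles
      np.isclose(np.dot(h0[:-2], h0[2:]), 0.0))     # orthogonal to its even shifts
```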

Validation without an independent test set. In each application so far, the adaptive wavelet algorithm has been applied to a training set and validated using an independent test set. If there are too few observations to allow for independent testing and training data sets, then cross-validation could be used to assess the prediction performance of the statistical method. Should this be the situation, it is necessary to mention that it would be an extremely computationally expensive exercise to implement a full cross-validation routine for the AWA. That is, it would be too time consuming to leave out one observation, build the AWA model, predict the deleted observation, and then repeat this leave-one-out procedure for each observation. In the absence of an independent test set, a more realistic approach would be to perform cross-validation using the wavelet produced at termination of the AWA (as sketched below), but it is important to mention that this would not be a full validation. [Pg.200]
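A minimal sketch of that more realistic approach, assuming the selected band of coefficients obtained with the wavelet fixed at termination of the AWA is already in a matrix X_band with class labels y (both assumed names), and using scikit-learn's linear discriminant analysis as a stand-in for the Bayesian linear discriminant classifier used in the chapter:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

def loo_accuracy_fixed_wavelet(X_band, y):
    """Leave-one-out classification accuracy with the wavelet held fixed.
    The transform is not re-optimized inside the loop, which is exactly why
    this is cheaper than, but not equivalent to, a full validation of the AWA."""
    clf = LinearDiscriminantAnalysis()
    scores = cross_val_score(clf, X_band, np.asarray(y), cv=LeaveOneOut())
    return scores.mean()
```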

This chapter demonstrates how the adaptive wavelet algorithm of Chapter 8 can be implemented in conjunction with classification analysis and regression methods. The data used in each of these applications are spectral data sets where the reflectance/absorbance of substances is measured at regular increments in the wavelength domain. [Pg.437]

Classification criterion functions for the adaptive wavelet algorithm... [Pg.440]

The adaptive wavelet algorithm is applied to three spectral data sets. The dimensionality of each data set is p = 512 variables. The data sets will be referred to as the seagrass, paraxylene and butanol data. The number of training and testing spectra in the group categories is listed in Table 1 for each set of data. [Pg.442]

In this section, we design our own task specific filter coefficients using the adaptive wavelet algorithm of Chapter 8. The idea behind the adaptive wavelet algorithm is to avoid having to decide which set of filter coefficients, and hence which wavelet family, would be best suited to our data. Instead, we design our own wavelets, or more specifically, the filter coefficients which define the wavelet and scaling functions. This is done to suit the task at hand, which in this case is discriminant analysis. [Pg.444]

The discriminant criterion function implemented by the adaptive wavelet algorithm is the CVQPM criterion discussed in Section 1.4. The adaptive wavelet algorithm is applied using several settings of the m, q and j0... [Pg.444]

We will test the performance of the adaptive wavelet algorithm for regression purposes using an independent test set. For this reason we have decided to formulate an R-squared measure for the test set, which is denoted by R...

A suitable criterion function for regression analysis should reflect how well the response values are predicted. In the adaptive wavelet algorithm, the criterion function considered for regression is based on the PRESS statistic and is then converted to a leave-one-out cross-validated R-squared measure as follows... [Pg.452]
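The standard construction of such a measure (not necessarily the exact formula used in the book) is R2_cv = 1 - PRESS / sum_i (y_i - ybar)^2, where PRESS is the sum of squared leave-one-out prediction errors. For an ordinary least-squares fit these errors have the closed form e_i / (1 - h_ii), which the sketch below uses; the function name and arguments are assumptions for illustration:

```python
import numpy as np

def cross_validated_r2(X, y):
    """Leave-one-out cross-validated R-squared for an OLS fit, via PRESS.
    X: (n, d) predictor matrix, y: (n,) response. A column of ones is added
    for the intercept, and the closed-form LOO residuals e_i / (1 - h_ii)
    avoid refitting the model n times."""
    y = np.asarray(y, dtype=float)
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    H = X1 @ np.linalg.pinv(X1.T @ X1) @ X1.T       # hat matrix
    press = np.sum((resid / (1.0 - np.diag(H))) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - press / ss_tot
```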


See other pages where Adaptive wavelet algorithm is mentioned: [Pg.177], [Pg.178], [Pg.178], [Pg.189], [Pg.189], [Pg.194], [Pg.441], [Pg.453]
See also in source: [Pg.177, Pg.189, Pg.194, Pg.199, Pg.200, Pg.440, Pg.442]







