Big Chemical Encyclopedia

Functional analysis of variance

An experiment involving a complex computer model or code may have tens or even hundreds of input variables and, hence, the identification of the more important variables (screening) is often crucial. Methods are described for decomposing a complex input-output relationship into effects. Effects are more easily understood because each is due to only one or a small number of input variables. They can be assessed for importance either visually or via a functional analysis of variance. Effects are estimated from flexible approximations to the input-output relationships of the computer model. This allows complex nonlinear and interaction relationships to be identified. The methodology is demonstrated on a computer model of the relationship between environmental policy and the world economy. [Pg.308]

The effects (11) are orthogonal with respect to the weight function w(x), leading to a decomposition of the total variance of y(x), called the ANOVA decomposition or functional analysis of variance, as follows,... [Pg.316]
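The equation itself is truncated in this excerpt. For orientation, the usual Hoeffding/Sobol' form of the functional ANOVA decomposition is sketched below; this is the standard form, not necessarily the exact notation of equations (11) and (14) in the source.

```latex
% Standard functional ANOVA decomposition (a sketch of the usual form):
y(\mathbf{x}) = f_0 + \sum_i f_i(x_i) + \sum_{i<j} f_{ij}(x_i, x_j) + \cdots
\qquad\Longrightarrow\qquad
\operatorname{Var}\bigl[y(\mathbf{x})\bigr]
  = \sum_i \operatorname{Var}[f_i] + \sum_{i<j} \operatorname{Var}[f_{ij}] + \cdots
```

The variance splits this way because the effect functions are mutually orthogonal with respect to w(x); each term's share of the total variance is the importance measure discussed next.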

A quantitative measure of the importance of any effect, and hence the associated variables, follows from the percentage contribution of each term on the right-hand side to the total variance on the left. The functional analysis of variance (ANOVA) in (14) goes back at least as far as Hoeffding (1948). [Pg.316]

Using the estimated corrected effects, compute the estimated contributions in the functional analysis of variance (14). [Pg.318]

The functional analysis of variance in (14) is then computed from the estimated corrected effects. Here, the 41 main effects and 820 two-factor-interaction effects together account for about 89% of the total variance of the predictor. Hence, about 11% of the predictor's total variability is due to higher-order effects. Table 1 shows the estimated main effects and interaction effects that contribute at least 1% to the functional ANOVA. These 12 effects together account for about 74% of the total variation. Only six variables appear in these 12 effects; they are described in Table 2. [Pg.319]
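As an illustration of this bookkeeping (not the authors' code), the percentage contributions can be computed by sampling the inputs from the weight function, centring each estimated effect, and dividing each effect's variance by the total. The effect functions below are toy stand-ins for the 41 main effects and 820 two-factor interactions of the real application.

```python
import numpy as np

# Toy stand-ins for estimated corrected effect functions.
def f1(x1):        return np.sin(2 * np.pi * x1)
def f2(x2):        return x2 - 0.5
def f12(x1, x2):   return 0.5 * (x1 - 0.5) * (x2 - 0.5)

rng = np.random.default_rng(0)
n = 200_000
x1, x2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)   # sample from w(x)

effects = {"x1": f1(x1), "x2": f2(x2), "x1:x2": f12(x1, x2)}
effects = {k: v - v.mean() for k, v in effects.items()}   # corrected (centred) effects

total_var = np.var(sum(effects.values()))
for name, e in effects.items():
    print(f"{name:6s}  {100 * np.var(e) / total_var:5.1f}% of total variance")
```

Because these toy effects are orthogonal, the percentages sum to roughly 100%; when higher-order effects are present, the main-effect and two-factor terms account for only part of the total, exactly as in the 89%/11% split described above.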

One-way analysis of variance, 229-230, 230f-231f
Operational model
  derivation of, 54-55
  description of, 45-47, 46f
  function for variable slope, 55
  for inverse agonists, 221
  of agonism, 47f
  orthosteric antagonism, 222
  partial agonists with, 124, 220-221
Opium, 147
Orphan receptors, 180
Orthosteric antagonism... [Pg.297]

The R-squared value, which indicates how well the three chosen parameters account for the variability in the yield, was 84.2%. The analysis of variance indicates that only temperature and pressure (both P-value = 0.026) have a significant impact at the 90% confidence level. The P-value of 0.37 for [NaOH] indicates that, within the parameter space examined, the concentration of NaOH does not significantly affect the cyclohexanone yield. Based on the above equation, one can predict the cyclohexanone yield at any condition within the chosen parameter space. Since [NaOH] does not have a significant effect on the yield, one can fix its value and plot the yield of cyclohexanone as a function of temperature and pressure (Figure 1). [Pg.199]
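A minimal sketch of this kind of screening analysis is shown below. It is not the authors' data or code: the values are synthetic, a first-order model is fitted for brevity (the original used a fitted equation in temperature, pressure, and [NaOH]), and the factor ranges are hypothetical. It only illustrates how R-squared and per-factor P-values are read off an ordinary least-squares fit.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 20
df = pd.DataFrame({
    "T":    rng.uniform(120, 180, n),   # temperature, hypothetical range
    "P":    rng.uniform(5, 20, n),      # pressure, hypothetical range
    "NaOH": rng.uniform(0.1, 1.0, n),   # [NaOH], hypothetical range
})
# Synthetic yield that depends on T and P but not on NaOH,
# mirroring the qualitative finding quoted above.
df["y"] = 0.2 * df["T"] + 1.5 * df["P"] + rng.normal(0, 3, n)

fit = smf.ols("y ~ T + P + NaOH", data=df).fit()
print(f"R-squared: {fit.rsquared:.3f}")
print(fit.pvalues.round(3))   # factors with P < 0.10 are significant at the 90% level
```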

This Worksheet demonstrates using Mathcad's F distribution function and programming operators to conduct an analysis of variance (ANOVA) test. [Pg.210]
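The Mathcad worksheet itself is not reproduced here. An analogous sketch in Python shows the same idea: assemble the one-way ANOVA F statistic from sums of squares and obtain its P-value from the F distribution, then cross-check against a library routine. The group data are hypothetical placeholders.

```python
import numpy as np
from scipy import stats

groups = [
    np.array([4.2, 4.5, 4.1, 4.4]),   # hypothetical replicate measurements
    np.array([4.8, 5.0, 4.9, 5.1]),
    np.array([4.3, 4.6, 4.4, 4.5]),
]
k = len(groups)
n = sum(len(g) for g in groups)
grand = np.concatenate(groups).mean()

ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_b, df_w = k - 1, n - k

F = (ss_between / df_b) / (ss_within / df_w)
p = stats.f.sf(F, df_b, df_w)          # upper tail of the F distribution
print(f"F = {F:.2f}, p = {p:.4f}")
print(stats.f_oneway(*groups))         # cross-check with SciPy's built-in ANOVA
```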

The first is to normalize the data, making them suitable for analysis by our most common parametric techniques, such as analysis of variance (ANOVA). A simple test of whether a selected transformation will yield a distribution of data that satisfies the underlying assumptions for ANOVA is to plot the cumulative distribution of samples on probability paper (that is, a commercially available paper that has the probability-function scale as one axis). One can then alter the scale of the second axis (that is, the axis other than the one on the probability scale) from linear to any other (logarithmic, reciprocal, square root, etc.) and see whether a previously curved line, indicating a skewed distribution, becomes linear, indicating normality. The slope of the transformed line gives an estimate of the standard deviation. If... [Pg.906]
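A rough modern analogue of the probability-paper check is sketched below, with scipy's probplot playing the role of the paper. The data are synthetic and right-skewed, purely for illustration: the raw values plot as a curve, the log-transformed values as an approximately straight line whose slope estimates the standard deviation on that scale.

```python
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
x = rng.lognormal(mean=1.0, sigma=0.6, size=200)   # skewed "raw" data

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
(_, _), (slope_raw, _, _) = stats.probplot(x,        dist="norm", plot=axes[0])
(_, _), (slope_log, _, _) = stats.probplot(np.log(x), dist="norm", plot=axes[1])
axes[0].set_title("raw data: curved, i.e. skewed")
axes[1].set_title(f"log scale: near-linear, slope (≈ SD) = {slope_log:.2f}")
plt.tight_layout()
plt.savefig("probability_plots.png")
```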

The principle of multivariate analysis of variance and discriminant analysis (MVDA) consists in testing the differences between a priori classes (MANOVA) and their maximum separation by modeling (MDA). The variance between the classes will be maximized and the variance within the classes will be minimized by simultaneous consideration of all observed features. The classification of new objects into the a priori classes, i.e. the reclassification of the learning data set of the objects, takes place according to the values of discriminant functions. These discriminant functions are linear combinations of the optimum set of the original features for class separation. The mathematical fundamentals of the MVDA are explained in Section 5.6. [Pg.332]
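An illustrative sketch only, not the MVDA software referred to in the text: linear discriminant analysis in scikit-learn yields discriminant functions that are linear combinations of the original features, and reclassifying the learning data gives the misclassification rate discussed below. The three "classes" are synthetic stand-ins for the a priori groups.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
# Synthetic stand-in for three a priori classes observed on the same features.
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(30, 4))
               for m in ([0, 0, 0, 0], [1.5, 0.5, 0, 0], [0, 1.5, 1.0, 0])])
y = np.repeat(["A", "B", "C"], 30)

lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
scores = lda.transform(X)                       # values of the two discriminant functions
misclassified = (lda.predict(X) != y).mean()    # reclassification of the learning set
print(f"reclassification error on the learning set: {100 * misclassified:.1f}%")
print("discriminant function coefficients:\n", lda.scalings_[:, :2])
```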

Fig. 9-9 shows the results of MVDA for the three investigated territories in the plane of the two computed discriminant functions. The separation line corresponds to the limits of discrimination for the highest probability. The results prove that good separation of the three territories, which have a similar geological background, is possible by means of discriminant analysis. The misclassification rate amounts to 13.0%. The scattering radii of the 5% risk of error of the multivariate analysis of variance overlap considerably. They also demonstrate that the differences in the multivariate data structure of the three territories are only small. [Pg.332]

We view the real or the simulated system as a black box that transforms inputs into outputs. Experiments with such a system are often analyzed through an approximating regression or analysis-of-variance model. Other types of approximating models include those for Kriging, neural nets, radial basis functions, and various types of splines. We call such approximating models metamodels; other names include auxiliary models, emulators, and response surfaces. The simulation itself is a model of some real-world system. The goal is to build a parsimonious metamodel that describes the input-output relationship in simple terms. [Pg.288]
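A minimal sketch of the metamodel idea: treat an expensive simulation as a black box, run it at a small design of input settings, and fit a cheap surrogate, here a Gaussian-process (Kriging) model from scikit-learn. The "simulator" function is a placeholder, not a real code.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def simulator(x):                        # placeholder for the expensive computer model
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(4)
X_train = rng.uniform(0, 1, size=(15, 2))    # small design of experiments
y_train = simulator(X_train)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.3),
                              normalize_y=True).fit(X_train, y_train)

X_new = rng.uniform(0, 1, size=(5, 2))
y_hat, y_sd = gp.predict(X_new, return_std=True)   # cheap predictions with uncertainty
print(np.c_[y_hat, y_sd])
```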

Gu and Wahba (1993) used a smoothing-spline approach with some similarities to the method described in this chapter, albeit in a context where random error is present. They approximated main effects and some specified two-variable interaction effects by spline functions. Their example had only three explanatory variables, so screening was not an issue. Nonetheless, their approach parallels the methodology we describe in this chapter, with a decomposition of a function into effects due to small numbers of variables, visualization of the effects, and an analysis of variance (ANOVA) decomposition of the total function variability. [Pg.311]

Parametric data were presented as mean ± SD. To determine differences in glutamate concentrations, a repeated-measures analysis of variance was performed. The cutaneous sensation, hind-limb motor function, and morphological changes of the spinal cord were analyzed with a non-parametric method (Kruskal-Wallis test) followed by the Mann-Whitney U-test. [Pg.204]
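A sketch of the non-parametric workflow described above, using SciPy: a Kruskal-Wallis test across all groups, followed by pairwise Mann-Whitney U tests. The group data are hypothetical placeholders, not the study's measurements.

```python
import numpy as np
from itertools import combinations
from scipy import stats

groups = {
    "control":   np.array([2.1, 2.4, 2.0, 2.3, 2.2]),
    "treated_1": np.array([2.9, 3.1, 2.8, 3.0, 3.2]),
    "treated_2": np.array([2.5, 2.6, 2.4, 2.7, 2.5]),
}

H, p = stats.kruskal(*groups.values())
print(f"Kruskal-Wallis: H = {H:.2f}, p = {p:.4f}")
if p < 0.05:
    # Follow up with pairwise Mann-Whitney U tests.
    for (name_a, a), (name_b, b) in combinations(groups.items(), 2):
        U, p_pair = stats.mannwhitneyu(a, b)
        print(f"  {name_a} vs {name_b}: U = {U:.1f}, p = {p_pair:.4f}")
```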

The natural logarithm of one-tenth of the throughput was selected as the most convenient function to handle. Residual variances of 0.04489, 0.22450, and 0.08891 were obtained, each with 18 degrees of freedom, for each of the three acid types as before. While not strictly consistent, these are much more so than the simple variable. The analysis of variance on the logarithm of the data is in Table 12.13. [Pg.127]

After fitting, a statistical analysis of variance is applied to the three parameters of the symmetric logistic function, M, a, b. [Pg.1622]

Like the mass flows, the energy flows have been transformed into a growth function, to which the symmetric logistic function is fitted. The parameters of this function, M, a, b, are analysed by means of an analysis of variance. These parameters do not correspond to a production of mass but to a production of energy (kJ). [Pg.1622]
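The exact parameterization used in the source is not reproduced in these excerpts. One common three-parameter symmetric logistic growth curve, assuming M is the asymptote (total mass or energy produced), a a location (time-shift) parameter, and b a rate parameter, is:

```latex
% One common symmetric three-parameter logistic form; an assumption about
% the parameterization, the source's definition of M, a, b may differ.
y(t) = \frac{M}{1 + \exp\!\bigl(-b\,(t - a)\bigr)}
```

The analysis of variance is then carried out on the fitted values of M, a, and b rather than on the raw flows.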

Prediction of the log reduction of an inoculated organism as a function of acid concentration, time, and temperature can also be done with a mathematical model developed for this purpose, using a second-order polynomial equation to fit the data. The following tests justified the reliability of the model: the analysis of variance for the response variable indicated that the model was significant (P < 0.05 and R2 = 0.9493) and had no significant lack of fit (P > 0.05). The assumptions underlying the ANOVA test were also investigated; the normal probability plot of residuals, the plot of residuals versus estimated values of the responses, and the plot of residuals versus random run order demonstrated that the residuals satisfied the assumptions of normality, independence, and randomness (Jimenez et al., 2005). [Pg.235]
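The three residual checks mentioned can be produced along these lines. This is a sketch, not the authors' code: the fitted values, residuals, and run order below are synthetic placeholders standing in for the output of the fitted polynomial model.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(5)
fitted = rng.uniform(1, 6, 30)                  # placeholder fitted log reductions
residuals = rng.normal(0, 0.2, 30)              # placeholder residuals
run_order = rng.permutation(len(residuals))     # randomised run order

fig, axes = plt.subplots(1, 3, figsize=(12, 4))
stats.probplot(residuals, dist="norm", plot=axes[0])   # normality check
axes[0].set_title("normal probability plot")
axes[1].scatter(fitted, residuals)                     # independence / constant spread
axes[1].set(title="residuals vs fitted", xlabel="fitted", ylabel="residual")
axes[2].scatter(run_order, residuals)                  # randomness over run order
axes[2].set(title="residuals vs run order", xlabel="run order", ylabel="residual")
plt.tight_layout()
plt.savefig("residual_diagnostics.png")
```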

Analysis of variance for each dependent variable showed that in almost all cases R2 coefficients higher than 0.83 were obtained (Table 2), which means that the models were able to explain more than 83% of the observed responses. For the rate of gelation, thermal hysteresis, and hardness, the lack-of-fit test was not significant. For Tge and Tm, the lack of fit was significant, which means that the model may not have included all appropriate functions of the independent variables. According to Box and Draper,13 we considered the high R2 coefficients as evidence of the applicability of the model. [Pg.193]

To this point our discussions have largely focused on the application of matrices to linear problems associated with simultaneous equations, applications that commonly arise in least-squares, multiple-regression techniques. One further important function that occurs in multivariate analysis and the analysis of variance is the quadratic form. [Pg.219]
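For reference, the quadratic form associated with a symmetric matrix A and a vector x is the scalar

```latex
Q(\mathbf{x}) = \mathbf{x}^{\mathsf{T}} A\, \mathbf{x}
             = \sum_{i}\sum_{j} a_{ij}\, x_i x_j ,
```

which is exactly the structure of the sums of squares that appear in the analysis of variance (for instance, the residual sum of squares in least squares can be written as y^T (I - H) y, with H the hat matrix).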

