Big Chemical Encyclopedia


Alternative variance models

The most robust method for identifying alterations in transcript levels is the Wilcoxon rank-sum test (WRST). This method makes no assumptions about the distribution or the variance of the data to be compared. However, many investigators use the t-test because of its simplicity. Alternatively, a model-based t-test from the GLMM could be used, which performs contrast tests between two groups of interest, such as an effect at a particular dose vs. a vehicle effect, much like a t-test. The values for the test come from the model fit to the data and provide a better estimate of the difference between the two groups, as other data are used to better constrain the real effect of the dose and vehicle under consideration. [Pg.546]
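
As a hedged illustration (the expression values below are invented, not data from the text), the distribution-free WRST and the ordinary t-test can be run side by side with scipy:

```python
# Hypothetical transcript-level data: a vehicle (control) group and a dosed group.
from scipy import stats

vehicle = [1.0, 1.2, 0.9, 1.1, 1.0, 1.3, 0.8, 1.1]
dosed = [1.6, 1.9, 1.4, 2.1, 1.7, 1.5, 2.0, 1.8]

# Wilcoxon rank-sum test: no assumption about distribution or variance.
w_stat, w_p = stats.ranksums(dosed, vehicle)

# Two-sample t-test: assumes (approximately) normal data.
t_stat, t_p = stats.ttest_ind(dosed, vehicle)

print(f"WRST p = {w_p:.4g}, t-test p = {t_p:.4g}")
```

Both tests flag the shift here; they diverge mainly when the data are skewed or heavy-tailed, which is where the WRST's lack of distributional assumptions matters.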

An independent method to identify the stochastic errors of impedance data is described in Chapter 21. An alternative approach has been to use the method of maximum likelihood, in which the regression procedure is used to obtain a joint estimate for the parameter vector P and the error structure of the data. The maximum likelihood method is recommended under conditions where the error structure is unknown, but the error structure obtained by simultaneous regression is severely constrained by the assumed form of the error-variance model. In addition, the assumption that the error-variance model can be obtained by minimizing the objective function ignores the differences among the contributions to the residual errors shown in Chapter 21. Finally, the use of the regression procedure to estimate the standard deviation of the data precludes use of the statistic... [Pg.382]
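
A minimal sketch (not the procedure from the text) of what a joint maximum-likelihood estimate of the parameters and the error structure can look like: a line is fitted while a proportional-error model sigma = alpha·|mu| is estimated from the same regression. The data, the linear model, and the error-model form are all invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 40)
y = (2.0 + 0.5 * x) * (1.0 + 0.05 * rng.standard_normal(x.size))  # ~5 % proportional noise

def neg_log_likelihood(theta):
    a, b, log_alpha = theta
    mu = a + b * x                          # regression model
    sigma = np.exp(log_alpha) * np.abs(mu)  # assumed error-variance model
    return np.sum(np.log(sigma) + 0.5 * ((y - mu) / sigma) ** 2)

fit = minimize(neg_log_likelihood, x0=[1.0, 1.0, -2.0], method="Nelder-Mead")
a_hat, b_hat = fit.x[0], fit.x[1]
alpha_hat = np.exp(fit.x[2])                # estimated proportional-error level
print(a_hat, b_hat, alpha_hat)
```

This illustrates the caveat in the text: the recovered error structure (alpha_hat) is only as good as the assumed form of the error-variance model.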

A third conclusion is that the relative magnitude of the residual variances of the models is a major issue when detecting differences between them. When a model is clearly different from its alternative (the model is much worse because its residual variance is 900%, 100% or 50% higher), Mandel's test will detect this statistically, provided we have a reasonable number of calibrators, around 20 standards (at the 99% confidence level). Recall that the simplified test is independent of the degrees of freedom and will therefore always yield wrong conclusions. [Pg.129]
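
A sketch of the idea behind Mandel's test under one common formulation (the calibration data and noise level are invented): the residual variance of a straight-line fit is compared with that of a quadratic fit through an F-statistic with (1, N-3) degrees of freedom.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
conc = np.linspace(0, 10, 20)                    # ~20 calibrators, as the text suggests
signal = 0.5 + 2.0 * conc + 0.08 * conc**2 + rng.normal(0, 0.3, conc.size)

def residual_variance(degree):
    coef = np.polyfit(conc, signal, degree)
    resid = signal - np.polyval(coef, conc)
    return np.sum(resid**2) / (conc.size - (degree + 1))

s2_lin, s2_quad = residual_variance(1), residual_variance(2)
N = conc.size
F = ((N - 2) * s2_lin - (N - 3) * s2_quad) / s2_quad
p = stats.f.sf(F, 1, N - 3)
print(f"F = {F:.1f}, p = {p:.3g}")   # large F: the straight line is inadequate
```

With ~20 standards and a residual variance several times larger for the worse model, the F-statistic comfortably exceeds the 99% critical value, in line with the conclusion above.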

In the following, a detailed exposition of Bartell's (1961a) theory of steric isotope effects will be given (Section II, A), and an alternative model will be developed, based on somewhat different assumptions about the timing in the transition state, which leads to predictions at variance with the experimental results (Section II, B). In both of these subsections, special reference will be made to the work of Melander and Carter (1964). Finally, a selective, non-comprehensive review of other experimental work in this field will be presented (Section III). [Pg.5]

The administrators or users of the study results must supply the objectives and required precision. Statisticians can develop the models for alternative sampling strategies. The estimates of variance components and costs can come from a number of places ... [Pg.90]

The results of such multiple paired-comparison tests are usually analyzed with Friedman's rank sum test [4] or with more sophisticated methods, e.g. the one using the Bradley-Terry model [5]. A good introduction to the theory and applications of paired-comparison tests is given by David [6]. Since Friedman's rank sum test is based on less restrictive ordering assumptions, it is a robust alternative to two-way analysis of variance, which rests upon the normality assumption. For each panellist (and presentation) the three products are scored, i.e. a product gets a score of 1, 2 or 3 when it is preferred twice, once or not at all, respectively. The rank scores are summed for each product i. One then tests the hypothesis that this result could be obtained under the null hypothesis that there is no difference between the three products and that the ranks were assigned randomly. Friedman's test statistic for this reads... [Pg.425]
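
The scoring scheme just described can be fed directly to Friedman's test; the panellist scores below are invented for illustration.

```python
from scipy import stats

# Each row: one panellist's scores for products A, B, C
# (1 = preferred twice, 2 = preferred once, 3 = never preferred).
scores = [
    (1, 2, 3),
    (1, 3, 2),
    (1, 2, 3),
    (2, 1, 3),
    (1, 2, 3),
    (1, 3, 2),
    (1, 2, 3),
    (1, 2, 3),
]
a, b, c = zip(*scores)
stat, p = stats.friedmanchisquare(a, b, c)
print(f"Friedman chi2 = {stat:.2f}, p = {p:.4f}")
```

Product A is almost always preferred, so the rank sums differ strongly and the null hypothesis of randomly assigned ranks is rejected.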

Importantly, (15a) also caused mild hyperthermia, a side-effect that might well arise with other TRPV1 antagonists and that presumably stems from inactivation of a constitutively active endogenous vanilloid pathway. This model is at variance with knock-out (k.o.) TRPV1 mice, which have normal body temperature. Animals born without a certain receptor, however, frequently adapt by developing alternative pathways. Clearly, these observations are worth further investigation, for example by using conditional TRPV1 k.o. animals. [Pg.158]

The competition model and the solvent interaction model were at one time heatedly debated, but current thinking maintains that under defined conditions the two theories are equivalent; however, it is impossible to distinguish between them on the basis of experimental retention data alone [231,249]. Based on the measurement of solute and solvent activity coefficients, it was concluded that both models operate alternately. At higher solvent B concentrations the competition effect diminishes, since under these conditions the solute molecule can enter the interfacial layer without displacing solvent molecules. The competition model, in its expanded form, is more general and can be used to derive the principal results of the solvent interaction model as a special case. In essence, the end result is the same; only the tenet that surface adsorption or solvent association is the dominant retention interaction remains at variance. [Pg.708]

An alternative to the slope approach to determining the appropriate value of n for use in model calculations is based on a determination of the variance of the response of the actual... [Pg.407]

PCR is an alternative to the more widely used regression method PLS (Section 4.7). PCR is a strictly defined method, and the model often performs very similarly to a PLS model. Usually PCR needs more components than PLS because no information from y is used in computing the PCA scores; this is not necessarily a disadvantage, because more variance of X is considered and the model may gain stability. [Pg.163]
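
A minimal numpy sketch of PCR on synthetic low-rank data: the scores are computed from X alone (y plays no role in that step), and y is then regressed on the first k score vectors.

```python
import numpy as np

rng = np.random.default_rng(2)
T_true = rng.standard_normal((50, 3))                  # latent structure (3 factors)
X = T_true @ rng.standard_normal((3, 10)) + 0.05 * rng.standard_normal((50, 10))
y = T_true @ np.array([1.0, 2.0, 3.0]) + 0.1 * rng.standard_normal(50)

Xc = X - X.mean(axis=0)                                # centre the predictors
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                                  # number of components retained
T = Xc @ Vt[:k].T                                      # PCA scores: y is not used here
b, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
y_hat = T @ b + y.mean()
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"R^2 with {k} PCA components: {r2:.3f}")
```

Because the scores maximize explained variance of X rather than covariance with y, a PLS model of the same data would typically need fewer components for the same fit, which is the trade-off described above.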

In the text which follows we shall examine in numerical detail the decision levels and detection limits for the Fenvalerate calibration data set ("set-B") provided by D. Kurtz (17). In order to calculate said detection limits it was necessary to assign and fit models both to the variance as a function of concentration and to the response (i.e., the calibration curve) as a function of concentration. No simple (2- or 3-parameter) model was found that was consistent with the empirical calibration curve and the replication error, so several alternative simple functions were used to illustrate the approach for calibration-curve detection limits. A more appropriate treatment would require a new design including real blanks and Fenvalerate standards spanning the region from zero to a few times the detection limit. Detailed calculations are given in the Appendix and summarized in Table V. [Pg.58]

Two procedures for improving precision in calibration-curve-based analysis are described. A multiple-curve procedure is used to compensate for poor mathematical models. A weighted least squares procedure is used to compensate for non-constant variance. Confidence-band statistics are used to choose between alternative calibration strategies and to measure precision and dynamic range. [Pg.115]
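
A sketch of the weighted least squares idea on invented calibration data whose noise grows with concentration; np.polyfit accepts per-point weights of the form 1/sigma_i.

```python
import numpy as np

rng = np.random.default_rng(3)
conc = np.linspace(1, 10, 30)
sd = 0.02 * (1 + conc)                       # assumed error model: sd grows with conc
signal = 1.0 + 0.8 * conc + rng.normal(0, sd)

# Ordinary fit (constant-variance assumption) vs weighted fit (w_i = 1/sd_i).
slope_o, icept_o = np.polyfit(conc, signal, 1)
slope_w, icept_w = np.polyfit(conc, signal, 1, w=1.0 / sd)
print(f"OLS: {slope_o:.3f}x + {icept_o:.3f}")
print(f"WLS: {slope_w:.3f}x + {icept_w:.3f}")
```

The weighted fit down-weights the noisy high-concentration points, which mainly tightens the intercept, and hence the low-concentration end where detection decisions are made.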

Generalized Covariance Models. When Z(x) is an intrinsic random function of order k, an alternative to the semi-variogram is the generalized covariance (GC) function of order k. Like the semi-variogram model, the GC model must be a conditionally positive definite function so that the variance of the linear functional of Z(x) is greater than or equal to zero. The family of polynomial GC functions satisfies this requirement. The polynomial GC of order k is... [Pg.216]

In other words, the assurance of quality by measurement of process impurities in the end product has been replaced by assurance of quality through the removal of variance in the process (by continuous monitoring of a continuous process). Naturally, whether on-line process analysis is being used as a surrogate for an alternative off-line technique to measure specific analytes or as a monitor to reduce process variance, it needs calibration and validation. These stages require measurement of process analytes by a reference off-line technique, usually HPLC, and subsequent demonstration that the resulting calibration model has reliable predictive power. [Pg.252]

As remarked in Approach 1, a potential complication with Expectation Model I lies in computing a suitable range of values for the operational risk factor θj. Therefore, an alternative formulation, minimizing variance while adding a target profit constraint, is employed for Expectation Model II ... [Pg.119]

Konno and Yamazaki (1991) proposed a large-scale portfolio optimization model based on mean-absolute deviation (MAD). This serves as an alternative measure of risk to the standard Markowitz mean-variance (MV) approach, which models risk by the variance of the rate of return of a portfolio, leading to a nonlinear convex quadratic programming (QP) problem. Although the two measures are almost equivalent from a mathematical point of view, they differ substantially in computational terms in a few respects, as highlighted by Konno and Wijayanayake (2002) and Konno and Koshizuka (2005). In practice, MAD is used because of its computationally attractive linear property. [Pg.120]
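
A sketch contrasting the two risk measures on an invented return series; for normally distributed returns they are linked by MAD ≈ σ·√(2/π), which is one sense in which the measures are "almost equivalent" mathematically while differing computationally (quadratic vs. linear in the weights).

```python
import numpy as np

rng = np.random.default_rng(4)
returns = rng.normal(0.01, 0.05, size=(250, 3))   # 250 periods, 3 assets (synthetic)
weights = np.array([0.5, 0.3, 0.2])

port = returns @ weights                           # portfolio return series
variance = port.var()                              # Markowitz MV risk: quadratic in weights
mad = np.abs(port - port.mean()).mean()            # Konno-Yamazaki risk: linear, LP-friendly

print(variance, mad, np.sqrt(2 / np.pi) * port.std())
```

Minimizing MAD over the weights can be written with auxiliary variables as a linear program, whereas minimizing the variance requires a QP; that is the computational attraction noted above.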

As we have already seen, there will be settings where the pattern of differences between treatment groups does not conform to proportional hazards, i.e. where the hazard ratio is not a constant, single value. Such situations are best handled by using an alternative model to incorporate baseline factors. The accelerated failure time model is an analysis-of-variance technique which models the survival time itself, but on the log scale ... [Pg.207]
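
A minimal sketch of the accelerated-failure-time idea on invented survival times (censoring, which a real AFT fit handles by maximum likelihood, is ignored here for simplicity): regressing log time on a treatment indicator estimates the factor by which treatment stretches survival.

```python
import numpy as np

rng = np.random.default_rng(5)
treat = np.repeat([0, 1], 50)                    # 0 = control, 1 = treated
# True model: treated survival times stretched by a factor exp(0.7), about 2
log_time = 1.0 + 0.7 * treat + 0.3 * rng.standard_normal(100)
time = np.exp(log_time)

X = np.column_stack([np.ones(100), treat])       # design matrix: intercept + treatment
beta, *_ = np.linalg.lstsq(X, np.log(time), rcond=None)
accel = np.exp(beta[1])                          # estimated time-stretch factor
print(f"estimated acceleration factor: {accel:.2f}")
```

Note the contrast with a hazard ratio: the AFT coefficient acts multiplicatively on the time scale itself, so it remains interpretable even when proportional hazards fails.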

Up to now, the technique of calculations in analysis of variance has been examined in some detail. Now let us briefly consider the theory of analysis of variance, starting with the model for a one-way analysis of variance. Here it is assumed that the columns of data are J random samples from J independent normal populations with means μ1, μ2, ..., μJ and common variance σ². The one-way analysis of variance technique gives us a procedure for testing the hypothesis H0: μ1 = μ2 = ... = μJ against the alternative H1: at least two μi are not equal. The statistical model gives us the structure of each observation in the I×J matrix ... [Pg.72]
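
The one-way procedure just described, applied to invented data with J = 3 groups of equal size:

```python
from scipy import stats

# Three independent samples; group 3 has a shifted mean.
group1 = [5.1, 4.9, 5.3, 5.0, 5.2]
group2 = [5.0, 5.2, 4.8, 5.1, 4.9]
group3 = [6.0, 6.2, 5.9, 6.1, 6.3]

F, p = stats.f_oneway(group1, group2, group3)
print(f"F = {F:.1f}, p = {p:.2e}")   # small p: reject H0 of equal means
```

The F-statistic compares the between-group variance with the common within-group variance σ²; here the shifted third group makes the ratio large and H0 is rejected.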

The conjugated polyene, [18]annulene, IV, is a potential 4m + 2 aromatic system. Because of steric repulsion between hydrogen atoms inside the ring, the molecule is distorted away from planarity. Nevertheless, its NMR shielding effects indicate an induced ring current in the mean molecular plane, once more in line with the decisive role of o-a-m. All evidence derived from the exclusion principle, conformational rigidity, aromaticity, electronic spectra and NMR shielding is therefore consistent with the alternative picture, and at variance with the conventional model. [Pg.222]

Principal component analysis (PCA) and multivariate curve resolution-alternating least squares (MCR-ALS) were applied to the augmented columnwise data matrix D1"1", as shown in Figure 11.16. In both cases, a linear mixture model was assumed to explain the observed data variance using a reduced number of contamination sources. The bilinear data matrix decomposition used in both cases can be written by Equation 11.19 ... [Pg.456]
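
A toy sketch of the bilinear model D ≈ C Sᵀ resolved by alternating least squares with a simple non-negativity constraint, in the spirit of MCR-ALS (synthetic mixture data and a naive random initialization, not the augmented matrix of Figure 11.16):

```python
import numpy as np

rng = np.random.default_rng(6)
C_true = rng.random((30, 2))          # concentration profiles of 2 sources
S_true = rng.random((15, 2))          # spectra of the 2 sources
D = C_true @ S_true.T + 0.01 * rng.standard_normal((30, 15))

C = rng.random((30, 2))               # crude initial guess for the profiles
for _ in range(200):                  # alternate the two conditional LS steps
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0.0, None)
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0.0, None)

lack_of_fit = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"relative residual: {lack_of_fit:.3f}")
```

With two contamination sources the reduced-rank bilinear model explains essentially all of the observed variance, leaving only the noise as residual; real MCR-ALS additionally applies constraints such as unimodality or closure to reduce rotational ambiguity.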

