Generalized Least Squares (GLS) Estimation

In this case we minimize a weighted sum of squared errors (SSE) with non-constant weights; the user-supplied weighting matrices differ from experiment to experiment. [Pg.15]
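Written out explicitly (in notation chosen here for illustration, not quoted from the source), the objective minimized over the parameter vector k for N experiments is

\[
S_{\mathrm{GLS}}(\mathbf{k}) \;=\; \sum_{i=1}^{N} \bigl[\hat{\mathbf{y}}_i - \mathbf{f}(\mathbf{x}_i,\mathbf{k})\bigr]^{\mathsf T}\, \mathbf{Q}_i\, \bigl[\hat{\mathbf{y}}_i - \mathbf{f}(\mathbf{x}_i,\mathbf{k})\bigr],
\]

where \(\hat{\mathbf{y}}_i\) is the measured response vector of the i-th experiment, \(\mathbf{f}(\mathbf{x}_i,\mathbf{k})\) the corresponding model prediction, and \(\mathbf{Q}_i\) the user-supplied weighting matrix for that experiment.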

Of course, it is not at all clear how one should select the weighting matrices \(\mathbf{Q}_i\), i = 1, ..., N, even for cases where a constant weighting matrix Q is used. Practical guidelines for the selection of Q can be derived from Maximum Likelihood (ML) considerations. [Pg.15]
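Under the usual ML argument with normally distributed measurement errors (a standard result stated here for completeness, not quoted from the source), each experiment is weighted by the inverse of its residual covariance matrix,

\[
\mathbf{Q}_i = \bigl[\mathrm{COV}(\mathbf{e}_i)\bigr]^{-1} = \boldsymbol{\Sigma}_i^{-1},
\]

so a constant weighting matrix Q is appropriate when the error covariance is the same for all experiments.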


The particular choice of a residual variance model should be based on the nature of the response function. Sometimes φ is unknown and must be estimated from the data. Once a structural model and a residual variance model are chosen, the question becomes how to estimate θ, the structural model parameters, and φ, the residual variance model parameters. One commonly advocated method is the method of generalized least-squares (GLS). First it will be assumed that φ is known and then that assumption will be relaxed. In the simplest case, assume that θ is known, in which case the weights are given by... [Pg.132]
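As an illustration of this scheme (a minimal sketch, not taken from the source; the structural model, the simulated data, and the power-of-the-mean variance function Var(e_i) = sigma^2 * f(x_i; theta)^(2*phi) are all assumptions made for the example), the GLS cycle of fitting theta, estimating phi from the residuals, and refitting with the implied weights might look like:

    import numpy as np
    from scipy.optimize import curve_fit

    def f(x, a, b):
        # Hypothetical structural model f(x; theta) with theta = (a, b).
        return a * x / (b + x)

    rng = np.random.default_rng(0)
    x = np.linspace(0.5, 10.0, 40)
    y = f(x, 5.0, 2.0) * (1 + 0.1 * rng.standard_normal(x.size))   # simulated data

    # Step 1: unweighted (OLS) fit gives a first estimate of theta.
    theta, _ = curve_fit(f, x, y, p0=[1.0, 1.0])

    # Step 2: estimate the variance-model parameter phi from the OLS residuals,
    # assuming Var(e_i) = sigma^2 * f(x_i; theta)**(2*phi):
    # regress log|residual| on log(predicted value).
    pred = f(x, *theta)
    resid = y - pred
    phi = np.polyfit(np.log(pred), np.log(np.abs(resid) + 1e-12), 1)[0]

    # Step 3: refit theta with weights 1/g^2, where g = f**phi, and iterate.
    for _ in range(3):
        g = f(x, *theta) ** phi
        theta, _ = curve_fit(f, x, y, p0=theta, sigma=g)

    print("GLS estimate of theta:", theta, " variance exponent phi:", phi)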

The model estimates were obtained using FOCE-I. Of interest was how these values would compare under different estimation algorithms, so the model was also estimated using FO-approximation, FOCE, Laplacian, and generalized least-squares (GLS). The results are shown in Table 9.18. FOCE and Laplacian produced essentially the same results, as did FOCE-I and GLS. Conditional methods tend to produce different results from... [Pg.334]

Two implementations are contrasted: FORTRAN source code in which the maximum likelihood is evaluated with one of two different first-order expansions (FO or FOCE) or with a second-order expansion about the conditional estimates of the random effects (Laplacian); and an S-PLUS algorithm utilizing a generalized least-squares (GLS) procedure and a Taylor series expansion about the conditional estimates of the interindividual random effects. [Pg.329]

Least squares (LS) estimation minimizes the sum of squared deviations, comparing observed values to the values predicted by a curve with particular parameter values. Weighted LS (WLS) can take into account differences in the variances of the residuals; generalized LS (GLS) can take into account covariances of the residuals as well as differences in their variances. Cases of LS estimation include the following ... [Pg.35]
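To make the distinctions concrete (notation introduced here for illustration, not quoted from the source), with residuals \(e_i = y_i - \hat{y}_i(\theta)\) collected in the vector \(\mathbf{e}\), the three objective functions can be written as

\[
S_{\mathrm{OLS}}(\theta)=\sum_i e_i^2, \qquad
S_{\mathrm{WLS}}(\theta)=\sum_i w_i\, e_i^2, \qquad
S_{\mathrm{GLS}}(\theta)=\mathbf{e}^{\mathsf T}\mathbf{V}^{-1}\mathbf{e},
\]

where the \(w_i\) are known weights and \(\mathbf{V}\) is the residual covariance matrix; WLS is the special case of GLS in which \(\mathbf{V}\) is diagonal.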

Within NONMEM, a generalized least-squares-like (GLS-like) estimation algorithm can be developed by iterating separate, sequential models. In the first step, the model is fit using one of the estimation algorithms (FO-approximation, FOCE, etc.) and the individual predicted values are saved in a data set formatted the same as the input data set, i.e., the output data set contains the original data set plus one more variable: the individual predicted values. The second step then models the residual error based on the individual predicted values from the previous step. So, for example, suppose the residual error was modeled as a proportional error model... [Pg.230]
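A schematic version of this two-step procedure for the proportional-error case (an illustrative Python sketch under assumed data and a simple exponential model, not NONMEM code or output) is:

    import numpy as np
    from scipy.optimize import curve_fit

    def model(t, c0, k):
        # Hypothetical structural model standing in for the population model.
        return c0 * np.exp(-k * t)

    rng = np.random.default_rng(1)
    t = np.linspace(0.5, 12.0, 24)
    y = model(t, 10.0, 0.3) * (1 + 0.15 * rng.standard_normal(t.size))  # simulated data

    # Step 1: fit with any estimation method and keep the individual predicted
    # values (the extra IPRED column written alongside the original data set).
    theta1, _ = curve_fit(model, t, y, p0=[5.0, 0.1])
    ipred = model(t, *theta1)

    # Step 2: model the residual error from the *fixed* IPRED values of step 1.
    # For a proportional error model, Var(eps) = (sigma * IPRED)^2, so the
    # residual standard deviation passed to the fit is proportional to IPRED.
    theta2, _ = curve_fit(model, t, y, p0=theta1, sigma=ipred)

    print("step 1 estimates:", theta1, " GLS-like step 2 estimates:", theta2)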

