Big Chemical Encyclopedia


Regression statistical approach

Besides mid-IR, near-IR spectroscopy has been used to quantitate polymorphs at the bulk and dosage product level. For SC-25469 [34], two polymorphic forms were discovered (α and β), and the β-form was selected for use in the solid dosage form. Since the β-form can be transformed to the α-form under pressure by enantiotropy, quantitation of the β-form in the solid dosage formulation was necessary. Standard mixtures of both forms in the formulation matrix were prepared, and spectra were measured in the near-IR via diffuse reflectance. Using a standard near-IR multiple linear regression statistical approach, the α- and β-forms could be predicted to within 1% of theoretical. This extension of the diffuse reflectance IR technique shows that quantitation of polymorphic forms at the bulk and/or dosage product level can be performed. [Pg.74]

An approach that is sometimes helpful, particularly for recent pesticide risk assessments, is to use the parameter values that result in best fit (in the sense of LS), comparing the fitted cdf to the cdf of the empirical distribution. In some cases, such as when fitting a log-normal distribution, formulae from linear regression can be used after transformations are applied to linearize the cdf. In other cases, the residual SS is minimized using numerical optimization, i.e., one uses nonlinear regression. This approach seems reasonable for point estimation. However, the statistical assumptions that would often be invoked to justify LS regression will not be met in this application. Therefore the use of any additional regression results (beyond the point estimates) is questionable. If there is a need to provide standard errors or confidence intervals for the estimates, bootstrap procedures are recommended. [Pg.43]
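As a hedged sketch of the approach above (synthetic data; the plotting-position convention and bootstrap size are assumptions, not taken from the source), a log-normal can be fitted by least squares on the linearized empirical cdf, with bootstrap standard errors used instead of the usual regression formulae, whose assumptions are not met here:

```python
import math
import random
from statistics import NormalDist

def fit_lognormal_ls(data):
    """Fit (mu, sigma) of a log-normal by linear regression on the
    linearized cdf: inv_Phi(p_i) = (ln x_i - mu) / sigma."""
    n = len(data)
    xs = sorted(data)
    nd = NormalDist()
    ys = [math.log(x) for x in xs]                       # ln x_i
    # Hazen plotting positions avoid p = 0 or 1 at the endpoints.
    zs = [nd.inv_cdf((i + 0.5) / n) for i in range(n)]   # inv_Phi(p_i)
    # Ordinary least squares of ln x on z: slope = sigma, intercept = mu.
    zbar = sum(zs) / n
    ybar = sum(ys) / n
    sxy = sum((z - zbar) * (y - ybar) for z, y in zip(zs, ys))
    sxx = sum((z - zbar) ** 2 for z in zs)
    sigma = sxy / sxx
    mu = ybar - sigma * zbar
    return mu, sigma

def bootstrap_se(data, n_boot=500, seed=0):
    """Bootstrap standard errors for (mu, sigma), as recommended when
    the LS regression assumptions do not hold."""
    rng = random.Random(seed)
    fits = [fit_lognormal_ls([rng.choice(data) for _ in data])
            for _ in range(n_boot)]
    mus = [f[0] for f in fits]
    sigmas = [f[1] for f in fits]
    def se(v):
        m = sum(v) / len(v)
        return math.sqrt(sum((x - m) ** 2 for x in v) / (len(v) - 1))
    return se(mus), se(sigmas)
```

For other distribution families the same idea applies, but the residual sum of squares must be minimized numerically (nonlinear regression) rather than by a linearizing transformation.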

To test the applicability of statistical techniques for determination of the species contributions to the scattering coefficient, a one-year study was conducted in 1979 at China Lake, California. Filter samples of aerosol particles smaller than 2 μm aerodynamic diameter were analyzed for total fine mass, major chemical species, and the time-average particle absorption coefficient, b_abs. At the same time and location, the particle scattering coefficient b_sp was measured with a sensitive nephelometer. A total of 61 samples were analyzed. Multiple regression analysis was applied to the average particle scattering coefficient and the mass concentrations for each filter sample to estimate the coefficients a_i and each species' contribution to light scattering, b_sp,i. Supplementary measurements of the chemical-size distribution were used for theoretical estimates of each b_sp,i as a test of the effectiveness of the statistical approach. [Pg.128]
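The regression step described above can be sketched as follows (toy numbers, not the China Lake data): per-species scattering efficiencies are estimated from the measured scattering coefficient and the species mass concentrations by ordinary least squares, here via the normal equations and without an intercept, which assumes each species' contribution is proportional to its mass:

```python
def ols(X, y):
    """Solve min ||X a - y||^2 via the normal equations X^T X a = X^T y,
    using Gaussian elimination (fine for a handful of species)."""
    k = len(X[0])
    # Augmented normal-equation system: [X^T X | X^T y].
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * yv for r, yv in zip(X, y))] for i in range(k)]
    for col in range(k):                      # forward elimination, pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    a = [0.0] * k
    for i in reversed(range(k)):              # back substitution
        a[i] = (A[i][k] - sum(A[i][j] * a[j]
                              for j in range(i + 1, k))) / A[i][i]
    return a
```

With real data one would also compare the fitted coefficients against theoretical per-species estimates (e.g., from chemical-size distributions), as the study above did.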

There are other statistical approaches that can be used to detect and eliminate outliers. In summary, an outlier should be rejected if it falls outside the 95% confidence limit of the regression line. [Pg.123]
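A minimal sketch of that rejection rule (synthetic data; for simplicity the t quantile is approximated by the normal value 1.96, which is only adequate for reasonably large n):

```python
import math

def flag_outliers(xs, ys, z=1.96):
    """Flag points lying outside the approximate 95% prediction band
    of the least-squares line y = b0 + b1*x."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    b0 = ybar - b1 * xbar
    resid = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
    s = math.sqrt(sum(r ** 2 for r in resid) / (n - 2))  # residual std. error
    flagged = []
    for i, (x, r) in enumerate(zip(xs, resid)):
        half = z * s * math.sqrt(1 + 1 / n + (x - xbar) ** 2 / sxx)
        if abs(r) > half:
            flagged.append(i)
    return flagged
```

In practice the procedure is iterated: after rejecting a point, the line is refitted and the remaining data re-checked.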

The normal and acceptable statistical approach for analyzing quantitative properties that change over time is to calculate the time it takes for the 95% one-sided confidence limit for the mean degradation curve to intersect the acceptable specification limit. If the data show that batch-to-batch variability is small, it may be worthwhile to combine the data into one overall estimate. This can be done by first applying the appropriate statistical tests to the slopes of the regression lines and zero-time intercepts for the individual batches. If the data from the individual batches cannot be combined, the shortest time interval for which any batch remains within acceptable limits may determine the overall re-test period. [Pg.471]
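The calculation above can be sketched as follows (hypothetical assay data in % of label claim; the t quantile is approximated by the one-sided normal value 1.645, whereas a real stability analysis would use the t distribution with n-2 degrees of freedom):

```python
import math

def shelf_life(times, assays, spec=90.0, z=1.645):
    """Regress assay on time, then find where the one-sided 95% lower
    confidence limit for the *mean* response crosses the specification."""
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(assays) / n
    sxx = sum((t - tbar) ** 2 for t in times)
    b1 = sum((t - tbar) * (y - ybar) for t, y in zip(times, assays)) / sxx
    b0 = ybar - b1 * tbar
    s = math.sqrt(sum((y - (b0 + b1 * t)) ** 2
                      for t, y in zip(times, assays)) / (n - 2))

    def lower_limit(t):
        return (b0 + b1 * t
                - z * s * math.sqrt(1 / n + (t - tbar) ** 2 / sxx))

    # Bisection for the crossing time (assumes degradation, b1 < 0).
    lo, hi = 0.0, 1.0
    while lower_limit(hi) > spec:
        hi *= 2
    for _ in range(60):
        mid = (lo + hi) / 2
        if lower_limit(mid) > spec:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

When several batches are poolable, the same calculation is applied to the combined regression; otherwise it is run per batch and the shortest result taken, as the excerpt describes.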

New techniques for data analysis abound in the statistical literature. GAM is a powerful technique, and a full historical account of GAM with ample references can be found in the research monograph of Hastie and Tibshirani (15). GAM is closer to a reparameterization of the model than a reexpression of the response. Once an additive model is fitted to the data, one can plot its p coordinate functions separately to examine the roles of the predictors in modeling the response. With the GAM approach, the dependence of a parameter (P) on covariates (predictors) x1, ..., xp is modeled. Usually, the multiple linear regression (MLR) approach is the method of choice for this type of problem. The MLR model is expressed in the following form ... [Pg.388]
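The formula truncated at the end of this excerpt is presumably the standard MLR form; as a generic sketch (notation assumed here, not taken from the source):

```latex
P = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_p x_p + \varepsilon
```

The GAM generalization replaces each linear term \theta_i x_i with a smooth function f_i(x_i), which is what makes the per-coordinate plots described above possible.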

Finally, a statistical approach, such as stepwise regression (if the endpoint is continuous) or discriminant analysis (if the endpoint is categorical), is used to verify the quality of fit. [Pg.47]

The next covariate screening approach would be to use a regression-based method and take a more rigorous statistical approach to the problem. Using the generalized additive model (GAM) procedure in SAS, a LOESS smooth was applied to the continuous covariates, wherein the procedure was allowed to identify the optimal smoothing parameter for each covariate tested. Two dependent variables were examined: t, and the EBE for CL. To avoid possible skewness in the residuals,...

The second and preferred method is to apply appropriate statistical analysis to the dataset, based on linear regression. Both EU and USFDA authorities assume a log-linear decline of residue concentrations and apply least-squares regression to derive the fitted depletion line. Then the one-sided upper tolerance limit (95% in the EU and 99% in the USA) with a 95% confidence level is computed. The WhT (withdrawal time) is the time when this upper one-sided tolerance limit for the residue falls below the MRL with 95% confidence. In other words, this definition of the WhT says that at least 95% of the population in the EU (or 99% in the USA) is covered in an average of 95% of cases. It should be stressed that the nominal statistical risk fixed by regulatory authorities should be viewed as statistical protection for farmers who actually observe the WhT, not as a supplementary safety factor to protect the consumer, even though consumers indirectly benefit from this rather conservative statistical approach. [Pg.92]

All developments of quantitative structure-activity relationships (QSARs)/quantitative structure-property relationships (QSPRs)/QSDRs go through similar steps: (1) collection of a database of measured values for model development and validation/evaluation; (2) selection of chemical descriptors (these can include connection indices; atom, bond, or functional group counts; and molecular orbital calculations); (3) development of the model (a correlation between the chemical descriptors and the activity/property/degradation values) using a variety of statistical approaches (linear and nonlinear regression, neural networks, partial least squares (PLS), etc. [9]); and (4) validation/evaluation of the model for predictability (usually with a separate set of chemicals other than those used to train the model, i.e., external validation) [10]. [Pg.25]

Once the descriptors have been selected, investigators need to select the statistical approach for developing the QSAR model. This can involve a number of techniques, such as multiple linear regression, partial least squares analysis, neural networks, and a variety of others [9]. These techniques need to be applied to both the training set (model development) and the validation set (assessment of predictability). [Pg.26]
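The train-then-validate workflow above can be sketched with an MLR model (toy descriptor values and activities, not real chemical data; predictivity is summarized here by the RMSE on the held-out set, one common choice among several):

```python
import math

def fit_mlr(X, y):
    """Least squares with an intercept via the normal equations
    (adequate for a small number of descriptors)."""
    Z = [[1.0] + list(row) for row in X]          # prepend intercept column
    k = len(Z[0])
    A = [[sum(r[i] * r[j] for r in Z) for j in range(k)]
         + [sum(r[i] * yv for r, yv in zip(Z, y))] for i in range(k)]
    for c in range(k):                            # elimination with pivoting
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for j in range(c, k + 1):
                A[r][j] -= f * A[c][j]
    b = [0.0] * k
    for i in reversed(range(k)):                  # back substitution
        b[i] = (A[i][k] - sum(A[i][j] * b[j]
                              for j in range(i + 1, k))) / A[i][i]
    return b                                      # [intercept, coef_1, ...]

def rmse(coefs, X, y):
    """Root-mean-square prediction error on a held-out validation set."""
    preds = [coefs[0] + sum(c * x for c, x in zip(coefs[1:], row))
             for row in X]
    return math.sqrt(sum((p - v) ** 2 for p, v in zip(preds, y)) / len(y))
```

The key design point is that the validation chemicals never enter the fit: a low training error with a high validation RMSE signals an overfitted or non-transferable model.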

