
Confidence statistic approaches

To put equation 44-6 into a usable form under the conditions we wish to consider, we could start from any of several points of view: the statistical approach of Hald (see [10], pp. 115-118), for example, which starts from fundamental probabilistic considerations and also derives confidence intervals (albeit for various special cases only); the mathematical approach (e.g., [11], pp. 550-554); or the Propagation of Uncertainties approach of Ingle and Crouch ([12], p. 548). Inasmuch as any of these starting points will arrive at the same result when done properly, the choice of how to attack an equation such as equation 44-6 is a matter of familiarity, simplicity and, to some extent, taste. [Pg.254]

Alternative methods and algorithms may be used, such as the model-independent approach comparing similarity limits derived from the multivariate statistical distance (MSD), combined with a 90% confidence interval approach for test and reference batches (21). Model-dependent approaches, such as the Weibull function, use the comparison of parameters obtained after curve fitting of dissolution profiles. See Chapters 8 and 9 for further discussion of these methods. [Pg.336]

Confidence Band Statistics. The confidence-band statistical approach is described in texts by Natrella (1) and Miller (2) and in three papers from our laboratory (3-5). A computer program, REGRES (see Appendix), was used to carry out all the computations described in this paper. [Pg.119]

It is not the purpose of this paper, at this moment, to investigate in further detail the reasons for discrepancies in confidence bands or estimated amount intervals; that will be investigated fully at a later time. I do wish to point out that the assumptions one makes about the available information and the statistical approaches one adopts profoundly affect the resultant error calculations. Far from being a staid and dormant subject, the statistical estimation of error is currently being studied very actively, so that scientific workers and citizens alike can be informed about the error in their work. [Pg.193]

There are other statistical approaches that can be used to detect and eliminate outliers. In summary, an outlier should be rejected if it is outside the 95% confidence limit of the regression line. [Pg.123]
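
As an illustration of this rejection rule, the sketch below fits a straight line by least squares and flags points that fall outside the 95% band of the regression. Reading the "confidence limit of the regression line" as a prediction interval for individual points, and the data themselves, are assumptions made for the example.

```python
import numpy as np
from scipy import stats

def flag_regression_outliers(x, y, alpha=0.05):
    """Flag points lying outside the (1 - alpha) prediction band of a straight line."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (intercept + slope * x)
    s = np.sqrt(np.sum(resid ** 2) / (n - 2))              # residual standard deviation
    sxx = np.sum((x - x.mean()) ** 2)
    # half-width of the prediction band at each x value
    half = stats.t.ppf(1 - alpha / 2, n - 2) * s * np.sqrt(1 + 1 / n + (x - x.mean()) ** 2 / sxx)
    return np.abs(resid) > half                             # True marks a candidate outlier

x = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([2.1, 4.0, 6.2, 8.1, 9.9, 15.0, 14.1, 16.0])  # point at x = 6 looks suspicious
print(flag_regression_outliers(x, y))
```

In practice one would typically remove a flagged point, refit the line, and repeat until no further points fall outside the band.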

Step 6 allows us to create a statistical approach for the evaluation of the collected data. Using a statistical test and the statistical parameters selected in Step 6, we will be able to control decision error and make decisions with a certain level of confidence. Decision error, like total error, can only be minimized, but never entirely eliminated. However, we can control this error by setting a tolerable level of risk of an incorrect decision. Conducting Step 6 enables the planning team to specify acceptable probabilities of making an error (the tolerable limits on decision errors). At this step of the DQO process, the project team will address the following issues ... [Pg.23]

Despite the fact that it is difficult to define a relationship between PK parameters and measures of hepatic function, the most appropriate statistical approach is to calculate geometric means and 95% confidence intervals to compare the healthy and impaired groups (see example). Investigating the relationships between hepatic functional abnormalities and selected PK parameters using linear and nonlinear models, in order to derive dose recommendations, is an appropriate alternative, albeit one subject to many constraints. [Pg.696]
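
As a minimal sketch of that comparison, the snippet below computes a geometric mean and its t-based 95% confidence interval on the log scale for each group. The AUC values are invented for illustration, and approximate log-normality of the parameter is assumed.

```python
import numpy as np
from scipy import stats

def geometric_mean_ci(values, level=0.95):
    """Geometric mean and t-based confidence interval, assuming log-normal data."""
    logs = np.log(np.asarray(values, float))
    n = len(logs)
    half = stats.t.ppf(0.5 + level / 2, n - 1) * logs.std(ddof=1) / np.sqrt(n)
    return np.exp(logs.mean()), np.exp(logs.mean() - half), np.exp(logs.mean() + half)

# Hypothetical AUC values (ug*h/mL) for healthy and hepatically impaired subjects
healthy  = [41, 55, 38, 60, 47, 52, 44, 58]
impaired = [78, 95, 69, 110, 84, 91, 102, 73]
for name, grp in [("healthy", healthy), ("impaired", impaired)]:
    gm, lo, hi = geometric_mean_ci(grp)
    print(f"{name}: geometric mean {gm:.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
```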

Numerous reports are available [19,229-248] on the development and analysis of the different procedures for estimating the reactivity ratios from experimental data obtained over a wide range of conversions. These procedures employ different modifications of the integrated form of the copolymerization equation. For example, intersection [19,229,231,235], (KT) [236,240], (YBR) [235], and other [242] linear least-squares procedures have been developed for the treatment of initial polymer composition data. Naturally, the application of non-linear procedures allows one to obtain more accurate estimates of the reactivity ratios. However, the majority of the calculation procedures suffer from the fact that the measurement errors of the independent variable (the monomer feed composition) are not considered. This simplification can lead in certain cases to significant errors in the estimated kinetic parameters [239]. Special methods [238, 239, 241, 247] were developed to avoid these difficulties. One of them, the error-in-variables method (EVM) [239, 241, 247], seems to be the best. EVM implies a statistical approach to the general problem of estimating parameters in mathematical models when the errors in all measured variables are taken into account. Though this method requires more information than do ordinary non-linear least-squares procedures, it provides more reliable estimates of r1 and r2 as well as their confidence limits. [Pg.61]
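
A full EVM treatment as in [239] is beyond a short example, but the idea of weighting errors in all measured variables can be sketched with orthogonal distance regression. Here the instantaneous (Mayo-Lewis) copolymer composition equation is fitted with scipy.odr; the composition data and the error estimates are invented, and the approximate limits built from the parameter standard errors are not the exact EVM joint confidence region.

```python
import numpy as np
from scipy.odr import ODR, Model, RealData

def mayo_lewis(beta, f1):
    """Instantaneous copolymer composition F1 as a function of feed composition f1."""
    r1, r2 = beta
    f2 = 1.0 - f1
    return (r1 * f1**2 + f1 * f2) / (r1 * f1**2 + 2 * f1 * f2 + r2 * f2**2)

# Hypothetical low-conversion composition data with errors in BOTH variables
f1 = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85])     # monomer feed fraction
F1 = np.array([0.18, 0.35, 0.47, 0.57, 0.68, 0.82])     # copolymer fraction
data = RealData(f1, F1, sx=np.full_like(f1, 0.01), sy=np.full_like(F1, 0.02))

fit = ODR(data, Model(mayo_lewis), beta0=[1.0, 1.0]).run()
for name, b, sd in zip(("r1", "r2"), fit.beta, fit.sd_beta):
    print(f"{name} = {b:.3f} +/- {1.96 * sd:.3f} (approx. 95% limits)")
```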

In Eq. (5.21) S is the statistical average of values for many microscopic states. If we were concerned with but a few particles distributed over a few states, the statistical average would not be needed, because we could specify the possible distributions of the particles over the states. However, for large collections of molecules and their many possible quantum states, the statistical approach is mandatory. Indeed, the concept we have used is not appropriate unless large numbers are involved. For example, if but two molecules (instead of N₀) were distributed between the sections, we could not assume with any confidence an equal number of molecules in each section. For a significant fraction of the time there would be two molecules in one section and none in the other. [Pg.453]

The statistical evaluation of bioequivalence studies should be based on confidence interval estimation rather than hypothesis testing (Metzler, 1988, 1989; Westlake, 1988). The 90% confidence interval approach, using 1 − 2α (where α = 0.05), should be applied to the individual parameters of interest (i.e. the pharmacokinetic terms that estimate the rate and extent of drug absorption) (Martinez & Berson, 1998). Graphical presentation of the plasma concentration-time curves for averaged data (test vs. reference product) can be misleading, as the curves may appear to be similar even for drug products that are not bioequivalent. [Pg.85]
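
The 1 − 2α construction can be sketched as follows. A real bioequivalence analysis would use a crossover ANOVA on log-transformed data; the simplified parallel-group calculation below, with invented AUC values, only illustrates how the 90% interval on the test/reference ratio of geometric means is formed and compared with the conventional 80-125% acceptance range.

```python
import numpy as np
from scipy import stats

def ratio_90ci(test, ref, alpha=0.05):
    """90% CI (i.e. 1 - 2*alpha) for the test/reference ratio of geometric means,
    using log-transformed data and a pooled-variance t interval (parallel groups)."""
    lt, lr = np.log(np.asarray(test, float)), np.log(np.asarray(ref, float))
    nt, nr = len(lt), len(lr)
    diff = lt.mean() - lr.mean()
    sp2 = ((nt - 1) * lt.var(ddof=1) + (nr - 1) * lr.var(ddof=1)) / (nt + nr - 2)
    half = stats.t.ppf(1 - alpha, nt + nr - 2) * np.sqrt(sp2 * (1 / nt + 1 / nr))
    return np.exp(diff), np.exp(diff - half), np.exp(diff + half)

# Hypothetical AUC values for test and reference formulations
test = [52, 61, 48, 57, 66, 50, 59, 54]
ref  = [50, 63, 46, 55, 62, 49, 60, 53]
ratio, lo, hi = ratio_90ci(test, ref)
print(f"GMR = {ratio:.3f}, 90% CI ({lo:.3f}, {hi:.3f})")
print("within 80-125% acceptance range:", 0.80 <= lo and hi <= 1.25)
```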

Some definitions are contradictory, meaningless, without benefit, or will cause much expenditure of personnel and measurement capacity, e.g. "Limit of determination: this is the smallest analyte content for which the method has been validated with specific accuracy and precision." Apart from the fact that precision is included in the explanation of accuracy, the definition manifests a fundamental inability to give a definition which is fit for practice. A useful definition of the detection and quantification limits is based on a statistical approach to the confidence hyperbola of a method's calibration curve, as elaborated by the Deutsche Forschungsgemeinschaft [12]... [Pg.161]
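
The calibration-curve approach can be sketched as follows: the decision/detection limit is the concentration whose signal can just be distinguished from the calibration line's prediction band at zero concentration. This is one common formulation (in the spirit of DIN 32645) rather than the exact procedure of [12], and the standards data below are invented.

```python
import numpy as np
from scipy import stats

def calibration_detection_limit(conc, signal, alpha=0.05):
    """Detection limit from the one-sided prediction band of a straight
    calibration line at zero concentration (confidence-hyperbola approach)."""
    x, y = np.asarray(conc, float), np.asarray(signal, float)
    n = len(x)
    slope, intercept = np.polyfit(x, y, 1)
    s_y = np.sqrt(np.sum((y - (intercept + slope * x)) ** 2) / (n - 2))
    sxx = np.sum((x - x.mean()) ** 2)
    # signal that can just be distinguished from the blank with confidence 1 - alpha
    y_crit = intercept + stats.t.ppf(1 - alpha, n - 2) * s_y * np.sqrt(1 + 1 / n + x.mean() ** 2 / sxx)
    return (y_crit - intercept) / slope          # corresponding concentration

conc   = [0.0, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0]     # hypothetical standards
signal = [0.8, 3.1, 5.2, 9.7, 19.5, 29.0, 38.6, 48.9]
print(f"detection limit ~ {calibration_detection_limit(conc, signal):.2f} (conc. units)")
```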

The presently used statistical method is the confidence interval approach. The main concern is to rule out the possibility that the test product is inferior to the comparator pharmaceutical product by more than the specified amount. Hence a one-sided confidence interval (for efficacy and/or safety) may be appropriate. The confidence intervals can be derived from either parametric or nonparametric methods. [Pg.377]

The normal and acceptable statistical approach for analyzing quantitative properties that change over time is to calculate the time it takes for the 95% one-sided confidence limit for the mean degradation curve to intersect the acceptable specification limit. If the data show that batch-to-batch variability is small, it may be worthwhile to combine the data into one overall estimate. This can be done by first applying the appropriate statistical tests to the slopes of the regression lines and zero time intercepts for the individual batches. If the data from the individual batches cannot be combined, the shortest time interval any batch remains within acceptable limits may determine the overall re-test period. [Pg.471]
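
For a single batch, the calculation can be sketched as follows: fit assay versus time by least squares, form the one-sided 95% lower confidence bound for the mean regression line, and locate where it first crosses the specification limit. The data, the assumed linear loss, and the 90%-of-label-claim specification are all illustrative; the batch poolability tests mentioned above are not shown.

```python
import numpy as np
from scipy import stats

def retest_period(months, assay, spec=90.0, alpha=0.05, horizon=60.0):
    """Time at which the one-sided 95% lower confidence bound of the mean
    degradation line crosses the lower specification limit."""
    x, y = np.asarray(months, float), np.asarray(assay, float)
    n = len(x)
    slope, intercept = np.polyfit(x, y, 1)
    s = np.sqrt(np.sum((y - (intercept + slope * x)) ** 2) / (n - 2))
    sxx = np.sum((x - x.mean()) ** 2)
    t_crit = stats.t.ppf(1 - alpha, n - 2)
    grid = np.linspace(0.0, horizon, 6001)
    lower = (intercept + slope * grid
             - t_crit * s * np.sqrt(1 / n + (grid - x.mean()) ** 2 / sxx))
    below = np.nonzero(lower < spec)[0]
    return grid[below[0]] if below.size else horizon

months = [0, 3, 6, 9, 12, 18, 24]
assay  = [100.3, 99.6, 99.1, 98.2, 97.9, 96.5, 95.4]   # hypothetical % of label claim
print(f"estimated re-test period ~ {retest_period(months, assay):.1f} months")
```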

The second and preferred method is to apply appropriate statistical analysis to the dataset, based on linear regression. Both EU and USFDA authorities assume log-linear decline of residue concentrations and apply least-squares regression to derive the fitted depletion line. Then the one-sided upper tolerance limit (95% in the EU and 99% in the USA) with a 95% confidence level is computed. The WhT is the time when this upper one-sided tolerance limit for the residue is below the MRL with 95% confidence. In other words, this definition of the WhT says that at least 95% of the population in the EU (or 99% in the USA) is covered in an average of 95% of cases. It should be stressed that the nominal statistical risk that is fixed by regulatory authorities should be viewed as a statistical protection of farmers who actually observe the WhT, and not as a supplementary safety factor to protect the consumer, even if consumers indirectly benefit from this rather conservative statistical approach. [Pg.92]
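
The underlying computation can be sketched with the EU-style figures (95th percentile covered with 95% confidence). The example below fits a log-linear depletion line and applies an approximate noncentral-t tolerance factor at each time point; the residue data and the MRL are invented, and regulatory software implements the details (and the rounding of the result) somewhat differently.

```python
import numpy as np
from scipy import stats

def withdrawal_time(days, conc, mrl, p=0.95, conf=0.95, horizon=60.0):
    """First time at which the one-sided upper `conf` tolerance limit for the
    `p` quantile of residue concentrations (log-linear depletion) falls below the MRL."""
    t, logc = np.asarray(days, float), np.log(np.asarray(conc, float))
    n = len(t)
    slope, intercept = np.polyfit(t, logc, 1)
    s = np.sqrt(np.sum((logc - (intercept + slope * t)) ** 2) / (n - 2))
    sxx = np.sum((t - t.mean()) ** 2)
    z_p = stats.norm.ppf(p)
    grid = np.linspace(0.0, horizon, 6001)
    h = 1 / n + (grid - t.mean()) ** 2 / sxx                        # leverage at each time
    k = np.sqrt(h) * stats.nct.ppf(conf, n - 2, z_p / np.sqrt(h))   # tolerance factor
    upper = np.exp(intercept + slope * grid + k * s)
    ok = np.nonzero(upper < mrl)[0]
    return grid[ok[0]] if ok.size else None

# Hypothetical muscle residue data (ug/kg) from several animals per slaughter day
days = [3, 3, 3, 7, 7, 7, 14, 14, 14, 21, 21, 21]
conc = [420, 510, 380, 160, 190, 140, 45, 60, 38, 14, 19, 11]
# in practice the result is typically rounded up to the next full day
print("withdrawal time ~", withdrawal_time(days, conc, mrl=100.0), "days")
```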

For milk, EMA/CVMP uses a time to safe concentration (TTSC) approach. TTSC is the first time when the upper 95% confidence limit of the 95th percentile of individual milk sampling times complies with the MRL. The method assumes a log-normal distribution of individual times to safe concentration. If the data set is not suitable for analysis by the TTSC method, alternative statistical approaches may be used. Thus, the distributional assumptions of the USFDA/CVM and EMA/CVMP relate, respectively, to residue concentrations and time to safe concentration. An advantage of the latter approach is that an assumption of log-linear depletion of residues is not made. [Pg.93]

In Table VII, Raymond et al. ( ) analyzed up to 400 vitrinite grains for organic sulfur content both with and without the aid of photomosaics. Using a t-statistic approach they calculated the number of analyses (n) for each run necessary to give a desired maximum variability of 10%, at the 95% confidence level, from the true mean as defined by 100 analyses. As can be seen in Table VII, in no case was it necessary to analyze more than 14 vitrinite areas. The second advantage to analyzing only vitrinite is that Raymond et al. ( ) were able to achieve essentially identical results both with and without the use of photomosaics. Using texture and morphology to identify areas of vitrinite after the sample had been placed in the EPM was as successful as identifying the vitrinite using oil-immersion microscopy prior to analysis. It should be noted that only two of the four...
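
The t-statistic estimate of n behind such a calculation can be sketched as follows: find the smallest n for which the half-width of the 95% confidence interval, t·s/√n, does not exceed 10% of the mean. The relative standard deviation used in the example is invented, not taken from Raymond et al.

```python
import numpy as np
from scipy import stats

def analyses_needed(rel_std, max_rel_error=0.10, conf=0.95, n_max=1000):
    """Smallest n with t(1 - (1-conf)/2, n-1) * RSD / sqrt(n) <= allowed relative error."""
    for n in range(2, n_max + 1):
        half = stats.t.ppf(0.5 + conf / 2, n - 1) * rel_std / np.sqrt(n)
        if half <= max_rel_error:
            return n
    return None

# e.g. a run whose reference analyses gave a relative standard deviation of 15%
print(analyses_needed(rel_std=0.15))   # number of vitrinite areas to analyze
```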
In the statistical approach, such as is possible with the CSD, a large number of observed interaction geometries cluster in a narrow region. A few outliers are deformed and may be repulsive in nature. The larger the number of data points, the greater is the confidence in the intermolecular contact or structural hypothesis under study. As for the outliers, they could furnish an additional bonus in that their occurrence is often indicative of an unusual or different chemical effect. [Pg.72]

The simplistic approach to uncertainty is to assign a value based on a physical characteristic of the instrument or measuring device. However, to assign an accurate value requires a statistical approach—standard deviation or confidence intervals, which are described in the next sections. [Pg.33]
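
Anticipating those sections, a minimal sketch of the statistical route: replicate readings give a standard deviation and a t-based 95% confidence interval for the measured value, rather than relying on the instrument's nominal tolerance. The readings below are invented.

```python
import numpy as np
from scipy import stats

readings = np.array([10.12, 10.08, 10.15, 10.11, 10.09, 10.14])   # hypothetical replicates

mean = readings.mean()
s = readings.std(ddof=1)                                # sample standard deviation
half = stats.t.ppf(0.975, len(readings) - 1) * s / np.sqrt(len(readings))

print(f"mean = {mean:.3f}, s = {s:.3f}")
print(f"95% confidence interval: {mean - half:.3f} to {mean + half:.3f}")
```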

The objective of durability testing is to be able to predict the performance of an actual bonded structure on exposure to normal service conditions, based on the observed behavior of test specimens in accelerated laboratory procedures. Except in a few special cases, this is not possible at present. A number of methods, based on reaction rate theory and statistical approaches, show some progress, but, in general, satisfactory predictive methods have yet to be developed. This is important for the future use of structural adhesives, since the ability to predict performance will increase confidence in the use of structural adhesives as a viable joining method. [Pg.403]

Once the reaction scheme is available, an appropriate rate equation for each step of the network must be found. For this purpose, a very effective sequential model discrimination technique is available, also based on a statistical approach. The basic idea is to design the next experiment at a level of the independent variables such that the expected difference between the responses y of rival models /37/, or between their confidence regions /38,39/, is a maximum (see Fig. 16). [Pg.85]

