Big Chemical Encyclopedia


General Statistical Approaches

Where applicable, an appropriate statistical method should be employed to analyze the long-term primary stability data in an original application. The purpose of this analysis is to establish, with a high degree of confidence, a retest period or shelf life during which a quantitative attribute will remain within acceptance criteria for all future batches manufactured, packaged, and stored under similar circumstances. This same method could also be applied to commitment batches to verify or extend the originally approved retest period or shelf life. [Pg.72]

Regression analysis is considered an appropriate approach to evaluating the stability data for a quantitative attribute and establishing a retest period or shelf life. The nature of the relationship between an attribute and time will determine whether data should be transformed for linear regression analysis. Usually the relationship can be represented by a linear or nonlinear function on an arithmetic or logarithmic scale. Sometimes a nonlinear regression can be expected to better reflect the true relationship. [Pg.72]

An appropriate approach to retest period or shelf-life estimation is to analyze a quantitative attribute by determining the earliest time at which the 95% confidence limit for the mean around the regression curve intersects the proposed acceptance criterion. [Pg.72]

For an attribute known to decrease with time, the lower one-sided 95% confidence limit should be compared with the acceptance criterion. For an attribute known to increase with time, the upper one-sided 95% confidence limit should be compared with the criterion. For an attribute that can either increase or decrease, or whose direction of change is not known, two-sided 95% confidence limits should be calculated and compared with the upper and lower acceptance criteria. [Pg.72]

The statistical method used for data analysis should take into account the stability study design to provide a valid statistical inference for the estimated retest period or shelf life. The approach described above can be used to estimate the retest period or shelf life for a single batch or for multiple batches when combined after an appropriate statistical test. [Pg.72]
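The estimation procedure described above can be sketched numerically: fit a linear regression to long-term assay data, compute the lower one-sided 95% confidence limit for the mean, and find the earliest time at which it crosses the acceptance criterion. All data below are invented for illustration, and the criterion (95% of label claim) is an assumption:

```python
import numpy as np
from scipy import stats

# Hypothetical long-term stability data: assay (% of label claim) vs. months
t = np.array([0, 3, 6, 9, 12, 18, 24], dtype=float)
y = np.array([100.2, 99.5, 99.1, 98.4, 97.9, 96.8, 95.9])

n = len(t)
slope, intercept, r, p, se = stats.linregress(t, y)
resid = y - (intercept + slope * t)
s = np.sqrt(np.sum(resid**2) / (n - 2))        # residual standard error
t_crit = stats.t.ppf(0.95, df=n - 2)           # one-sided 95% t value

# Lower one-sided 95% confidence limit for the mean response on a time grid
grid = np.linspace(0, 60, 6001)
se_mean = s * np.sqrt(1/n + (grid - t.mean())**2 / np.sum((t - t.mean())**2))
lcl = intercept + slope * grid - t_crit * se_mean

criterion = 95.0                               # assumed acceptance criterion
shelf_life = grid[lcl < criterion][0]          # earliest crossing time
print(f"estimated shelf life ~ {shelf_life:.1f} months")
```

For an attribute known to increase with time, the upper one-sided limit would be used instead; pooling several batches into one regression would first require an appropriate statistical test for poolability.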


In the rough experimental space the distance between discrete levels of the experimental variables is relatively large. After testing three to four catalyst generations, different data processing methods, such as general statistical approaches or Artificial Neural Networks (ANNs), can be applied to determine the contribution of each variable to the overall performance or to establish the Activity-Composition Relationship (ACR). [Pg.304]

As can be seen from Figure 4, LBVs for these components are not constant across the ranges of composition. An interaction model has been proposed (60) which assumes that the lack of linearity results from the interaction of pairs of components. An approach which focuses on the difference between the weighted linear average of the components and the actual octane number of the blend (bonus or debit) has also been developed (61). The independent variables in this type of model are statistical functions (averages, variances, etc.) of blend properties such as octane, olefins, aromatics, and sulfur. The general statistical problem has been analyzed (62) and the two approaches have been shown to be theoretically similar though computationally different. [Pg.188]
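A minimal sketch of the bonus-or-debit idea: the blend octane predicted from a volume-weighted linear average of component blending values is compared with the measured octane of the blend. The volume fractions, LBVs, and measured value below are all invented for illustration:

```python
# Hypothetical volumetric blend of three gasoline components
fractions = [0.5, 0.3, 0.2]           # volume fractions (sum to 1)
lbv = [92.0, 88.5, 95.0]              # hypothetical linear blending values (octane)

# Volume-weighted linear average octane of the blend
linear_octane = sum(v * o for v, o in zip(fractions, lbv))

# The "bonus or debit" is the deviation of the measured blend octane
# from this linear prediction (the measured value here is invented)
measured = 91.8
bonus = measured - linear_octane
print(f"linear prediction: {linear_octane:.2f}, bonus/debit: {bonus:+.2f}")
```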

In general, tolerance stack models are based on either the worst case or statistical approaches, including those given in the references above. The worst case model (see equation 3.1) assumes that each component dimension is at its maximum or minimum limit and that the sum of these equals the assembly tolerance (this model was initially presented in Chapter 2). The tolerance stack equations are given in terms of bilateral tolerances on each component dimension, which is a common format when analysing tolerances in practice. The worst case model is ... [Pg.113]
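The contrast between the worst case stack and the common statistical (root-sum-square) stack can be sketched for a hypothetical set of bilateral tolerances; the dimensions and values below are invented:

```python
import math

# Hypothetical bilateral tolerances (+/- t_i) on four stacked component dimensions
tolerances = [0.05, 0.10, 0.02, 0.08]   # mm

# Worst case model: every dimension sits at its extreme limit simultaneously,
# so the assembly tolerance is the simple sum of the component tolerances
t_worst = sum(tolerances)

# Statistical (root-sum-square) model: independent, centred variations
t_rss = math.sqrt(sum(t**2 for t in tolerances))

print(f"worst case: +/-{t_worst:.3f} mm, RSS: +/-{t_rss:.3f} mm")
```

The RSS result is always tighter than the worst case sum, which is why the statistical approach is less conservative when the independence assumption holds.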

The other class of phenomenological approaches subsumes the random surface theories (Sec. B). These reduce the system to a set of internal surfaces, supposedly filled with amphiphiles, which can be described by an effective interface Hamiltonian. The internal surfaces represent either bilayers or monolayers—bilayers in binary amphiphile—water mixtures, and monolayers in ternary mixtures, where the monolayers are assumed to separate oil domains from water domains. Random surface theories have been formulated on lattices and in the continuum. In the latter case, they are an interesting application of the membrane theories which are studied in many areas of physics, from general statistical field theory to elementary particle physics [26]. Random surface theories for amphiphilic systems have been used to calculate shapes and distributions of vesicles, and phase transitions [27-31]. [Pg.639]

One disadvantage of statistical approaches is that they rely on two of the assumptions stated in the introduction, namely, that reactions follow the minimum energy path to each product channel, and that the reactive flux passes through a transition state. Several examples in Section V violate one or both of these assumptions, and hence statistical methods generally cannot treat these instances of competing pathways [33]. [Pg.226]

In general, each form of ideal flow can be characterized exactly mathematically, as can the consequences of its occurrence in a chemical reactor (some of these are explored in Chapter 2). This is in contrast to nonideal flow, a feature which presents one of the major difficulties in assessing the design and performance of actual reactors, particularly in scale-up from small experimental reactors. This assessment, however, may be helped by statistical approaches, such as that provided by residence-time distributions.
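As a sketch of the residence-time-distribution idea, the mean residence time can be computed from tracer-response data as the first moment of the exit-age distribution E(t). The data below are invented for illustration:

```python
import numpy as np

# Hypothetical tracer-response data: exit-age distribution E(t) vs. time (min)
t = np.array([0, 1, 2, 3, 4, 5, 6, 8, 10], dtype=float)
E = np.array([0.0, 0.02, 0.10, 0.16, 0.16, 0.12, 0.08, 0.03, 0.01])

# Normalise so that the distribution integrates to 1
E = E / np.trapz(E, t)

t_mean = np.trapz(t * E, t)                    # mean residence time (first moment)
var = np.trapz((t - t_mean)**2 * E, t)         # variance of the RTD (second moment)
print(f"mean residence time ~ {t_mean:.2f} min, variance ~ {var:.2f} min^2")
```

Comparing such moments against those of the ideal-flow models (plug flow, perfect mixing) is one way to quantify how far an actual reactor departs from ideality.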

Because of the heterogeneity of soils in general and fields in particular, statistics alone cannot provide the best guidance regarding sampling for all situations. Instead some random samples can be taken and the results used in combination with statistical approaches to guide the selection of additional sampling sites. [Pg.153]

The assessment and quantification of the remaining reserves and resources of fossil fuels is a very complex and broad field, characterised by a lack of internationally harmonised definitions and standards, great data uncertainties and discrepancies and, consequently, the potential danger of data abuse for political purposes. Within the scope of this publication, only an overview of the range of the currently available estimates of fossil resources is provided and the focus is rather on the general discussion of potential sources of uncertainty, than on a detailed assessment of the different methodological and statistical approaches and discrepancies at country or even field level. [Pg.52]

The gel point is usually determined experimentally as that point in the reaction at which the reacting mixture loses fluidity, as indicated by the failure of bubbles to rise in it. Experimental observations of the gel point in a number of systems have confirmed the general utility of the Carothers and statistical approaches. Thus in the reactions of glycerol (a triol) with equivalent amounts of several diacids, the gel point was observed at an extent of reaction of 0.765 [Kienle and Petke, 1940, 1941]. The predicted values of pc are 0.709 and 0.833 from Eqs. 148 (statistical) and 2-139 (Carothers), respectively. Flory [1941] studied several systems composed of diethylene glycol (f = 2), 1,2,3-propanetricarboxylic acid (f = 3), and either succinic or adipic acid (f = 2) with both stoichiometric and nonstoichiometric amounts of hydroxyl and carboxyl groups. Some of the experimentally observed pc values are shown in Table 2-9 along with the corresponding theoretical values calculated by both the Carothers and statistical equations. [Pg.111]
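The two predicted pc values quoted for the glycerol-diacid system can be reproduced from the Carothers equation, pc = 2/favg, and the Flory statistical equation, pc = 1/[r(1 + ρ(f − 2))]^(1/2), here with r = ρ = 1 for a stoichiometric system in which all hydroxyls sit on the branch unit. The statistical result is 1/√2 ≈ 0.707, close to the quoted 0.709 (the small difference is rounding):

```python
import math

# Stoichiometric glycerol (f = 3) + diacid (f = 2): 2 mol triol per 3 mol diacid
n_triol, n_diol = 2, 3
f_avg = (n_triol * 3 + n_diol * 2) / (n_triol + n_diol)   # average functionality

# Carothers equation: gelation when the number-average degree of
# polymerization diverges
pc_carothers = 2 / f_avg

# Flory statistical equation, with r = 1 (stoichiometric) and rho = 1
# (all hydroxyl groups reside on the trifunctional branch unit)
r, rho, f = 1.0, 1.0, 3
pc_flory = 1 / math.sqrt(r * (1 + rho * (f - 2)))

print(f"Carothers: {pc_carothers:.3f}, statistical: {pc_flory:.3f}, observed: 0.765")
```

Consistent with the text, the observed value (0.765) falls between the statistical prediction (a lower bound, since intramolecular cyclization is neglected) and the Carothers prediction.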

The use of a new statistical approach, the limiting distribution, which is based upon the assumption of lognormality, generally leads to correct conclusions on the basis of few samples relative to the number required by tolerance sets. Limiting distributions may also be indexed to the toxicological and environmental realities so that they can be made as conservative as desired. [Pg.452]

Statistical approach. The linearity results can be subjected to statistical analysis (e.g., use of statistical analysis in an Excel spreadsheet). The p-value of the y-intercept can be used to determine if the intercept is statistically significant. In general, when the p-value is less than 0.05, the intercept is considered statistically significant. [Pg.39]
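A sketch of this intercept test, computing the two-sided p-value of the y-intercept by ordinary least squares; the linearity data below are invented for illustration:

```python
import numpy as np
from scipy import stats

# Hypothetical linearity data: concentration (x) vs. instrument response (y)
x = np.array([10, 20, 40, 60, 80, 100], dtype=float)
y = np.array([0.105, 0.198, 0.402, 0.595, 0.810, 0.998])

n = len(x)
slope, intercept, r, p_slope, se_slope = stats.linregress(x, y)
resid = y - (intercept + slope * x)
s = np.sqrt(np.sum(resid**2) / (n - 2))        # residual standard error

# Standard error, t statistic, and two-sided p-value of the y-intercept
se_int = s * np.sqrt(1/n + x.mean()**2 / np.sum((x - x.mean())**2))
t_stat = intercept / se_int
p_int = 2 * stats.t.sf(abs(t_stat), df=n - 2)

print(f"intercept = {intercept:.4f}, p-value = {p_int:.3f}")
# A p-value >= 0.05 would indicate the intercept is not statistically
# different from zero, i.e. the method passes through the origin
```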

Although the statistical approach to the derivation of thermodynamic functions is fairly general, we shall restrict ourselves to a) crystals with isolated defects that do not interact (which normally means that defect concentrations are sufficiently small) and b) crystals with more complex but still isolated defects (i.e., defect pairs, associates, clusters). We shall also restrict ourselves to systems at some given (P, T), so that the appropriate thermodynamic energy function is the Gibbs energy, G, which is then constructed as... [Pg.28]

One can derive Eqn. (12.12) in a more fundamental way by starting the statistical approach with the (Markovian) master equation, assuming that the jump probabilities obey Boltzmann statistics on the activation saddle points. Salje [E. Salje (1988)] has discussed the following general form of a kinetic equation for solid state processes [Pg.301]

Good et al. (15,30,140) have also considered these problems and extended Exner's approach in a reassessment of the validity of previously reported relationships between the energies and entropies of activation for the fluidity of certain aqueous solutions. These workers show, for the particular systems considered, that the relationship is nonlinear. Good et al. (30,140) emphasize the necessity, in the general statistical analysis, of distinguishing between two alternative objectives, which are... [Pg.268]

The micellization of surfactants has been described as a single kinetic equilibrium (10) or as a phase separation (11). A general statistical mechanical treatment (12) showed the similarities of the two approaches. Multiple kinetic equilibria (13) or the small system thermodynamics by Hill (14) have been frequently applied in the thermodynamics of micellization (15, 16, 17). Even the experimental determination of the factors governing the aggregation conditions of micellization in water is still a matter of considerable interest (18, 19) and dispute (20). [Pg.37]

Summary. Two principal methods for removal of low frequency noise transients are currently available. The model-based separation approach has shown more flexibility and generality, but is computationally rather intensive. It is felt that future work in the area should consider the problem from a realistic physical modelling perspective, which takes into account linear and non-linear characteristics of gramophone and film sound playback systems, in order to detect and correct these artifacts more effectively. Such an approach could involve both experimental work with playback systems and sophisticated non-linear modelling techniques. Statistical approaches related to those outlined in the click removal work (section 4.3.4) may be applicable to this latter task. [Pg.96]

By integrative analysis of omics data, we understand the algorithmic and statistical approaches that pursue the combination of different omics data in one analysis. We can identify two major general strategies (24,54) ... [Pg.446]

Clinical trials employ more than one subject in each treatment group. The reason for this is that there is considerable variation in how individuals respond to the administration of the same drug. It is therefore simply not possible to choose only one subject to receive the new drug and another to receive the placebo. There is no way of knowing how representative the subject's response to the drug is of the typical response of people in general. The need to include more than one person is a driving force behind all of the statistical approaches discussed. [Pg.66]

