Big Chemical Encyclopedia


Formal variability analysis

Keywords: Product variety; Mass customisation; Product configuration; Complexity management; Variability management; Formal variability analysis... [Pg.491]

A formal mathematical analysis of the flow in the concentric cylinder viscometer yields the following relationship between the experimental variables and the viscosity ... [Pg.81]

The ability of any experimental method to produce accurate and reproducible results and provide the sensitivity needed to discern differences between transport mechanisms depends on minimizing variability intrinsic to the method. However, formal error analysis is rarely undertaken, even for commonly used methods. Fawcett and Caton [45] performed an error analysis of the capillary method for determining diffusion coefficients more than 25 years after the method was introduced. The value of the analysis is that it reveals which factors contribute the greatest variability to the dependent variable of interest. In the case of transport studies, the dependent variable of primary interest is diffusant concentration, C(t), where... [Pg.119]

Where the data show so little degradation and so little variability that it is apparent from looking at the data that the requested shelf life will be granted, it is normally unnecessary to go through the formal statistical analysis; providing a justification for the omission should be sufficient. [Pg.14]

Stability data (not only assay but also degradation products and other attributes as appropriate) should be evaluated using generally accepted statistical methods. The time at which the 95% one-sided confidence limit intersects the acceptable specification limit is usually determined. If statistical tests on the slopes of the regression lines and the zero-time intercepts for the individual batches show that batch-to-batch variability is small (e.g., p values for the level of significance of rejection are more than 0.25), data may be combined into one overall estimate. If the data show very little degradation and variability and it is apparent from visual inspection that the proposed expiration dating period will be met, formal statistical analysis may not be necessary. [Pg.203]
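The shelf-life procedure described above can be sketched numerically: fit a regression of assay versus time, form the one-sided 95% lower confidence limit for the mean response, and find where it crosses the specification. The data points, the 95% specification limit, and the t critical value below are illustrative assumptions, not values from the text.

```python
import math

# Assumed stability data: assay (% of label claim) at pull points in months.
months = [0.0, 3.0, 6.0, 9.0, 12.0, 18.0]
assay = [100.2, 99.6, 99.1, 98.7, 98.1, 97.0]
spec_limit = 95.0  # assumed lower acceptance criterion, % of label claim

# Ordinary least-squares fit of assay vs time.
n = len(months)
xbar = sum(months) / n
ybar = sum(assay) / n
sxx = sum((x - xbar) ** 2 for x in months)
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(months, assay))
slope = sxy / sxx
intercept = ybar - slope * xbar

sse = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(months, assay))
s = math.sqrt(sse / (n - 2))  # residual standard error
t95 = 2.132                   # one-sided 95% t critical value for n-2 = 4 df

def lower_cl(t):
    """One-sided 95% lower confidence limit for the mean response at time t."""
    pred = intercept + slope * t
    return pred - t95 * s * math.sqrt(1.0 / n + (t - xbar) ** 2 / sxx)

# Shelf life = first time the lower confidence limit falls below the spec.
shelf_life = next(t / 10 for t in range(0, 600) if lower_cl(t / 10) < spec_limit)
print(f"estimated shelf life: {shelf_life:.1f} months")
```

With several batches, the poolability test mentioned above (p > 0.25 on slopes and intercepts) would decide whether a single combined regression like this one may be used.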

Although the concept of patient variability had been articulated by the middle of the twentieth century, the concept that a difference between two groups could be due to chance was slow to be accepted. The first clinical trial to use a formal statistical analysis reportedly occurred in 1962. The study involved a comparison of antibody production after yellow fever vaccination by two different methods. Several years later (1966) a critique of statistical methods used in medical journal manuscripts suggested a lack of proper study design and data analysis. In this critique, the authors canonized the criterion of P < 0.05 for a difference between two groups to be considered not due to chance. [Pg.307]

In a simple sensitivity analysis, each parameter is varied individually, and the output is a qualitative understanding of which parameters have the most impact on project viability. In a more formal risk analysis, statistical methods are used to examine the effect of variation in all of the parameters simultaneously and hence quantitatively determine the range of variability in the economic criteria. This allows the design engineer to estimate the degree of confidence with which the chosen economic criterion can be said to exceed a given threshold. [Pg.381]
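The contrast drawn above can be illustrated with a toy model. The profit function, the ±10% parameter ranges, and the uniform distributions are all assumptions for this sketch; a real risk analysis would use project-specific models and distributions.

```python
import random

random.seed(0)

def profit(price, volume, opex, capex, life=10.0):
    # Assumed toy economic model: annual profit with capital spread over life.
    return price * volume - opex - capex / life

base = dict(price=50.0, volume=1000.0, opex=20000.0, capex=100000.0)

# Simple sensitivity analysis: vary one parameter at a time by +/-10%.
for name in base:
    for factor in (0.9, 1.1):
        p = dict(base)
        p[name] *= factor
        print(f"{name} x{factor}: profit = {profit(**p):,.0f}")

# Formal risk analysis: vary all parameters simultaneously (Monte Carlo)
# and examine the resulting distribution of the economic criterion.
samples = sorted(
    profit(**{k: v * random.uniform(0.9, 1.1) for k, v in base.items()})
    for _ in range(10000)
)
p5, p95 = samples[500], samples[9500]
prob_positive = sum(s > 0 for s in samples) / len(samples)
print(f"5th-95th percentile profit: {p5:,.0f} .. {p95:,.0f}")
print(f"P(profit > 0) = {prob_positive:.2f}")
```

The percentile range quantifies the variability in the criterion, and the probability of exceeding a threshold expresses the design engineer's degree of confidence.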

Studies that show virtually no degradation or variability will usually not require any formal statistical analysis. Under these circumstances the requested re-test period is normally granted when a complete justification for not needing a statistical analysis is part of the submission. The analysis should include all appropriate properties of the drug substance. [Pg.470]

At the data analysis preview it is common to update the prespecified analysis and issue a formal statistical analysis plan. The plan gives more detail on the analysis to be conducted than can usefully be included in the protocol, e.g. precise descriptions of how variables derived from the source data are to be calculated. A key component is a full specification of the tables, listings and graphical presentations to be produced. The analysis plan is discussed and agreed by the study team. [Pg.61]

Q1A(R2) indicates that data evaluation must be done for submission batches. ICH guideline Q1E provides more details on this topic and is discussed further in Chapter 13. The guidelines also emphasize that no formal statistical analysis is needed if the data show little degradation or little variability. A justification for the omission is needed, showing that the data set remains within method variability and shows no particular trend over time. [Pg.33]

To a large extent, the formal kinetic analysis techniques presented in this chapter relate to discontinuous batch operations. Even if the goal is a continuous operation, the batch process kinetic model serves as a starting point. The most significant element of a kinetic analysis is the time dependence of the macroscopic process variables mentioned in Chap. 2. Bacteria, molds, viruses, and yeasts all have different reproduction mechanisms, and formulating a structured kinetic model more closely related to the actual mechanism is a desirable goal. More structured models are desirable not only to deal with active cells but also to extend kinetic analysis to more complex situations involving inactive cells, mixed populations of cells, multiple substrates, and... [Pg.197]
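As a minimal illustration of the time dependence of macroscopic process variables in a batch operation, the following sketch integrates an unstructured Monod growth model by Euler stepping. The model form and all parameter values (mu_max, Ks, yield Y) are assumed for illustration and are not taken from the chapter.

```python
# Unstructured batch kinetics sketch: biomass X grows on substrate S with
# Monod kinetics, mu = mu_max * S / (Ks + S); substrate is consumed with
# yield Y. All parameter values are illustrative assumptions.
mu_max, Ks, Y = 0.4, 0.5, 0.5   # 1/h, g/L, g biomass per g substrate
X, S = 0.1, 10.0                # initial biomass and substrate, g/L
dt = 0.01                       # Euler time step, h

history = [(0.0, X, S)]
for step in range(1, 3001):     # integrate 30 h
    mu = mu_max * S / (Ks + S)  # specific growth rate
    dX = mu * X * dt
    X += dX
    S = max(S - dX / Y, 0.0)    # substrate cannot go negative
    history.append((step * dt, X, S))

print(f"final biomass {X:.2f} g/L, final substrate {S:.2f} g/L")
```

The run reproduces the macroscopic time course of a batch culture: near-exponential growth while substrate is plentiful, then a plateau once it is exhausted; a structured model would replace the single lumped biomass variable with intracellular state variables.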

A lack of formal variability experts able to improve formal analysis algorithms in a problem-, product-, or enterprise-specific way, ... [Pg.495]

In multivariate least squares analysis, the dependent variable is a function of two or more independent variables. Because matrices are so conveniently handled by computer and because the mathematical formalism is simpler, multivariate analysis will be developed as a topic in matrix algebra rather than conventional algebra. [Pg.80]
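The matrix formalism referred to above can be shown in a few lines: for the model y = Xb + e, the least-squares coefficients solve the normal equations (XᵀX)b = Xᵀy. The data below are fabricated so that the known coefficients are recovered exactly.

```python
import numpy as np

# Two independent variables plus an intercept column; fabricated noise-free
# data generated from known coefficients [2.0, 3.0, -1.5].
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([0.5, 1.0, 0.8, 1.6, 2.0])
y = 2.0 + 3.0 * x1 - 1.5 * x2

X = np.column_stack([np.ones_like(x1), x1, x2])  # design matrix
b = np.linalg.solve(X.T @ X, X.T @ y)            # normal equations
print(b)  # recovers [2.0, 3.0, -1.5]
```

In practice one solves the normal equations (or, better, uses a QR/SVD-based solver such as `np.linalg.lstsq`) rather than explicitly inverting XᵀX.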

Classification is both a basic concept and a collection of techniques which are necessary prerequisites for further analysis of data when the members of a set of data are (or can be) each described by several variables. At least some degree of classification (which is broadly defined as the dividing of the members of a group into smaller groups in accordance with a set of decision rules) is necessary prior to any data collection. Whether formally or informally, an investigator has to decide... [Pg.941]

An essential concept in multivariate data analysis is the mathematical combination of several variables into a new variable that has a certain desired property (Figure 2.14). In chemometrics such a new variable is often called a latent variable; other names are component or factor. A latent variable can be defined as a formal combination (a mathematical function or a more general algorithm) of the variables; a latent variable summarizes the variables in an appropriate way to obtain a certain property. The value of a latent variable is called a score. Most often linear latent variables are used, given by... [Pg.64]
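A linear latent variable of the kind just defined can be sketched as follows: the score vector t is a weighted sum t = Xw of the mean-centered variables, with the weights chosen here by PCA (the first right singular vector). The data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))                         # 20 objects, 3 variables
X[:, 1] = 0.9 * X[:, 0] + 0.1 * rng.normal(size=20)  # make variables correlated

Xc = X - X.mean(axis=0)           # mean-center the variables
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
w = Vt[0]                         # loadings (weights) of the first latent variable
t = Xc @ w                        # scores: one value per object
print(t[:3])
```

The desired property here is maximum variance: no other unit-length weight vector produces scores with larger variance, which is why this single latent variable summarizes the correlated variables well.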

A simple modification of the IAM model, referred to as the κ-formalism, makes it possible to allow for charge transfer between atoms. By separating the scattering of the valence electrons from that of the inner shells, it becomes possible to adjust the population and radial dependence of the valence shell. In practice, two charge-density variables, Pv, the valence shell population parameter, and κ, a parameter which allows expansion and contraction of the valence shell, are added to the conventional parameters of structure analysis (Coppens et al. 1979). For consistency, Pv and κ must be introduced simultaneously, as a change in the number of electrons affects the electron-electron repulsions, and therefore the radial dependence of the electron distribution (Coulson 1961). [Pg.55]

As already mentioned, any multivariate analysis should include some validation, that is, formal testing, to extrapolate the model to new but similar data. This requires two separate steps in the computation of each model component: calibration, which consists of finding the new components, and validation, which checks how well the computed components describe the new data. Each of these two steps needs its own set of samples: calibration samples or training samples, and validation samples or test samples. Computation of spectroscopic data PCs is based solely on the optical data. There is no explicit or formal relationship between PCs and the composition of the samples in the sets from which the spectra were measured. In addition, PCs are considered superior to the original spectral data produced directly by the NIR instrument. Since the first few PCs are stripped of noise, they represent the real variation of the spectra, presumably caused by physical or chemical phenomena. For these reasons PCs are considered as latent variables, as opposed to the direct variables actually measured. [Pg.396]
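The calibration/validation split described above can be sketched with simulated "spectra": PCA loadings are computed from the calibration samples only, the validation samples are then projected onto them, and the reconstruction error indicates how well the components describe new data. All data, dimensions, and the noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated spectra: two underlying factors plus noise, 40 samples x 50 channels.
scores_true = rng.normal(size=(40, 2))
loadings_true = rng.normal(size=(2, 50))
spectra = scores_true @ loadings_true + 0.05 * rng.normal(size=(40, 50))

cal, val = spectra[:30], spectra[30:]  # calibration (training) / validation (test)
mean = cal.mean(axis=0)
U, S, Vt = np.linalg.svd(cal - mean, full_matrices=False)
P = Vt[:2]                             # first two PC loadings, from calibration only

def reconstruction_rmse(Xnew):
    """Project new samples onto the calibration PCs and measure the residual."""
    T = (Xnew - mean) @ P.T            # scores for the new samples
    resid = (Xnew - mean) - T @ P
    return float(np.sqrt((resid ** 2).mean()))

print("calibration RMSE:", reconstruction_rmse(cal))
print("validation  RMSE:", reconstruction_rmse(val))
```

If the validation error stays close to the noise level, the first few PCs indeed capture the real (chemical or physical) variation of the spectra rather than artifacts of the calibration set.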

Physicists are familiar with many special functions that arise over and over again in solutions to various problems. The analysis of problems with spherical symmetry in three dimensions often appeals to the spherical harmonic functions, often called simply spherical harmonics. Spherical harmonics are the restrictions of homogeneous harmonic polynomials of three variables to the sphere S². In this section we will give a typical physics-style introduction to spherical harmonics. Here we state, but do not prove, their relationship to homogeneous harmonic polynomials; a formal statement and proof are given in Proposition A.2 of Appendix A. [Pg.27]

It appears that the formal theories are not sufficiently sensitive to structure to be of much help in dealing with linear viscoelastic response. Williams' analysis is the most complete theory available, and yet even here a dimensional analysis is required to find a form for the pair correlation function. Moreover, the molecular weight dependence in the resulting viscosity expression [Eq. (6.11)] is much too weak to represent behavior even at moderate concentrations. Williams suggests that the combination of variables in Eq. (6.11) may furnish theoretical support for correlations of the form η₀ = f(c[η]) at moderate concentrations (cf. Section 5). However, the weakness of the predicted dependence compared to experiment and the somewhat arbitrary nature of the dimensional analysis make the suggestion rather questionable. [Pg.76]

Spinel formation is usually treated under some tacit assumptions which do not always hold. For example, it is tacitly assumed that the oxygen potential of the surrounding gas atmosphere prevails throughout the reaction product during reaction. In other words, it is assumed that dμO = 0. Although this inference reduces the number of variables by one and simplifies the formal treatment, the subsequent analysis will show that the assumption is normally not adequate. [Pg.147]

