Big Chemical Encyclopedia


Statistics nominal

Flavor Intensity. In most sensory tests, a person is asked to associate a name or a number with his perceptions of a substance he sniffed or tasted. The set from which these names or numbers are chosen is called a scale. The four general types of scales are nominal, ordinal, interval, and ratio (17). Each has different properties and allowable statistics (4,14). The measurement of flavor intensity, unlike the evaluation of quality, requires an ordered scale, the simplest of which is an ordinal scale. [Pg.2]

A process is in control when the average spread of variation coincides with the nominal specification for a parameter. The range of variation may extend outside the upper and lower limits but the proportion of parts within the limits can be predicted. This situation will remain as long as the process remains in statistical control. A process is in statistical... [Pg.367]
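As a rough illustration of the prediction described above (not from the cited work), the sketch below assumes the parameter is approximately normally distributed and computes the fraction of parts expected inside the specification limits; all numbers are invented.

```python
# Sketch: predicting the fraction of parts inside specification limits for a
# process in statistical control, assuming an approximately normal parameter.
from scipy.stats import norm

nominal = 10.00          # nominal specification (e.g., mm), assumed
lsl, usl = 9.90, 10.10   # lower / upper specification limits, assumed
mu, sigma = 10.00, 0.06  # process centred on nominal, with its own spread

# Even though the +/-3*sigma spread (9.82 to 10.18) extends beyond the limits,
# the proportion of parts falling inside them can still be predicted:
p_within = norm.cdf(usl, mu, sigma) - norm.cdf(lsl, mu, sigma)
print(f"Predicted fraction within limits: {p_within:.3f}")   # ~0.904
```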

The same exponent was obtained in both works, although in one case a vaned-disk impeller was used, and in the other, arrowhead-shaped blades. Although the nominal residence time can be determined from a knowledge of gas holdup and volumetric gas flow rate, the nominal residence time (Eq. 36) will equal the statistical average residence time (Eq. 34) for certain cases only (L2). [Pg.314]
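A minimal sketch of the nominal residence time calculation mentioned above, with invented values for the dispersion volume, gas holdup, and gas flow rate; as the excerpt notes, whether this equals the statistical average residence time depends on the flow pattern.

```python
# Sketch: nominal gas residence time from gas holdup and volumetric flow rate.
V_dispersion = 0.50   # total dispersion volume, m^3 (assumed)
eps_gas = 0.12        # fractional gas holdup (assumed)
Q_gas = 0.002         # volumetric gas flow rate, m^3/s (assumed)

V_gas = eps_gas * V_dispersion      # volume occupied by gas, m^3
tau_nominal = V_gas / Q_gas         # nominal residence time, s
print(f"Nominal gas residence time: {tau_nominal:.0f} s")   # 30 s
```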

Nonstatistical decision criteria are the norm because specification limits are frequently prescribed (i.e., 95 to 105% of nominal) and the quality of previous deliveries or a competitor's warranty raises expectations beyond what statistical common sense might suggest. [Pg.10]

The statistical interpretations are: there is a 5% chance that the extrapolation is below 90% at t = 26, and there is a 5% chance that a further measurement at t = 26 months will yield a result below 89% of nominal. Every batch in the stability program is subjected to this procedure; the batch that yields the shortest shelf-life sets the expiration date. Possible solutions are as follows ... [Pg.247]
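A hedged sketch of the shelf-life logic just described, with invented stability data: a straight line is fitted to assay versus time, the one-sided 95% lower confidence limit on the fitted mean is extrapolated, and the shelf life is taken as the time at which that limit crosses 90% of nominal.

```python
import numpy as np
from scipy import stats

t = np.array([0, 3, 6, 9, 12, 18])                    # months (assumed data)
y = np.array([100.2, 99.5, 99.1, 98.0, 97.6, 95.9])   # assay, % of nominal (assumed)

n = len(t)
slope, intercept, r, p, se = stats.linregress(t, y)
resid = y - (intercept + slope * t)
s = np.sqrt(np.sum(resid**2) / (n - 2))   # residual standard deviation
sxx = np.sum((t - t.mean())**2)
t95 = stats.t.ppf(0.95, n - 2)            # one-sided 95% quantile

def lower_cl(tt):
    """Lower 95% confidence limit of the fitted mean assay at time tt."""
    yhat = intercept + slope * tt
    half = t95 * s * np.sqrt(1.0/n + (tt - t.mean())**2 / sxx)
    return yhat - half

tt = np.linspace(0, 60, 601)
below = tt[lower_cl(tt) < 90.0]
shelf_life = below[0] if below.size else 60.0
print(f"Estimated shelf life: {shelf_life:.1f} months")
```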

Here you can still use the Pearson chi-square test as shown in the 2x2 table example as long as your response variable is nominal and merely descriptive. If your response variable is ordinal, meaning that it is an ordered sequence, and you can use a parametric test, then you should use the Mantel-Haenszel test statistic for parametric tests of association. For instance, if in our previous example the variable called headache was coded as a 2 when the patient experienced extreme headache, a 1 if mild headache, and a 0 if no headache, then headache would be an ordinal variable. You can get the Mantel-Haenszel p-value by running the following SAS code ... [Pg.252]
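The SAS code itself is not reproduced in the excerpt; as a rough Python equivalent (an assumption, not the book's code), the sketch below contrasts the Pearson chi-square test for a nominal response with the Mantel-Haenszel (linear-by-linear) statistic, (N - 1)r^2, which uses the ordinal scores. All counts are invented.

```python
import numpy as np
from scipy.stats import chi2, chi2_contingency

# Rows: active drug / placebo; columns: headache coded 0 = none, 1 = mild, 2 = extreme
table = np.array([[40, 30, 10],
                  [25, 35, 20]])
row_scores = np.array([0, 1])
col_scores = np.array([0, 1, 2])

# Pearson chi-square (ignores the ordering of the headache categories)
chi2_stat, p_pearson, dof, _ = chi2_contingency(table, correction=False)

# Mantel-Haenszel / linear-by-linear statistic: (N - 1) * r**2, where r is the
# Pearson correlation between row and column scores over all N subjects
N = table.sum()
rows, cols = np.indices(table.shape)
x = row_scores[rows.ravel()]
y = col_scores[cols.ravel()]
w = table.ravel()
xm = np.average(x, weights=w)
ym = np.average(y, weights=w)
r = (np.sum(w*(x-xm)*(y-ym)) /
     np.sqrt(np.sum(w*(x-xm)**2) * np.sum(w*(y-ym)**2)))
mh_stat = (N - 1) * r**2
p_mh = chi2.sf(mh_stat, df=1)

print(f"Pearson chi-square p = {p_pearson:.4f}")
print(f"Mantel-Haenszel (trend) p = {p_mh:.4f}")
```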

Instead of nodal lines in closed systems we are interested in the statistics of NPs for open chaotic billiards, since they form vortex centers and thereby shape the entire flow pattern (K.-F. Berggren et al., 1999). Thus we will focus on nodal points and their spatial distributions and try to characterize chaos in terms of such distributions. The question we wish to ask is simply whether one can find a distinct difference between the distributions for nominally regular and irregular billiards. The answer to this question is clearly positive, as is seen from fig. 3. As shown qualitatively, NPs and saddles are both spaced less regularly in the chaotic billiard than in the integrable billiard. The mean density of NPs for a complex RGF (9) equals k²/4π (M.V. Berry et al., 1986). This formula is satisfied with good accuracy in both chaotic and integrable billiards. [Pg.74]
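A sketch (not from the cited works) of how the k²/4π prediction can be checked numerically: a complex random Gaussian field is built from many plane waves of fixed wavenumber with random directions and phases, nodal points are located from the phase winding around each grid plaquette, and their density is compared with theory. All parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
k, L, n_waves, n_grid = 10.0, 6.0, 200, 400   # wavenumber, box size, waves, grid

theta = rng.uniform(0, 2*np.pi, n_waves)      # propagation directions
phi = rng.uniform(0, 2*np.pi, n_waves)        # random phases
amp = rng.normal(size=n_waves) + 1j*rng.normal(size=n_waves)

x = np.linspace(0, L, n_grid)
X, Y = np.meshgrid(x, x, indexing="ij")
psi = np.zeros_like(X, dtype=complex)
for a, th, ph in zip(amp, theta, phi):
    psi += a * np.exp(1j*(k*(X*np.cos(th) + Y*np.sin(th)) + ph))

# Count vortices: a nodal point sits inside a plaquette whose phase winds by +/-2*pi
phase = np.angle(psi)
def dphi(a, b):                               # phase difference wrapped to (-pi, pi]
    return np.angle(np.exp(1j*(b - a)))
winding = (dphi(phase[:-1, :-1], phase[1:, :-1]) +
           dphi(phase[1:, :-1],  phase[1:, 1:]) +
           dphi(phase[1:, 1:],   phase[:-1, 1:]) +
           dphi(phase[:-1, 1:],  phase[:-1, :-1]))
n_vortices = np.sum(np.abs(winding) > np.pi)

density = n_vortices / L**2
print(f"measured density  = {density:.2f} per unit area")
print(f"k^2/(4*pi) theory = {k**2/(4*np.pi):.2f} per unit area")
```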

There is a hierarchy of usefulness of data, according to how well it can be statistically manipulated. The accepted order is continuous data > ordinal data > nominal data. [Pg.201]

The desorption flux is so low under these conditions that no gas phase collisions occurred between molecular desorption and LIF probing. Phase space treatments of final-state distributions for dissociation processes where exit channel barriers do not complicate the ensuing dynamics often result in nominally thermal distributions. In the phase space treatment a loose transition state is assumed (e.g., one resembling the products) and the conserved quantities are total energy and angular momentum; the probability of forming a particular final state (E, J) is obtained by analyzing the number of ways to statistically distribute the available (E, J). [Pg.53]

Table I indicates good agreement between the molecular weight distribution statistics obtained by the coupled GPC/viscometer method and the nominal values for NBS 706. The discrepancy between the Mark-Houwink parameters obtained here and the reported values for the polystyrene standard ( ) in THF at 25°C (i.e., a = 0.706 and k = 1.60 x 10 ) may in part be due to the uncertainty involved in the determination of the dead volume between the DRI and viscometer detectors. Our simulation studies over a range of dead volume values (0 to 120 μL) showed that a and k are quite sensitive to the dead volume between the detectors. A larger dead volume results in smaller a and larger k values. This is a direct result of a clockwise rotation of the log [η] vs. log M(v) curve (Figure 12) which occurs when the dead volume correction is applied in quantitative analysis. The effect on the molecular weight statistics, however, appeared to...
Molecular connectivity indices are desirable as potential explanatory variables because they can be calculated for a nominal cost (fractions of a second by computer) and they describe fundamental relationships about chemical structure. That is, they describe how the non-hydrogen atoms of a molecule are "connected". Here we are most concerned with the statistical properties of molecular connectivity indices for a large set of chemicals in TSCA and the presentation of the results of multivariate analyses using these indices as explanatory variables to understand several properties important to environmental chemists. We will focus on two properties for which we have a relatively large data base: (1) biodegradation as measured by the percentage of theoretical 5-day biochemical oxygen demand (BOD) (11), and (2) the n-octanol/water partition coefficient, hereafter termed log P (12). [Pg.149]
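As an illustration of how such an index is computed (not the authors' code), the sketch below evaluates the first-order connectivity index from a hydrogen-suppressed graph; the example molecule, 2-methylbutane, is chosen arbitrarily.

```python
# First-order molecular connectivity index: each bond contributes
# (delta_i * delta_j)**-0.5, where delta is the number of non-hydrogen
# neighbours of an atom in the hydrogen-suppressed graph.
from math import sqrt

# Hydrogen-suppressed graph of 2-methylbutane: C1-C2(-C5)-C3-C4
bonds = [(1, 2), (2, 3), (3, 4), (2, 5)]

# delta = degree of each vertex in the hydrogen-suppressed graph
delta = {}
for i, j in bonds:
    delta[i] = delta.get(i, 0) + 1
    delta[j] = delta.get(j, 0) + 1

chi1 = sum(1.0 / sqrt(delta[i] * delta[j]) for i, j in bonds)
print(f"First-order connectivity index 1-chi = {chi1:.3f}")   # ~2.270 for 2-methylbutane
```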

In the traditional interpretation of the Langevin equation for a constrained system, the overall drift velocity is insensitive to the presence or absence of hard components of the random forces, since these components are instantaneously canceled in the underlying ODE by constraint forces. This insensitivity to the presence of hard forces is obtained, however, only if both the projected divergence of the mobility and the force bias are retained in the expression for the drift velocity. The drift velocity for a kinetic interpretation of a constrained Langevin equation does not contain a force bias, and does depend on statistical properties of the hard random force components. Both Fixman and Hinch nominally considered the traditional interpretation of the Langevin equation for the Cartesian bead coordinates as a limit of an ordinary differential equation. Both authors, however, neglected the possible existence of a bias in the Cartesian random forces. As a result, both obtained a drift velocity that (after correcting the error in Fixman's expression for the pseudoforce) is actually the appropriate expression for a kinetic interpretation. [Pg.151]

Polyvinyl Chloride. The results obtained for a polyvinyl chloride sample are listed in Table 5. It is seen that the measured molecular weight statistics are in reasonable agreement with the nominal values. The Mark-Houwink parameters K and a obtained from the linear plot of log [η] vs. log M are in good agreement with one group of literature values (41-43), while the a value is lower than that of another group (3,44-46). [Pg.145]
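A minimal sketch (with invented data) of how K and a follow from the linear plot mentioned above: since [η] = K·M^a, a straight-line fit of log[η] against log M gives a as the slope and log K as the intercept.

```python
import numpy as np

M = np.array([2e4, 5e4, 1e5, 3e5, 6e5])          # molecular weights (assumed)
eta = np.array([0.17, 0.33, 0.55, 1.20, 1.95])   # intrinsic viscosities, dL/g (assumed)

# log[eta] = log K + a * log M, fitted by least squares
a, logK = np.polyfit(np.log10(M), np.log10(eta), 1)
K = 10**logK
print(f"a = {a:.3f}, K = {K:.2e} dL/g")
```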

The univariate response data on all standard biomarker data were analysed, including analysis of variance for an unbalanced design, using Genstat v7.1 statistical software (VSN, 2003). In addition, a priori pairwise t-tests were performed with the mean reference value, using the pooled variance estimate from the ANOVA. The real value data were not transformed. The average values for the KMBA and WOP biomarkers were not based on different flounder captured at the sites, but on replicate measurements of pooled liver tissue. The nominal response data of the immunohistochemical biomarkers (classification of effects) were analysed by means of a Monte... [Pg.14]
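A sketch of the a priori pairwise comparison described above, with invented biomarker values: the within-group mean square from a one-way ANOVA supplies the pooled variance, which is then used in t-tests of each site mean against the reference mean.

```python
import numpy as np
from scipy import stats

groups = {
    "reference": np.array([1.02, 0.95, 1.10, 0.98, 1.05]),
    "site_A":    np.array([1.25, 1.31, 1.18, 1.40, 1.22]),
    "site_B":    np.array([1.04, 0.99, 1.12, 1.08, 0.96]),
}

data = list(groups.values())
k = len(data)
N = sum(len(g) for g in data)

# Pooled variance estimate = within-group mean square from the ANOVA
ss_within = sum(((g - g.mean())**2).sum() for g in data)
df_within = N - k
ms_within = ss_within / df_within

ref = groups["reference"]
for name in ("site_A", "site_B"):
    g = groups[name]
    t_stat = (g.mean() - ref.mean()) / np.sqrt(ms_within*(1/len(g) + 1/len(ref)))
    p = 2 * stats.t.sf(abs(t_stat), df_within)
    print(f"{name}: t = {t_stat:.2f}, p = {p:.4f}")
```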

For many practical purposes, as well as for some theoretical purposes involving statistical thermodynamics, it is expedient to deal with the volume concentration, denoted by c, or number density, denoted by ρ, that is, the number of moles, or molecules, of the solute per unit volume of the solution. In dilute solutions the density is usually linear with the concentration, tending to the limiting value of that of the solvent at infinite dilution. There are different numbers of moles of solvent per unit volume in different solvents and hence, at a given molarity c or number density ρ, also per mole of solute. For typical solvents, these range from 55.5 mol for water down to 3.3 mol for hexadecane in 1 dm³. This constitutes nominally a solvent effect that ought not to be neglected. [Pg.77]
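A quick check of the figures quoted above, using approximate room-temperature densities (assumed values): the number of moles of solvent in 1 dm³ of pure solvent is its density divided by its molar mass.

```python
solvents = {
    # name: (density g/cm^3, molar mass g/mol); densities are approximate
    "water":      (0.997, 18.02),
    "hexadecane": (0.773, 226.44),
}
for name, (rho, M) in solvents.items():
    mol_per_dm3 = rho * 1000.0 / M   # g per dm^3 divided by g per mol
    print(f"{name}: {mol_per_dm3:.1f} mol per dm^3")
# water ~55.3, hexadecane ~3.4 -- close to the 55.5 and 3.3 quoted in the text
```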

Quantitative methodology uses large or relatively large samples of subjects (as a rule students) and tests or questionnaires to which the subjects respond. Results are treated by statistical analysis by means of a variety of parametric methods (when we have continuous data on the interval or the ratio scale) or nonparametric methods (when we have categorical data on the nominal or the ordinal scale) (30). Data are usually treated by standard commercial statistical packages. Tests and questionnaires have to satisfy the criteria for content and construct validity (this is analogous to a lack of systematic errors in measurement), and for reliability (this controls for random errors) (31). [Pg.79]
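A brief sketch (invented scores, using scipy rather than a commercial package) of the parametric/nonparametric distinction drawn above: a t-test for interval-scale data, a Mann-Whitney U test for ordinal data, and a chi-square test for nominal counts.

```python
import numpy as np
from scipy import stats

# Interval-scale data (e.g., test scores 0-100): parametric t-test
exp = np.array([72, 80, 65, 90, 77, 84, 69, 75])
ctrl = np.array([61, 70, 58, 73, 66, 79, 60, 68])
print("t-test:        p =", stats.ttest_ind(exp, ctrl).pvalue)

# Ordinal data (e.g., Likert responses 1-5): nonparametric Mann-Whitney U
likert_exp = np.array([4, 5, 3, 4, 5, 4, 3, 5])
likert_ctrl = np.array([3, 2, 4, 3, 3, 2, 4, 3])
print("Mann-Whitney:  p =", stats.mannwhitneyu(likert_exp, likert_ctrl).pvalue)

# Nominal data (e.g., pass/fail counts): chi-square test of independence
counts = np.array([[30, 10],    # experimental: pass, fail
                   [20, 20]])   # control:      pass, fail
print("chi-square:    p =", stats.chi2_contingency(counts)[1])
```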

Even with powerful computer programs at hand, the solution of estimation problems is usually far from simple. A convenient way to eliminate computational errors and to study the effects of statistical assumptions is to solve first a problem with known true parameter values, involving data generated at some nominal parameter vector. Initially it is advisable to investigate with error-free data, then to add errors of the assumed structure. The simulation usually requires normally distributed random variables. Random numbers R that are approximately drawn from a normal distribution with zero mean and unit variance can be obtained by... [Pg.144]
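The formula itself is truncated in the excerpt; one common textbook recipe, which may or may not be the one intended, sums twelve uniform random numbers and subtracts six, giving an approximately standard normal deviate by the central limit theorem (mean 12·0.5 − 6 = 0, variance 12·(1/12) = 1).

```python
import random

def approx_std_normal():
    """Approximately N(0, 1) random number from 12 uniform deviates."""
    return sum(random.random() for _ in range(12)) - 6.0

# Error-free simulated data can then be perturbed with errors of assumed structure:
nominal_y = [2.0, 3.5, 5.1]   # data generated at the nominal parameter vector (assumed)
sigma = 0.05                  # assumed error standard deviation
noisy_y = [y + sigma * approx_std_normal() for y in nominal_y]
print(noisy_y)
```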

Fig. 3.6. Illustration of use of the benchmark dose method to estimate nominal thresholds for deterministic effects in humans. The benchmark dose (ED10) and LED10 are the central estimate and lower confidence limit of the dose corresponding to a 10 percent increase in response, respectively, obtained from a statistical fit of a dose-response model to dose-response data. The nominal threshold in humans could be set at a factor of 10 or 100 below LED10, depending on whether the data are obtained in humans or animals (see text for description of projected linear dose below point of departure).
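A hedged sketch (invented dose-response data, not those behind the figure) of the benchmark dose calculation: a one-hit extra-risk model is fitted by maximum likelihood, ED10 is the dose giving 10% extra risk, LED10 is its lower 95% confidence limit from a parametric bootstrap, and the nominal threshold is LED10 divided by 10 or 100.

```python
# One-hit extra-risk model: P(d) = p0 + (1 - p0)*(1 - exp(-b*d)); the extra
# risk is 1 - exp(-b*d), so ED10 = -ln(0.9)/b.
import numpy as np
from scipy.optimize import minimize

dose = np.array([0.0, 10.0, 30.0, 100.0])   # assumed doses
n = np.array([50, 50, 50, 50])              # animals per group (assumed)
affected = np.array([2, 6, 14, 35])         # responders (assumed)

def prob(d, p0, b):
    return p0 + (1.0 - p0) * (1.0 - np.exp(-b * d))

def neg_loglik(params, affected):
    p0, b = params
    p = np.clip(prob(dose, p0, b), 1e-9, 1 - 1e-9)
    return -np.sum(affected * np.log(p) + (n - affected) * np.log(1 - p))

def fit(affected):
    res = minimize(neg_loglik, x0=[0.05, 0.01], args=(affected,),
                   bounds=[(1e-6, 0.999), (1e-8, 10.0)])
    return res.x

p0_hat, b_hat = fit(affected)
ed10 = -np.log(0.9) / b_hat                 # dose at 10% extra risk

# Parametric bootstrap for the lower confidence limit on ED10
rng = np.random.default_rng(1)
boot = []
for _ in range(500):
    sim = rng.binomial(n, prob(dose, p0_hat, b_hat))
    _, b_sim = fit(sim)
    boot.append(-np.log(0.9) / b_sim)
led10 = np.percentile(boot, 5)              # lower 95% (one-sided) limit

print(f"ED10 = {ed10:.1f}, LED10 = {led10:.1f} (same dose units)")
print(f"Nominal threshold (animal data, factor 100): {led10/100:.3f}")
```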
