Big Chemical Encyclopedia


CONCEPTS Frequency Distributions

Figure 7.4 Typical frequency distribution of a population response to an equivalent dose of a biologically active agent. This type of response represents the variability that occurs within biological systems and is the basis for the concept of dose response in pharmacology and toxicology. This figure demonstrates that within any population, both hyporeactive and hyperreactive individuals can be expected to exist and must be addressed in a risk assessment.
Statistical concepts employed in setting specifications and their relationship to product quality control include accidental and systematic errors, frequency distributions, measures of dispersion, standard deviations, standard errors, and sampling plans. In summary, specifications must be set by taking into account... [Pg.412]
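As a minimal sketch of two of the dispersion measures listed above, the following computes the sample standard deviation and the standard error of the mean for a small set of hypothetical assay results; the data values and variable names are illustrative assumptions, not figures from the source.

```python
# Sample standard deviation and standard error of the mean for hypothetical
# assay data (illustrative values only).
import statistics

assays = [99.2, 100.4, 98.7, 101.1, 99.8, 100.3]   # hypothetical % of label claim

mean = statistics.mean(assays)
s = statistics.stdev(assays)            # sample standard deviation (n - 1 divisor)
sem = s / len(assays) ** 0.5            # standard error of the mean, s / sqrt(n)

print(f"mean = {mean:.2f}, s = {s:.2f}, SEM = {sem:.2f}")
```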

A graphical representation of the discrete distribution f(n) is shown in Figure 1.3a. Figure 1.3b shows the analogous MWD represented as the (number) distribution fₙ of the discrete variable (notice the equivalence of the concept of distribution with those of fraction or frequency). [Pg.6]
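The equivalence of distribution, fraction, and frequency noted in this excerpt can be made concrete with a short sketch: a discrete number distribution is simply the fraction of items taking each value of the discrete variable. The data below are hypothetical chain lengths, not values from Figure 1.3.

```python
# Build a discrete (number) distribution as the fraction of items at each value
# of the discrete variable; hypothetical data for illustration.
from collections import Counter

chain_lengths = [3, 5, 5, 7, 5, 3, 9, 7, 5, 3]       # hypothetical discrete data
counts = Counter(chain_lengths)
total = sum(counts.values())

number_distribution = {n: c / total for n, c in sorted(counts.items())}
print(number_distribution)        # the fractions (frequencies) sum to 1
```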

At this point, it is useful to introduce the concept of a random sample. Let f(x) be the distribution function for some continuous random variable x and take multiple samples from this distribution, each of size n. A property, such as composition, is measured on each of the n specimens from each of the multiple samples. The sampling is said to be random if the frequency distributions of the measured property from each set of n specimens are equal and, in fact, equal f(x). The normal distribution theorem states that if x is normally distributed with a mean μₓ and variance σₓ², and a random sample of size n is taken, then the average x̄ is also normally distributed, having a mean μₓ and variance σₓ²/n. [Pg.217]
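A quick numerical check of the normal distribution theorem quoted above can be run by simulation; the parameter values below are arbitrary assumptions chosen only to make the check visible.

```python
# Simulate many random samples of size n from N(mu, sigma^2) and verify that the
# sample means have mean ~mu and variance ~sigma^2 / n.
import random
import statistics

mu, sigma, n, n_samples = 10.0, 2.0, 25, 20_000

means = [statistics.mean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(n_samples)]

print(f"mean of sample means     : {statistics.mean(means):.3f} (expect {mu})")
print(f"variance of sample means : {statistics.variance(means):.4f} "
      f"(expect {sigma**2 / n:.4f})")
```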

How are the properties of the population used? Perhaps one of the most familiar concepts in statistics is the frequency distribution. A plot of a frequency distribution is shown in Fig. 3.1, where the ordinate (y-axis) represents the number of occurrences of a particular value of a variable given by the scales of the abscissa (x-axis). If the data are discrete, usually but not necessarily measured on nominal or ordinal scales, then the... [Pg.50]
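A minimal sketch of such a frequency-distribution plot is shown below, assuming matplotlib is available; the observations are hypothetical and merely stand in for the data of Fig. 3.1.

```python
# Frequency distribution of a discrete variable: number of occurrences (y-axis)
# versus the value of the variable (x-axis). Hypothetical data.
from collections import Counter
import matplotlib.pyplot as plt

observations = [2, 3, 3, 4, 4, 4, 5, 5, 5, 5, 6, 6, 6, 7, 7, 8]
freq = Counter(observations)

plt.bar(list(freq.keys()), list(freq.values()))
plt.xlabel("value of variable")
plt.ylabel("number of occurrences")
plt.show()
```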

The obvious method of characterization of the separation capability of air classifiers is by using the cut size concept. Ideally, all particles below the cut size would end up in the fines stream, while all particles above the cut size would follow the coarse stream. However, there will always be misplaced material; that is, a small fraction of particles smaller than the cut size will be in the coarse stream and an equally small proportion of particles larger than the cut size will appear in the fines stream. The extent of the overlap due to misplaced material, as well as the cut size, can be determined by measuring the particle size distributions of both streams and presenting the data as weight frequency distributions. The yields of the fines (Yf) and coarse (Yc) streams need to be identified. When they are equal, the point of overlap gives the cut size. When they are not equal, which is most likely, the frequency distribution for the fines stream must be multiplied by the yield of the fines stream, while the frequency distribution for the coarse stream must be multiplied by the yield of the coarse stream. The cut size is thus given by the point of intersection of these curves. [Pg.342]
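The cut-size construction described above can be sketched numerically as follows. The log-normal stream distributions, the yields, and all parameter values are assumptions introduced for illustration; only the procedure itself (scale each weight frequency distribution by its own yield and locate the intersection) comes from the text.

```python
# Locate the cut size as the intersection of the yield-weighted weight frequency
# distributions of the fines and coarse streams (hypothetical stream data).
import numpy as np

size = np.linspace(1.0, 200.0, 400)          # particle size, um (assumed range)

def lognormal_pdf(x, median, gsd):
    """Log-normal weight frequency distribution (illustrative shape only)."""
    s = np.log(gsd)
    return np.exp(-0.5 * ((np.log(x) - np.log(median)) / s) ** 2) / (x * s * np.sqrt(2 * np.pi))

f_fines = lognormal_pdf(size, median=20.0, gsd=1.8)    # fines stream
f_coarse = lognormal_pdf(size, median=80.0, gsd=1.8)   # coarse stream
Y_fines, Y_coarse = 0.35, 0.65                          # stream yields (assumed)

scaled_fines = Y_fines * f_fines
scaled_coarse = Y_coarse * f_coarse

# The cut size is the size at which the two yield-weighted curves cross.
crossing = np.argwhere(np.diff(np.sign(scaled_fines - scaled_coarse))).flatten()
print("approximate cut size (um):", round(float(size[crossing[0]]), 1))
```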

If these concepts are accepted, then it is only logical that they should be applied consistently in the interpretation of observed intake. If Fig. 1 is replotted as a cumulative frequency distribution, the new distribution takes the form of a probability or risk distribution— the probability that a particular level of intake is adequate or inadequate for a randomly selected individual of the class. This is portrayed in Fig. 2A. Going a step further, it should be apparent that requirement estimates do relate to individuals but that the only statement that can be made with regard to a particular individual is a probability statement. The assessment of observed intake, then, should be a probability assessment. [Pg.108]
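The replotting step described here is, numerically, just a cumulative sum of the frequency distribution followed by normalization. The sketch below uses hypothetical requirement classes and counts; only the construction itself is taken from the text.

```python
# Convert a frequency distribution of individual requirements into a cumulative
# (probability/risk) curve: P(requirement <= intake) for a random individual.
import numpy as np

requirement = np.array([40, 45, 50, 55, 60, 65, 70])   # intake units (hypothetical)
frequency = np.array([2, 8, 20, 30, 20, 8, 2])          # individuals per class

cumulative = np.cumsum(frequency) / frequency.sum()
for r, p in zip(requirement, cumulative):
    # p   = probability that an intake of r is adequate for a random individual
    # 1-p = probability (risk) that it is inadequate
    print(f"intake {r}: P(adequate) = {p:.2f}, risk of inadequacy = {1 - p:.2f}")
```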

A series of studies have evaluated the possible consequences of transient impairment of thyroid function during the neonatal period due to iodine deficiency. Studies conducted in areas with severe iodine deficiency have shown that there is a dramatic shift of the intellectual quotients (IQ) towards low values in schoolchildren born to severely iodine deficient mothers. After the pioneering studies conducted in this field by the group of FIERRO-BENITEZ in Ecuador, this concept has been confirmed by many others (Review in ...). The same trend has also been illustrated in Europe: BLEICHRODT et al. have shown that the frequency distribution of IQ in clinically euthyroid schoolchildren was also shifted towards low values in children from an iodine deficient area as compared to controls born in the same type of rural villages but without iodine deficiency. This mental deficit was only partly corrected 32 months after the oral administration of iodized oil in the iodine deficient children. Consequently, the deficit is due partly to hypothyroidism occurring in early life. Similar results were reported by others... [Pg.205]

Some industrial applications involve the concept outlined here. The basic idea is to test whether or not a group of observations follows a preconceived distribution. In the case cited, the distribution is uniform; i.e., each face value should tend to occur with the same frequency. [Pg.499]
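A standard way to run such a test is a chi-square goodness-of-fit comparison of observed and expected frequencies; the sketch below assumes SciPy is available and uses hypothetical die-face counts.

```python
# Chi-square goodness-of-fit test: do the observed face counts follow the
# preconceived uniform distribution? Hypothetical counts for faces 1-6.
from scipy.stats import chisquare

observed = [18, 22, 16, 25, 19, 20]
expected = [sum(observed) / 6] * 6          # uniform: equal expected frequency

stat, p_value = chisquare(observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
# A large p-value means the counts are consistent with the uniform distribution.
```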

This part of the book reviews and develops quantitative methods for the analysis of hazard conditions in terms of the frequency of occurrence of unfavorable consequences. Uncertainty characterizes not only the transformation of a hazard into an accident, disaster, or catastrophe, but also the effects of such a transformation. Measurement of uncertainty falls within the purview of mathematical probability. Accordingly, Chapter 19 presents fundamental concepts and theorems of probability used in risk assessment. Chapter 20 discusses special probability distributions and techniques pertinent to risk assessment, and Chapter 21 presents actual case studies illustrating techniques in hazard risk assessment that use probability concepts, theorems, and special distributions. [Pg.539]

Reactions in solution proceed in a similar manner, by elementary steps, to those in the gas phase. Many of the concepts, such as reaction coordinates and energy barriers, are the same. The two theories for elementary reactions have also been extended to liquid-phase reactions. The TST naturally extends to the liquid phase, since the transition state is treated as a thermodynamic entity. Features not present in gas-phase reactions, such as solvent effects and activity coefficients of ionic species in polar media, are treated as for stable species. Molecules in a liquid are in an almost constant state of collision so that the collision-based rate theories require modification to be used quantitatively. The energy distributions in the jostling motion in a liquid are similar to those in gas-phase collisions, but any reaction trajectory is modified by interaction with neighboring molecules. Furthermore, the frequency with which reaction partners approach each other is governed by diffusion rather than by random collisions, and, once together, multiple encounters between a reactant pair occur in this molecular traffic jam. This can modify the rate constants for individual reaction steps significantly. Thus, several aspects of reaction in a condensed phase differ from those in the gas phase ... [Pg.146]
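As an order-of-magnitude illustration of the diffusion-governed encounter frequency mentioned above (not a calculation taken from the text), the standard Smoluchowski expression k_diff = 4π(D_A + D_B)(r_A + r_B)N_A gives the diffusion-limited rate constant; the diffusion coefficients and radii below are typical assumed values for small molecules in water.

```python
# Smoluchowski estimate of a diffusion-limited rate constant in solution.
# All numerical inputs are typical assumed values, not data from the text.
import math

N_A = 6.022e23            # Avogadro's number, 1/mol
D_A = D_B = 1.0e-9        # diffusion coefficients, m^2/s (typical in water)
r_A = r_B = 0.3e-9        # encounter radii, m

k_diff = 4 * math.pi * (D_A + D_B) * (r_A + r_B) * N_A   # m^3 mol^-1 s^-1
k_diff_per_molar = k_diff * 1000.0                        # L mol^-1 s^-1

print(f"k_diff ~ {k_diff_per_molar:.1e} L mol^-1 s^-1")   # around 1e10, the diffusion limit
```

Rate constants near this value indicate that the observed kinetics are controlled by the molecular traffic rather than by the intrinsic chemistry of the elementary step.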

According to the model, a perturbation at one site is transmitted to all the other sites, but the key point is that the propagation occurs via all the other molecules as a collective process as if all the molecules were connected by a network of springs. It can be seen that the model stresses the concept, already discussed above, that chemical processes at high pressure cannot be simply considered mono- or bimolecular processes. The response function χ representing the collective excitations of molecules in the lattice may be viewed as an effective mechanical susceptibility of a reaction cavity subjected to the mechanical perturbation produced by a chemical reaction. It can be related to measurable properties such as elastic constants, phonon frequencies, and Debye-Waller factors and therefore can in principle be obtained from the knowledge of the crystal structure of the system of interest. A perturbation of chemical nature introduced at one site in the crystal (product molecules of a reactive process, ionized or excited host molecules, etc.) acts on all the surrounding molecules with a distribution of forces in the reaction cavity that can be described as a chemical pressure. [Pg.168]

Bias The systematic or persistent distortion of an estimate from the true value. From sampling theory, bias is a characteristic of the sample estimator of the sufficient statistics for the distribution of interest. Therefore, bias is not a function of the data, but of the method for estimating the population statistics. For example, the method for calculating the sample mean of a normal distribution is an unbiased estimator of the true but unknown population mean. Statistical bias is not a Bayesian concept, because Bayes' theorem does not rely on the long-term frequency expectations of sample estimators. [Pg.177]
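The point that bias belongs to the estimation method rather than to the data can be checked by simulation. The sketch below contrasts the (n − 1)-divisor sample variance, which is unbiased, with the n-divisor version, which is not; the sample mean used in both is itself unbiased, as the excerpt notes. The parameter values are arbitrary assumptions.

```python
# Bias is a property of the estimator, not the data: for the same normal samples,
# the (n-1)-divisor variance is unbiased while the n-divisor variance is biased low.
import random
import statistics

mu, sigma, n, trials = 0.0, 2.0, 5, 100_000
biased, unbiased = [], []

for _ in range(trials):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    m = statistics.mean(x)
    ss = sum((xi - m) ** 2 for xi in x)
    biased.append(ss / n)           # divides by n     -> systematically low
    unbiased.append(ss / (n - 1))   # divides by n - 1 -> unbiased

print(f"true variance         : {sigma**2:.3f}")
print(f"mean of n-divisor     : {statistics.mean(biased):.3f}")
print(f"mean of (n-1)-divisor : {statistics.mean(unbiased):.3f}")
```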

Blass (1976a) and Blass and Halsey (1981) discuss data acquisition for a continuous scanning spectrometer in detail. The principal concept is that as a system scans a spectral line at some rate, the resulting time-varying signal will have a distribution of frequency components in the Fourier domain. [Pg.170]
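A minimal sketch of this idea, with assumed scan parameters rather than values from Blass's treatment, is to synthesize the time-varying signal produced by scanning a Gaussian line at a constant rate and inspect its Fourier-domain content, which sets the required data-acquisition bandwidth.

```python
# Scanning a Gaussian spectral line at a constant rate gives a time-varying
# signal whose information occupies a band of Fourier-domain frequencies.
# All parameters are assumed, illustrative values.
import numpy as np

scan_rate = 0.5        # spectral units per second (assumed)
line_width = 0.2       # Gaussian line width, spectral units (assumed)

t = np.linspace(0.0, 20.0, 2048)          # time axis of the scan, s
position = scan_rate * t - 5.0            # spectral position relative to line center
signal = np.exp(-0.5 * (position / line_width) ** 2)   # detector signal vs time

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])          # Fourier frequencies, Hz

cumulative = np.cumsum(spectrum) / spectrum.sum()
f99 = freqs[np.searchsorted(cumulative, 0.99)]
print(f"99% of the signal power lies below {f99:.2f} Hz")
```

Faster scanning or a narrower line shifts this power to higher frequencies, which is what constrains the data-acquisition rate.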

