Big Chemical Encyclopedia


Sampling, probability concepts

The relative error is the absolute error divided by the true value; it is usually expressed as a percentage or in parts per thousand. The true or absolute value of a quantity cannot be established experimentally, so the observed result must be compared with the most probable value. With pure substances the quantity will ultimately depend upon the relative atomic masses of the constituent elements. Determinations of relative atomic mass have been made with the utmost care, and the accuracy obtained usually far exceeds that attained in ordinary quantitative analysis; the analyst must accordingly accept their reliability. With natural or industrial products, we must provisionally accept the results obtained by analysts of repute using carefully tested methods. If several analysts determine the same constituent in the same sample by different methods, the most probable value, which is usually the average, can be deduced from their results. In both cases, establishing the most probable value involves the application of statistical methods and the concept of precision. [Pg.134]
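A minimal sketch of the idea above: the relative error of a single result against the most probable value, taken here as the average of several analysts' results. All numerical values are invented for illustration.

```python
# Hypothetical illustration: relative error of one result against the
# most probable value (the mean of several analysts' results).
def relative_error(observed, reference):
    """Relative error as a fraction of the reference value."""
    return (observed - reference) / reference

# Results (percent analyte) reported by four analysts; values are invented.
results = [20.15, 20.22, 20.18, 20.25]
most_probable = sum(results) / len(results)   # the average, per the text

err = relative_error(20.35, most_probable)
print(f"most probable value: {most_probable:.3f}")
print(f"relative error: {err * 100:.2f}% ({err * 1000:.1f} parts per thousand)")
```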

Percolation theory [5, 20-23] provides the most adequate description of an abstract model of the CPCM. Since the majority of polymers are typical insulators, the probability of transfer of current carriers between two conductive points isolated from each other by a polymer interlayer decreases exponentially with the width of the gap lg (the tunnel effect) and is nonzero only for lg < 100 Å. For this reason, current can be transferred over macroscopic distances (compared to the sample size) only via chains of contacting particles. Calculating the probability of the formation of such chains is the subject of percolation theory. It should be noted that "contact" here does not refer only to particles touching each other directly; apparently it also implies convergence of the particles to distances at which the probability of carrier transfer between them becomes nonzero. [Pg.129]
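The exponential decay of the transfer probability with gap width can be sketched as below. The characteristic decay length is an illustrative assumption, not a value given in the text; only the functional form (exponential decay, effectively zero beyond ~100 Å) comes from the passage.

```python
import math

# Sketch of the tunnelling picture: carrier-transfer probability decays
# exponentially with the gap width l_g. The decay length is an assumed
# illustrative value, not one taken from the text.
DECAY_LENGTH_A = 10.0   # assumed characteristic tunnelling length, angstroms

def transfer_probability(gap_angstrom, p0=1.0):
    """Relative probability of carrier transfer across a gap (arbitrary units)."""
    return p0 * math.exp(-gap_angstrom / DECAY_LENGTH_A)

for gap in (5, 20, 50, 100):
    print(f"l_g = {gap:>3} A -> relative probability {transfer_probability(gap):.2e}")
```

With these assumed numbers the probability at 100 Å is already about four orders of magnitude below its contact value, consistent with the text's statement that transfer is effectively confined to small gaps.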

A central concept of statistical analysis is variance [105], which is simply the average squared deviation from the mean, or the square of the standard deviation. Since the analyst can take only a limited number n of samples, the variance is estimated as the sum of squared deviations from the mean divided by n - 1. Analysis of variance asks whether groups of samples are drawn from the same overall population or from different populations [105]. The simplest example of analysis of variance is the F-test (and the closely related t-test), in which one takes the ratio of two variances and compares the result with tabulated values to decide whether it is probable that the two samples came from the same population. Linear regression is also a form of analysis of variance, since one is asking whether the variance around the mean is equivalent to the variance around the least-squares fit. [Pg.34]
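The n - 1 variance estimate and the F-ratio described above can be sketched with the Python standard library (whose `statistics.variance` uses the n - 1 denominator); the data sets are invented for illustration.

```python
import statistics

# Two invented data sets, e.g. replicate determinations by two analysts.
a = [5.1, 4.9, 5.3, 5.0, 5.2]
b = [5.0, 5.4, 4.8, 5.6, 4.7]

var_a = statistics.variance(a)   # sum of squared deviations / (n - 1)
var_b = statistics.variance(b)

# F is conventionally the larger variance over the smaller; the result is
# then compared with a tabulated critical value for the two sample sizes.
F = max(var_a, var_b) / min(var_a, var_b)
print(f"s_a^2 = {var_a:.4f}, s_b^2 = {var_b:.4f}, F = {F:.2f}")
```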

Another important advantage of sample truncation is that it can reduce nuisance correlations. The concept of nuisance correlation is explained later in this section, but we can also discuss this concept here in terms of taxonic and continuous variance. Consider that scores on real world indicators probably reflect both taxonic and continuous variance, and the presence of the latter can skew the results. Sample truncation essentially removes a big chunk of continuous variance, which can improve the accuracy of the results. It is, of course, important to ensure that taxonic variance stays intact. Almost any screener would screen out some taxon members, but our experiences with sample truncation suggest that it is usually possible to find a cutoff that removes a substantial number of nontaxon members without losing many taxon members. [Pg.41]

As soon as observations are considered as samples of random variables, we must redefine the concepts of distance and projection. Let us consider in three-dimensional space a vector y of one observation of three random variables Y1, Y2, and Y3, with its probability density function fy. The statistical distance c from the vector y to another point can be defined by the non-negative scalar c2, which has already been met a few times, e.g., in equations (5.2.1) and (5.3.7), and such that... [Pg.284]
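A minimal numerical sketch of such a statistical distance, assuming the common Mahalanobis-type form c2 = (y - mu)^T S^-1 (y - mu) for a mean vector mu and covariance matrix S (the excerpt cuts off before giving its own equation, so this form and all numbers are assumptions for illustration):

```python
import numpy as np

# Assumed statistical distance: c^2 = (y - mu)^T S^{-1} (y - mu).
# Mean, covariance, and observation are invented for illustration.
mu = np.array([1.0, 2.0, 3.0])          # assumed mean of (Y1, Y2, Y3)
S = np.diag([0.25, 1.0, 4.0])           # assumed diagonal covariance matrix
y = np.array([1.5, 2.0, 1.0])           # one observation vector

d = y - mu
c2 = d @ np.linalg.inv(S) @ d           # non-negative scalar c^2
print(f"c^2 = {c2:.2f}")
```

Note how each squared deviation is weighted by the inverse variance of the corresponding variable, so a deviation counts for less along directions where the random variable scatters more.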

The classical, frequentist approach in statistics requires the concept of the sampling distribution of an estimator. In classical statistics, a data set is commonly treated as a random sample from a population. Of course, in some situations the data actually have been collected according to a probability-sampling scheme. Whether that is the case or not, processes generating the data will be subject to stochasticity and variation, which is a source of uncertainty in use of the data. Therefore, sampling concepts may be invoked in order to provide a model that accounts for the random processes, and that will lead to confidence intervals or standard errors. The population may or may not be conceived as a finite set of individuals. In some situations, such as when forecasting a future value, a continuous probability distribution plays the role of the population. [Pg.37]
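The route from sampling concepts to a standard error and confidence interval can be sketched as follows, using a large-sample normal approximation; the measurements are invented for illustration.

```python
import math
import statistics

# Treating the data as a random sample: estimated standard error of the
# mean and a normal-approximation 95% confidence interval.
# The measurements are invented for illustration.
data = [9.8, 10.2, 10.1, 9.9, 10.4, 9.6, 10.0, 10.2]
n = len(data)
mean = statistics.mean(data)
se = statistics.stdev(data) / math.sqrt(n)     # estimated standard error

lo, hi = mean - 1.96 * se, mean + 1.96 * se    # large-sample 95% interval
print(f"mean = {mean:.3f}, SE = {se:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

For small n one would replace 1.96 with the appropriate t-distribution quantile for n - 1 degrees of freedom.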

Unpredictable Peak Exposures. When periods of peak exposure cannot be predicted, the concept of an absolute ceiling limit becomes naive, because for any lognormal distribution of sample values (σg > 1) there is always a probability (however remote) that the limit will be exceeded. It is suggested, therefore, that in these situations acute-exposure limits be interpreted as air concentrations which should be exceeded only rarely, perhaps 5% of the time. This does not imply that it would be acceptable to... [Pg.442]
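The point about a nonzero exceedance probability can be made concrete: for a lognormal exposure distribution, the probability of exceeding any finite limit follows from the geometric mean and geometric standard deviation. The geometric mean, GSD, and limit below are invented for illustration.

```python
import math

# For a lognormal distribution with geometric mean gm and geometric
# standard deviation gsd (> 1), P(X > limit) is the upper tail of a
# standard normal at z = (ln limit - ln gm) / ln gsd.
def lognormal_exceedance(limit, gm, gsd):
    """Probability that a lognormal variate exceeds `limit`."""
    z = (math.log(limit) - math.log(gm)) / math.log(gsd)
    return 0.5 * math.erfc(z / math.sqrt(2))   # upper-tail normal probability

# Assumed geometric mean 10 ppm, geometric SD 2.0, ceiling limit 30 ppm.
p = lognormal_exceedance(30.0, 10.0, 2.0)
print(f"probability of exceeding the limit: {p:.3%}")
```

With these assumed numbers the exceedance probability comes out near 5-6%, of the same order as the "exceeded only rarely, perhaps 5% of the time" interpretation suggested in the text; for any finite limit the probability is strictly positive, which is the text's point about absolute ceilings.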

The dielectric relaxation of bulk mixtures of poly(2,6-dimethylphenylene oxide) and atactic polystyrene has been measured as a function of sample composition, frequency, and temperature. The results are compared with earlier dynamic mechanical and (differential scanning) calorimetric studies of the same samples. It is concluded that the polymers are miscible, but probably not at a segmental level. A detailed analysis suggests that the particular samples investigated may be considered in terms of a continuous phase-dispersed phase concept, in which the former is a PS-rich and the latter a PPO-rich material, except for the sample containing 75% PPO-25% PS, in which the converse is postulated. [Pg.42]

With the availability of some 50 sets of handblanks (environmental-natural levels of Ba and Sb on hands), firing tests, and calibrations, we considered a different concept for the interpretation of the results. The evaluation consisted of two steps: 1) establishing that the Ba and Sb values of handblanks of the accumulated population sample followed a normal (Gaussian) distribution, as statistically approximated by the t-Distribution, and 2) utilization of relatively simple statistical formalism for the calculation of the probability that the amount of Ba and Sb found on a given swab belongs to the established handblank population. (An appendix at the end of the paper may be useful to readers not normally utilizing statistics.) [Pg.89]

There is no a priori reason to doubt that the Central Limit Theorem, and consequently the normal distribution concept, applies to trace element distributions, including Sb and Ba on hands in a human population, because these concentrations are affected by such random variables as location, diet, metabolism, and so on. However, since enough data were at hand (some 120 samples per element), it was of interest to test the normal distribution experimentally by examination of the t-Distribution. The probability density plots, in 0.2 and 3 ng increments for Sb and Ba, respectively, had similar appearances. The actual distribution test was carried out for Sb only, because the more convenient half-life of 122Sb gave better data. After normalization, a "one-tail" test was carried out. [Pg.91]
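The second step described above, testing whether a swab value belongs to the handblank population, can be sketched as a one-tailed t-style test. The handblank values and swab result below are invented for illustration; the paper's own formalism (in its appendix) is not reproduced here.

```python
import math
import statistics

# One-tailed test of whether the Sb found on a swab is consistent with
# the handblank population. All numbers are invented for illustration.
handblanks_ng = [2.1, 1.8, 2.5, 2.0, 1.6, 2.3, 1.9, 2.2]   # Sb, ng
swab_ng = 9.5

mean = statistics.mean(handblanks_ng)
sd = statistics.stdev(handblanks_ng)

# t-like statistic for a single new observation against the estimated
# population; the sqrt term accounts for uncertainty in the sample mean.
t = (swab_ng - mean) / (sd * math.sqrt(1 + 1 / len(handblanks_ng)))
print(f"handblank mean = {mean:.2f} ng, s = {sd:.2f} ng, t = {t:.1f}")
# A large positive t, compared against a one-tailed critical value for
# n - 1 degrees of freedom, argues that the swab value does not belong
# to the handblank population.
```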

