
Location measures statistical

A very robust location measure is the median. For an odd number of values, the median is the middle order statistic, which lies at position (n + 1)/2. For an even number of measurements, the median is calculated as the average of the (n/2)th and (n/2 + 1)th order statistics. The median is insensitive to outliers, which is an advantage over the arithmetic mean. [Pg.22]
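These order-statistic rules can be sketched in Python (a minimal illustration; the sample values are made up, with one deliberate outlier):

```python
import statistics

def median_by_order_statistics(values):
    """Median via the order-statistic rules described above."""
    x = sorted(values)
    n = len(x)
    if n % 2 == 1:
        # odd n: the middle order statistic, at position (n + 1)/2
        return x[(n + 1) // 2 - 1]
    # even n: average of the (n/2)th and (n/2 + 1)th order statistics
    return (x[n // 2 - 1] + x[n // 2]) / 2

odd_sample = [4.1, 3.9, 4.0, 12.7, 4.2]   # 12.7 is an outlier
even_sample = [4.1, 3.9, 4.0, 4.2]

print(median_by_order_statistics(odd_sample))   # 4.1, unaffected by the outlier
print(median_by_order_statistics(even_sample))  # agrees with statistics.median
```

Note that the arithmetic mean of the odd sample is pulled up to about 5.8 by the single outlier, while the median stays at 4.1.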

Statistical Process Control. A properly running production process is characterized by the random variation of the process parameters for a series of lots or measurements. The SPC approach is a statistical technique used to monitor variation in a process. If the variation is not random, action is taken to locate and eliminate the cause of the lack of randomness, returning the process or measurement to a state of statistical control, i.e., of exhibiting only random variation. [Pg.366]
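A minimal sketch of the SPC idea in Python, assuming Shewhart-style mean ± 3·SD limits derived from an in-control baseline period (the lot assay values are hypothetical):

```python
import statistics

def control_limits(baseline, n_sigma=3.0):
    """Control limits from a reference period known to be in control."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    return mean - n_sigma * sd, mean + n_sigma * sd

def out_of_control(measurements, limits):
    """Indices of measurements that fall outside the control limits."""
    lcl, ucl = limits
    return [i for i, x in enumerate(measurements) if not lcl <= x <= ucl]

# Hypothetical in-control baseline lots, then three new lots to check
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
limits = control_limits(baseline)
print(out_of_control([10.1, 10.5, 9.9], limits))  # [1] -- lot 1 is non-random
```

Setting the limits from a baseline period, rather than from the data being judged, is what keeps a single aberrant lot from inflating the limits and masking itself.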

Although energy resolution is rarely employed in positron camera systems, scatter is not normally a problem. This is because of the very short time window within which two photons must arrive in order to be counted. At low decay rates, the incidence of accidental events is very low, rising only slightly for those that occur as a result of scatter. Some systems employ time-of-flight measurement of the time difference between the arrival of the two photons to obtain additional information about the location of an annihilation along the line. This has been used to improve resolution and statistical accuracy. Resolution is in the range of 3-4 mm and is less dependent on position than in SPECT (16). [Pg.482]
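The time-of-flight localization follows the relation Δx = c·Δt/2: the arrival-time difference places the annihilation at an offset Δx from the midpoint of the line joining the two detectors. (The 100 ps timing resolution used below is an assumed figure for illustration, not one quoted in the source.)

```python
C = 2.998e8  # speed of light, m/s

def tof_offset(delta_t):
    """Offset of the annihilation point from the midpoint of the line
    of response, given the photon arrival-time difference in seconds."""
    return C * delta_t / 2.0

# An assumed 100 ps timing resolution localizes the event to ~1.5 cm
print(tof_offset(100e-12))  # ~0.015 m
```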

In general, air quality data are classified as a function of time, location, and magnitude. Several statistical parameters may be used to characterize a group of air pollution concentrations, including the arithmetic mean, the median, and the geometric mean. These parameters may be determined over averaging times of up to 1 year. In addition to these three parameters, a measure of the variability of a data set, such as the standard deviation... [Pg.226]
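The three location parameters named above, plus the standard deviation as a measure of variability, can be computed directly with Python's statistics module (the concentration values are invented for illustration):

```python
import statistics

# Hypothetical 24-h average concentrations (ug/m3) at one location
conc = [12.0, 18.0, 9.0, 30.0, 15.0, 22.0, 11.0]

print(statistics.fmean(conc))           # arithmetic mean
print(statistics.median(conc))          # median
print(statistics.geometric_mean(conc))  # geometric mean
print(statistics.stdev(conc))           # standard deviation (variability)
```

For positively skewed data such as pollutant concentrations, the geometric mean and median fall below the arithmetic mean, which is why all three are reported.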

Because X-ray counting rates are relatively low, it typically requires 100 seconds or more to accumulate adequate counting statistics for a quantitative analysis. As a result, the usual strategy in applying electron probe microanalysis is to make quantitative measurements at a limited collection of points. Specific analysis locations are selected with the aid of a rapid imaging technique, such as an SEM image prepared with backscattered electrons, which are sensitive to compositional variations, or with the associated optical microscope. [Pg.187]

With respect to sampling, sufficient numbers of environmental samples should be obtained to permit reliable statistical and biologic interpretation of results. At the same time, the samples collected should be from environmental locations where human exposure is most likely to occur (or did occur, if questions of past exposures require assessment). They should also be targeted for those environmental media which can be expected to have the greatest potential for human exposure and absorption. Finally, the samples must be obtained and preserved so that the chemicals which pose the greatest threat to human health in terms of toxicity and tissue persistence can be accurately measured. [Pg.12]

Statistical studies of MALDI MS applied to bacterial samples show that some biomarker peaks are highly reproducible and appear very consistently, while others appear much less reliably.17-19 Jarman et al.20 and Wahl et al.21 propose a probability model for MALDI signatures that takes into account the variability in appearance of biomarker peaks. This method constructs MALDI reference signatures from the set of peak locations for reproducible biomarker peaks, along with a measure of the reproducibility of each peak. [Pg.157]

All the viscoelastic measurements were carried out in the Rheometrics Dynamic Spectrometer RDS-770 at a frequency of 1 Hz, a strain of 0.1%, and a temperature range of -140° to 140°C incremented every 2 degrees. A Texas Instruments Silent 700 terminal was tapped to provide a hookup to an IBM 308X mainframe computer located some miles away. The output of the Rheometrics unit was converted to a data file to be used in conjunction with SAS (1). All statistical manipulations, software developments, and the necessary graphics that are reported here were carried out with the aid of SAS. [Pg.77]

The value placed on efficiency and predictability, and the institutional pressures for cost-containment, accountability and measurability are enhancing the appeal of reductionist theories. They fit with the tendency to locate social problems in individual pathology. They suit the actuarial mentality that places faith in statistical information as a means to predict and minimize future risk.7 Genetic and evolutionary explanations have become a way to address the issues that trouble society - the perceived decline of the family, the problems of crime and persistent poverty, changes in the ethnic structure of the population, and the pressures on public schools. [Pg.307]

The macroscopic property of interest, e.g., heat of vaporization, is represented in terms of some subset of the computed quantities on the right side of Eq. (3.7). The latter are measures of various aspects of a molecule's interactive behavior, with all but surface area being defined in terms of the electrostatic potential computed on the molecular surface. V_S,max and V_S,min, the most positive and most negative values of V(r) on the surface, are site-specific: they indicate the tendencies and most favorable locations for nucleophilic and electrophilic interactions. In contrast, Π, σ²_tot and ν are statistically-based global quantities, which are defined in terms of the entire molecular surface. Π is a measure of local polarity, σ²_tot indicates the degree of variability of the potential on the surface, and ν is a measure of the electrostatic balance between the positive and negative regions of V(r) (Murray et al. 1994; Murray and Politzer 1994). [Pg.71]
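These statistically-based surface quantities can be sketched as follows, assuming the surface potential is available as a list of sampled values. This illustrates the published definitions Π = mean|V − V̄|, σ²_tot = σ²₊ + σ²₋ and ν = σ²₊σ²₋/(σ²_tot)², and is not the authors' code; the sample point set is invented.

```python
from statistics import fmean

def surface_statistics(v):
    """Pi, sigma^2_tot and nu from values of V(r) sampled over a
    molecular surface (v is an illustrative list of potentials)."""
    vbar = fmean(v)
    pi = fmean(abs(x - vbar) for x in v)      # local polarity, Pi
    pos = [x for x in v if x > 0]
    neg = [x for x in v if x < 0]
    var_pos = fmean((x - fmean(pos)) ** 2 for x in pos)
    var_neg = fmean((x - fmean(neg)) ** 2 for x in neg)
    s2_tot = var_pos + var_neg                # total variance
    nu = var_pos * var_neg / s2_tot ** 2      # balance; at most 0.25
    return pi, s2_tot, nu

print(surface_statistics([-2.0, -1.0, 1.0, 2.0]))
```

ν reaches its maximum of 0.25 only when the positive and negative variances are equal, i.e., when the surface is electrostatically balanced.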

Student's t-test (due to W. S. Gosset) is useful for comparisons of the means and standard deviations of different analytical test methods. Descriptions of the theory and use of this statistic are readily available in standard statistical texts, including those in the references [1-6]. This test indicates whether the difference between a set of measurements and the true (known) value for those measurements is statistically meaningful. In Table 36-1, METHOD B test results for each of the locations are compared to the known spiked analyte value for each sample. This statistical test indicates that METHOD B results are lower than the known analyte values for Sample No. 5 (Lab 1 and Lab 2) and Sample No. 6 (Lab 1). The METHOD B reported value is higher for Sample No. 6 (Lab 2). Average results for this test indicate that METHOD B may produce analytical values trending lower than actual values. [Pg.183]
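A minimal sketch of the one-sample comparison described here, computing t = (x̄ − μ₀)/(s/√n) against a known spiked value (the measurement values below are illustrative, not taken from Table 36-1):

```python
import math
from statistics import fmean, stdev

def one_sample_t(measurements, known_value):
    """t statistic comparing a set of measurements to a known value:
    t = (mean - known) / (s / sqrt(n))."""
    n = len(measurements)
    return (fmean(measurements) - known_value) / (stdev(measurements) / math.sqrt(n))

# Hypothetical replicate results versus a spiked value of 10.0
meas = [9.2, 9.5, 9.1, 9.4, 9.3]
t = one_sample_t(meas, 10.0)
print(t)  # large negative t: the method reads low
```

|t| is then compared with a tabulated two-sided critical value (about 2.78 for 4 degrees of freedom at the 95% level); here the low bias would be judged statistically meaningful.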

Extensive channeling measurements on 2H implanted into silicon have been published by Bech Nielsen (1988). These measurements also use the 3He-induced nuclear reaction in conjunction with extensive modeling using the statistical equilibrium model already described. The 2H implants were done at 30 K, and lattice location of the 2H was done as a function of annealing. [Pg.220]

Descriptive statistics are used to summarize the general nature of a data set. As such, the parameters describing any single group of data have two components. One of these describes the location of the data, while the other gives a measure of the dispersion of the data in and about this location. Often overlooked is the fact that the choice of which parameters are used to give these pieces of information implies a particular type of distribution for the data. [Pg.871]

The use of the mean with either the SD or SEM implies, however, that we have reason to believe that the sample of data being summarized is from a population that is at least approximately normally distributed. If this is not the case, then we should instead use statistical descriptors that do not require a normal distribution: the median, for location, and the semiquartile distance, for a measure of dispersion. These somewhat less familiar parameters are characterized as follows.
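The median and semiquartile distance (half the interquartile range) can be computed as a pair, as in this minimal sketch. Note that statistics.quantiles uses the "exclusive" quartile convention by default; other conventions give slightly different values for small samples.

```python
import statistics

def robust_summary(data):
    """Median (location) and semiquartile distance (dispersion),
    for data that may not be normally distributed."""
    q1, _, q3 = statistics.quantiles(data, n=4)  # default: 'exclusive'
    return statistics.median(data), (q3 - q1) / 2

print(robust_summary([1, 2, 3, 4, 5, 6, 7]))  # (4, 2.0)
```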

THE SYMPOSIUM UPON WHICH THIS VOLUME is based was organized originally because of the perpetual need to better formalize both understanding and error in the analytical methods used in quantitative analytical work. In this field, problem areas occur in sampling, recovery, and quantitative measurement. These analyses involve the production of numbers or data that describe quantitatively the system under scrutiny. Those who have been a part of this process know the locations of the various errors and have some idea of the size of the error. They may even run appropriate statistical tests to quantitatively determine the amount of error. [Pg.291]

Ozone and ozone precursor concentrations at nonurban locations in the eastern United States were studied extensively. The three parts of the study were field measurements, a quality assurance program, and an airborne monitoring program. The main objective of the study was to establish a data base for nonurban ozone and precursor concentrations. Simultaneous statistical summaries of the concentrations of nitrogen dioxide and nonmethane hydrocarbons were also provided. Another objective was to search for relationships between ozone concentrations and nitrogen dioxide and nonmethane hydrocarbon concentrations. [Pg.147]

The idea behind measures of location and central tendency is contained within the notion of the average. Three summary statistics are commonly used for describing this aspect of a set of data: the arithmetic mean (normally shortened to the mean), the mode, and the median.

To test the applicability of statistical techniques for determination of the species contributions to the scattering coefficient, a one-year study was conducted in 1979 at China Lake, California. Filter samples of aerosol particles smaller than 2 μm aerodynamic diameter were analyzed for total fine mass, major chemical species, and the time-average particle absorption coefficient, b_ap. At the same time and location, the particle scattering coefficient b_sp was measured with a sensitive nephelometer. A total of 61 samples were analyzed. Multiple regression analysis was applied to the average particle scattering coefficient and the mass concentrations for each filter sample to estimate the mass scattering efficiency α_i of each species and each species' contribution to light scattering, b_sp,i. Supplementary measurements of the chemical size distribution were used for theoretical estimates of each b_sp,i as a test of the effectiveness of the statistical approach. [Pg.128]
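The regression step can be sketched with made-up numbers for two species, fitting b_sp = α₁m₁ + α₂m₂ through the origin by solving the 2×2 normal equations (the study's actual species set and data are not reproduced here):

```python
def mass_scattering_efficiencies(b_sp, m1, m2):
    """Least-squares fit of b_sp = a1*m1 + a2*m2 (through the origin),
    solving the 2x2 normal equations by Cramer's rule. a_i is the
    fitted mass scattering efficiency of species i, and a_i * m_i its
    contribution to the scattering coefficient."""
    s11 = sum(x * x for x in m1)
    s22 = sum(x * x for x in m2)
    s12 = sum(x * y for x, y in zip(m1, m2))
    t1 = sum(x * y for x, y in zip(m1, b_sp))
    t2 = sum(x * y for x, y in zip(m2, b_sp))
    det = s11 * s22 - s12 * s12
    a1 = (t1 * s22 - t2 * s12) / det
    a2 = (s11 * t2 - s12 * t1) / det
    return a1, a2

# Synthetic data generated with a1 = 3, a2 = 1; the fit recovers them
print(mass_scattering_efficiencies(
    [5.0, 7.0, 13.0, 15.0], [1.0, 2.0, 3.0, 4.0], [2.0, 1.0, 4.0, 3.0]))
```

With real data the fit is not exact, and the residual variance indicates how much of b_sp the chosen species leave unexplained.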

In summary, the variance in the measured fine-particle scattering coefficient b_sp was dominated by sulfate concentrations. Organics and crustal species were much less important statistically. The inferred mass scattering efficiency for sulfates was intermediate between values reported for other desert locations and for Los Angeles. The statistical results for China Lake are quite similar to those obtained by other investigators at other locations, even though only the fine aerosol was sampled at China Lake. [Pg.146]

