Big Chemical Encyclopedia


Intensive Statistical Methods

Estimation of any statistic, such as a model parameter or a sample mean, has some degree of uncertainty associated with it. Little confidence is placed in statistics with a high degree of uncertainty. This [Pg.353]

Parametric methods for estimating the confidence interval (CI) of a statistic require assuming a sampling distribution for the statistic and then some way to calculate the parameters of that distribution. For example, the sampling distribution of the sample mean is a normal distribution having mean μ, which is estimated by the sample mean x̄, and standard deviation equal to the standard error of the mean, SE(x̄), which is calculated using SE(x̄) = s/√n for sample standard deviation s and sample size n. [Pg.354]
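As a minimal illustration of this parametric approach (the data and the 1.96 z-multiplier for an approximate 95% interval are illustrative choices, not from the original text), a normal-theory CI for a mean can be computed from x̄ and SE(x̄):

```python
import math

def mean_ci(data, z=1.96):
    """Parametric ~95% CI for the mean, assuming the sampling
    distribution of x-bar is normal: x-bar +/- z * SE(x-bar)."""
    n = len(data)
    xbar = sum(data) / n
    # sample standard deviation with the (n - 1) denominator
    s = math.sqrt(sum((x - xbar) ** 2 for x in data) / (n - 1))
    se = s / math.sqrt(n)          # SE(x-bar) = s / sqrt(n)
    return xbar - z * se, xbar + z * se

lo, hi = mean_ci([4.8, 5.1, 5.0, 4.9, 5.2, 5.0])
```

For a rigorous small-sample interval one would replace the z-value with the appropriate t-quantile; the z-form is shown only to keep the sketch self-contained.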

Sometimes the distribution of the statistic must be derived under asymptotic or best-case conditions, which assume an infinite number of observations, as with the sampling distribution of a regression parameter, which is assumed to be normal. However, the asymptotic assumption of normality is not always valid. Further, sometimes the distribution of the statistic may not be known at all. For example, what is the sampling distribution for the ratio of the largest to the smallest value in some distribution? Parametric theory is not entirely forthcoming with an answer. The bootstrap and the jackknife, which are two types of computer-intensive analysis methods, can be used to assess the precision of a sample-derived statistic when its sampling distribution is unknown or when asymptotic theory may not be appropriate. [Pg.354]
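The bootstrap idea can be sketched in a few lines: resample the data with replacement, recompute the statistic each time, and read a CI off the empirical distribution of the replicates. The data and the max/min-ratio statistic below are hypothetical, and the percentile method shown is only one of several ways to form a bootstrap CI (the jackknife is the analogous leave-one-out scheme):

```python
import random

def bootstrap_ci(data, statistic, n_boot=5000, alpha=0.05, seed=1):
    """Percentile-bootstrap CI for a statistic whose sampling
    distribution is unknown (e.g. the max/min ratio)."""
    rng = random.Random(seed)
    n = len(data)
    boots = sorted(
        statistic([rng.choice(data) for _ in range(n)])  # one resample
        for _ in range(n_boot)
    )
    lo = boots[int((alpha / 2) * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

ratio = lambda xs: max(xs) / min(xs)
data = [2.1, 2.5, 3.0, 3.3, 4.2, 4.8, 5.0, 6.7]
lo, hi = bootstrap_ci(data, ratio)
```

Note that for extreme-value statistics like this one the simple percentile bootstrap can be biased; it is shown here only to illustrate the mechanics.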

As might be surmised, computer-intensive statistical analysis methods have become more popular and useful with the advent of modern personal computers having [Pg.354]

1 Many texts define the CI using some blend of the Bayesian and frequentist interpretations, although in practice most people use a Bayesian interpretation despite having used a frequentist method to calculate the CI. [Pg.354]


Urban Hjorth, J.S. (1994). Computer Intensive Statistical Methods. Chapman and Hall, London. [Pg.451]

More recently, statistical or direct methods have achieved a spectacular success. The basis of the method is that the lost information on the phase angle can be derived from the X-ray intensities. Two large structures thus solved by statistical methods are the dichloromethane–anthracene–tetracyanoethylene complex (4), namely (8C20H10N4·2CH2Cl2·C6N4), and 8,14-anhydrodigitoxigenin (5). [Pg.55]

Taste. Of the fundamental tastes, bitter is unique in showing human genetic differences in sensitivity. Six decades ago, it was reported that phenylthiocarbamide (PTC) tasted extremely bitter to some individuals while being almost tasteless to others (45). The ability to taste PTC was found to be a dominant genetic trait which occurs across gender, age and culture, with 70% of the American population carrying the dominant trait (46). Sensitivity to PTC and propylthiouracil (PROP) is correlated with sensitivity to other bitter-tasting compounds, such as caffeine, saccharin (after-taste) and salts of potassium cations and benzoate anions (47,48,49,50). However, in a reexamination of the sensitivity to NaCl and KCl, no differences were found between tasters and nontasters to non-PTC-type compounds, and the statistical methods that showed differences were questioned (51). Individuals who do not respond to PTC are not necessarily insensitive to quinine, another intensely bitter compound (49,50,52). [Pg.19]

During the last two or three decades, chemists became accustomed to using computers to control their instruments, develop analytical methods and analyse data and, consequently, to applying different statistical methods to explore multivariate correlations between one or more outputs (e.g. the concentration of an analyte) and a set of input variables (e.g. atomic intensities, absorbances). [Pg.244]
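A schematic of this kind of multivariate calibration, relating an output (an analyte concentration) to a set of input intensities by ordinary least squares; all numbers are invented, and real chemometric work would more often use PLS or PCR rather than plain least squares:

```python
import numpy as np

# Hypothetical calibration data: rows = samples, columns = measured
# intensities at three wavelengths; y = known analyte concentration.
X = np.array([[1.0, 0.2, 0.1],
              [2.0, 0.4, 0.2],
              [3.0, 0.5, 0.4],
              [4.0, 0.9, 0.3],
              [5.0, 1.0, 0.6]])
y = np.array([0.9, 2.1, 2.9, 4.2, 4.9])

# Add an intercept column and solve the multivariate least-squares
# problem y ~ X_aug @ beta.
X_aug = np.column_stack([np.ones(len(y)), X])
beta, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
y_hat = X_aug @ beta          # fitted concentrations
```

The fitted `beta` plays the role of the calibration model; predicting a new sample is just another matrix product with its intensity vector.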

Delaney [13] describes the solubilization mechanism as controlled by a double phenomenon: the affinity of the compound for itself and the affinity of the compound for the solvent. The latter effect is described either simply by the log P property or by very sophisticated methods such as statistical thermodynamics or quantum mechanical techniques. These computationally intensive methods have not yet proved their superiority over the simpler and faster methods that tend to mimic the successful log P fragment calculator. [Pg.58]

In the sections below a brief outline is given of how experimental data can be used to fit a response surface model and how statistical methods can be used to evaluate the results. The author's intention in this introductory chapter is to give the reader a feeling for how statistical tools can be used in an experimental context. Detailed descriptions follow in subsequent chapters. [Pg.50]
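One common way this works in practice can be sketched with a two-factor factorial design plus replicated centre points: the factorial runs fit a first-order model with interaction, and the centre points provide a simple evaluation of the fit (curvature check). All response values below are invented:

```python
import numpy as np

# Hypothetical 2^2 factorial design (coded factor levels) with three
# replicated centre points.
x1 = np.array([-1.0, 1.0, -1.0, 1.0])
x2 = np.array([-1.0, -1.0, 1.0, 1.0])
y_fact = np.array([52.0, 60.0, 55.0, 70.0])
y_centre = np.array([66.0, 65.0, 67.0])

# First-order response surface with interaction:
#   y = b0 + b1*x1 + b2*x2 + b12*x1*x2
X = np.column_stack([np.ones(4), x1, x2, x1 * x2])
b = np.linalg.solve(X.T @ X, X.T @ y_fact)

# Evaluate the model: curvature shows up as a difference between the
# centre-point mean and the factorial-point mean.
curvature = y_centre.mean() - y_fact.mean()
```

A large `curvature` relative to the replicate scatter suggests that a second-order model (with squared terms, requiring axial runs) is needed.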

Having collected all needed intensity data under the most favorable conditions possible, the crystallographer processes the data, applying absorption corrections and, if necessary, corrections for decomposition of the crystal, and arrives at a data set consisting of the values of |F(hkl)| or |F(hkl)|², either unscaled or with a rough scale factor calculated by statistical methods. Each datum should be accompanied by a standard deviation σ that represents random error (and possible random effects of systematic errors) as derived, for example, with Eq. (13). [Pg.175]

However, recently, the technique has come under intense scrutiny. Some scientists have questioned the statistical methods used to calculate the odds of the individuality of a DNA fingerprint, concerned that within particular ethnic groups the variability may be much lower than was previously claimed. Others argued that slightly different methodologies used in dif-... [Pg.743]

Some measure of the reliability of all these statistical methods can be obtained by rerunning the programs with different initial conditions. It emerges that in general the location of peaks in the reconstruction is well reproduced, but relative intensities can sometimes vary appreciably. The possibility of false or missing correlations suggests that, in principle, the aforementioned deterministic schemes may be preferable. [Pg.16]
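The rerunning check can be expressed schematically as follows. `reconstruct()` is a stand-in for a stochastic reconstruction program (not any real code from the source), with the reported behaviour, stable peak positions but variable relative intensities, built in purely for illustration; the point is the harness that compares repeated runs:

```python
import random

def reconstruct(seed):
    """Stand-in for a stochastic reconstruction run: returns a
    (peak_position, peak_intensity) pair for a hypothetical spectrum.
    Position is nearly seed-independent; intensity fluctuates."""
    rng = random.Random(seed)
    position = 5.0 + rng.gauss(0.0, 0.01)            # well reproduced
    intensity = 100.0 * (1.0 + rng.gauss(0.0, 0.2))  # varies appreciably
    return position, intensity

# Rerun with different initial conditions and compare the results.
runs = [reconstruct(seed) for seed in range(10)]
positions = [p for p, _ in runs]
intensities = [i for _, i in runs]
spread_pos = max(positions) - min(positions)
spread_int = max(intensities) - min(intensities)
```

A small spread in `positions` alongside a large spread in `intensities` is exactly the pattern the text describes: peak locations can be trusted, relative intensities less so.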

Other tests based on intensity statistics have been proposed (e.g. see Rees, 1980; Yeates, 1997 (www.doe-mbi.ucla.edu/Services/Twinning); Kahlenberg, 1999; Kahlenberg and Messner, 2001). Nevertheless, the use of the mean |E² − 1| is particularly simple, because it is a single number and is often calculated by data reduction and direct methods programs. [Pg.118]
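A sketch of computing the mean |E² − 1| statistic on simulated data (not from the source): for untwinned acentric data, the normalized intensities E² follow the Wilson (exponential, mean 1) distribution, giving a theoretical ⟨|E² − 1|⟩ of 2/e ≈ 0.736, while centric data give ≈ 0.968 and perfectly twinned acentric data give the lower value ≈ 0.541:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated untwinned acentric crystal: normalized intensities E^2
# drawn from the Wilson (exponential, mean 1) distribution.
E2 = rng.exponential(1.0, size=200_000)

# The single-number twinning indicator: mean |E^2 - 1|.
mean_stat = float(np.abs(E2 - 1.0).mean())
# Expected ~0.736 here; values well below that would hint at twinning.
```

In practice one computes the same mean over the experimentally normalized |E| values produced by the data-reduction program and compares it with these reference values.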

Measurement of the electrochemical current noise is aimed at correlating the observed current fluctuations with breakdown and repair events that might lead to the formation of stable growing pits [53, 54]. In view of this mechanistic interpretation, the application of statistical methods to the occurrence of current spikes and the observed probability of pit formation leads to a stochastic model for pit nucleation. The evaluation of current spikes in the time and frequency domains yields parameters such as the intensity of the stochastic process λ and the repassivation rate r [53]. These depend on parameters such as the potential, the state of the passive layer, and the concentration of aggressive anions. [Pg.335]
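A minimal sketch of the time-domain part of such an evaluation; the spike birth times and lifetimes are invented, and the estimators assume a homogeneous Poisson nucleation process and exponentially distributed spike lifetimes (a simplification of the stochastic models in [53]):

```python
import numpy as np

# Hypothetical spike events extracted from a current-time trace:
# birth times and lifetimes of metastable-pit current transients (s).
birth_times = np.array([1.2, 3.5, 4.1, 7.8, 9.0, 12.4, 15.1, 18.3])
lifetimes = np.array([0.20, 0.15, 0.25, 0.10, 0.30, 0.20, 0.15, 0.25])
T = 20.0  # total observation time, s

# Intensity of the stochastic pit-nucleation process (events per s),
# assuming a homogeneous Poisson process: lambda = N / T.
lam = len(birth_times) / T

# Repassivation rate: reciprocal of the mean spike lifetime, assuming
# exponential lifetimes (repassivation ends each transient).
r = 1.0 / float(lifetimes.mean())
```

Repeating the estimate at different potentials or anion concentrations then shows how λ and r depend on those conditions, as the text describes.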



© 2024 chempedia.info