
Statistics correlation coefficient

A linear correlation of the average values from the two testing machines showed excellent agreement. The calculated correlation equation is P = 0.08N + 6.9 (r = 0.999), where P is the value from the presently approved constant-stress testing machine, N the value from the new constant-strain machine, and r the statistical correlation coefficient. [Pg.8]

Table 2. Statistical Correlation Coefficients r for d0 (First Line of Each Row) and df...
Sources and paths responsible for vibration of a receiver frequently can be identified by correlation measurements. These involve simultaneous measurement of the vibrations at the receiver and a source (or path) and calculation of the corresponding statistical correlation coefficient or coherence. For sources or paths that contribute significantly to the vibration at the receiver, high correlation (or a coherence value near unity) is obtained, whereas for sources or paths that make insignificant contributions, low correlation is obtained. Most modern spectrum analyzers are equipped to carry out correlation or coherence calculations automatically. [Pg.441]
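
As an illustration of this kind of measurement (not taken from the source above), the magnitude-squared coherence between two simultaneously recorded signals can be estimated with scipy.signal.coherence; the signal names, sampling rate, and mixing model below are invented for the sketch.

```python
import numpy as np
from scipy.signal import coherence

# Two simultaneously sampled vibration signals: one at a suspected source,
# one at the receiver (synthetic data for illustration only).
fs = 2048.0                                   # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
source = rng.normal(size=t.size)              # vibration at the suspected source
receiver = 0.8 * source + 0.3 * rng.normal(size=t.size)  # receiver response

# Magnitude-squared coherence as a function of frequency (values in 0 ... 1).
f, Cxy = coherence(source, receiver, fs=fs, nperseg=1024)

# Coherence near unity indicates a significant source/path contribution.
print(f"mean coherence: {Cxy.mean():.2f}")
```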

The QSPR results for all combined energy, electronegativity, and chemical hardness methods are presented in Table 3.6, emphasizing both the degree of parabolic dependence and the customary statistical correlation coefficient (r). [Pg.381]

TABLE 3.38 The Parameters and Statistical Correlation Coefficients for the Residual-QSAR Algorithm of Eqs. (3.139) and (3.140), As Applied To the Molecules of Table 3.36 in All Possible Combinations of Variables (Putz, 2011a)... [Pg.398]

Correlation analysis reveals the interdependence between variables. The statistical measure of this interdependence is the correlation coefficient. [Pg.481]
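
As a minimal illustration of this measure (not from the cited source), the Pearson correlation coefficient can be computed directly from its definition, the covariance of the two variables divided by the product of their standard deviations:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient of two equally long samples."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    dx, dy = x - x.mean(), y - y.mean()
    return (dx @ dy) / np.sqrt((dx @ dx) * (dy @ dy))

# Made-up data: strongly interdependent variables give |r| close to 1.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
print(round(pearson_r(x, y), 4))
```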

The Hammett equation is said to be followed when a plot of log k against σ is linear. Most workers take as the criterion of linearity the correlation coefficient r, which is required to be at least 0.95 and preferably above 0.98. A weakness of r as a statistical measure of goodness of fit is that r is a function of the slope ρ: if the slope is zero, the correlation coefficient is zero. A slope of zero in an LFER is a chemically informative result, for it demonstrates an absence of a substituent... [Pg.318]
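
A sketch of applying this criterion to a Hammett plot is given below; the substituent constants σ and rate constants are invented, and the least-squares fit uses standard NumPy routines rather than anything prescribed by the source.

```python
import numpy as np

# Hypothetical Hammett series: substituent constants sigma and log k values.
sigma = np.array([-0.27, -0.17, 0.00, 0.23, 0.45, 0.78])
log_k = np.array([-3.10, -2.95, -2.70, -2.35, -2.05, -1.55])

# Least-squares line: log k = rho * sigma + intercept
rho, intercept = np.polyfit(sigma, log_k, 1)
r = np.corrcoef(sigma, log_k)[0, 1]

print(f"rho = {rho:.2f}, r = {r:.3f}")
# Common criterion: r >= 0.95, preferably > 0.98. Note the weakness described
# above: if rho were ~0, r would also be ~0 even for a chemically real result.
print("acceptable by the r criterion" if abs(r) >= 0.95 else "poor fit by the r criterion")
```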

Experience has shown that correlations of good precision are those for which f = SD/RMS ≤ 0.1, where SD is the root mean square of the deviations and RMS is the root mean square of the data. SD is a measure equal to, or approaching in the limit, the standard deviation in parameter-predetermined statistics, where a large number of data points determine a small number of parameters. In a few series, RMS is so small that even though SD appears acceptable, f values do exceed 0.1. Such sets are of little significance pro or con. Evidence has been presented (2p) that this simple f measure of statistical precision is more trustworthy in measuring the precision of structure-reactivity correlations than is the more conventional correlation coefficient. [Pg.16]
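
The following sketch illustrates the f = SD/RMS criterion as described above; the data are invented, and the exact normalisation conventions of the original authors may differ.

```python
import numpy as np

def precision_f(y_obs, y_calc):
    """f = SD / RMS: root-mean-square deviation of the correlation divided by
    the root mean square of the data (sketch of the criterion described above)."""
    y_obs = np.asarray(y_obs, dtype=float)
    y_calc = np.asarray(y_calc, dtype=float)
    sd = np.sqrt(np.mean((y_obs - y_calc) ** 2))
    rms = np.sqrt(np.mean(y_obs ** 2))
    return sd / rms

# Illustrative correlation: observed values and values calculated from a linear fit.
x = np.arange(5, dtype=float)
y_obs = np.array([-2.9, -2.4, -1.8, -1.1, -0.4])
slope, intercept = np.polyfit(x, y_obs, 1)
y_calc = slope * x + intercept

f = precision_f(y_obs, y_calc)
print(f"f = {f:.3f} -> {'good precision' if f <= 0.1 else 'poor precision'}")
```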

Several doubts about the correctness of the usual statistical treatment were expressed already in the older literature (31), and later, attention was called to large experimental errors (142) in ΔH and ΔS and their mutual dependence (143-145). The possibility of an apparent correlation due only to experimental error also was recognized and discussed (1, 2, 4, 6, 115, 116, 119, 146). However, the full danger of an improper statistical treatment was shown only by this reviewer (147) and by Petersen (148). The first correct statistical treatment of a special case followed (149) and provoked a brisk discussion in which Malawski (150, 151), Leffler (152, 153), Palm (3, 154, 155) and others (156-161) took part. Recently, the necessary formulas for a statistical treatment in common cases have been derived (162-164). The heart of the problem lies not in experimental errors, but in the a priori dependence of the correlated quantities, ΔH and ΔS. It is to be stressed in advance that in most cases, the correct statistical treatment has not invalidated the existence of an approximate isokinetic relationship; however, the slopes and especially the correlation coefficients reported previously are almost always wrong. [Pg.419]

The natural and correct form of the isokinetic relationship is eq. (13) or (13a). The plot of ΔH versus ΔG has slope β/(β − T), from which β is easily obtained. If a statistical treatment is needed, the common regression analysis can usually be recommended, with ΔG (or log K) as the independent and ΔH as the dependent variable, since errors in the former can be neglected. Then the overall fit is estimated by means of the correlation coefficient, and the standard deviation from the regression line reveals whether the correlation is fulfilled within the experimental errors. [Pg.453]
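
A sketch of this recommended procedure follows, assuming the slope relation m = β/(β − T), which rearranges to β = mT/(m − 1); the ΔH/ΔG values and the temperature are invented for illustration.

```python
import numpy as np

T = 298.15  # assumed temperature (K) to which the delta-G values refer

# Hypothetical reaction series: delta-G (independent) and delta-H (dependent), kJ/mol.
dG = np.array([60.0, 62.5, 65.0, 67.5, 70.0])
dH = np.array([48.0, 53.0, 57.5, 62.5, 67.0])

# Ordinary regression of delta-H on delta-G, as recommended above.
m, a = np.polyfit(dG, dH, 1)
r = np.corrcoef(dG, dH)[0, 1]
residual_sd = np.std(dH - (m * dG + a), ddof=2)

# Slope m = beta / (beta - T)  =>  isokinetic temperature beta = m * T / (m - 1)
beta = m * T / (m - 1)
print(f"slope = {m:.2f}, r = {r:.3f}, residual SD = {residual_sd:.2f}, beta = {beta:.0f} K")
```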

However, it is not proper to apply the regression analysis in the coordinates ΔH versus ΔS or ΔS versus ΔG, nor to draw lines in these coordinates. The reasons are the same as in Sec. IV.B, and the problem can likewise be treated as a coordinate transformation. Let us denote rGH as the correlation coefficient in the original (statistically correct) coordinates ΔH versus ΔG, in which sG and sH are the standard deviations of the two variables from their averages. After transformation to the coordinates TΔS versus ΔG or ΔH versus TΔS, the new correlation coefficients rGS and rSH, respectively, are given by the following equations. (The constant T has no effect on the correlation coefficient.)... [Pg.453]
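
The equations themselves are truncated in this excerpt. Under the identity TΔS = ΔH − ΔG they can be reconstructed as follows (a reconstruction from that identity, not a quotation of the source):

$$
r_{GS} = \frac{r_{GH}\, s_H - s_G}{\sqrt{s_G^2 + s_H^2 - 2\, r_{GH}\, s_G s_H}},
\qquad
r_{SH} = \frac{s_H - r_{GH}\, s_G}{\sqrt{s_G^2 + s_H^2 - 2\, r_{GH}\, s_G s_H}}
$$

In particular, rSH approaches unity whenever sG is small compared with sH, regardless of rGH, which is why apparent ΔH versus TΔS correlations can look deceptively good.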

For these data the squared correlation coefficient was 0.93, with a root mean square error of 2.2. The graph of predicted versus actual observed MS(1+4), along with the summary of fit statistics and parameter estimates, is shown in Figure 16.7. [Pg.494]

There are two statistical assumptions made regarding the valid application of mathematical models used to describe data. The first assumption is that row and column effects are additive; it is met by the nature of the study design, since the regression is a series of X, Y pairs distributed through time. The second assumption is that residuals are independent, random variables that are normally distributed about the mean. Based on the literature, the second assumption is typically ignored when researchers apply equations to describe data. Rather, the correlation coefficient (r) is typically used to determine goodness of fit. However, this approach is not valid for determining whether the function or model properly describes the data. [Pg.880]
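
A sketch of checking the second assumption directly, instead of relying on r, is shown below; the data, the Shapiro-Wilk normality test from scipy.stats, and the lag-1 autocorrelation check are illustrative choices, not methods prescribed by the source.

```python
import numpy as np
from scipy import stats

# Hypothetical regression through time: X, Y pairs with random scatter.
x = np.linspace(0, 10, 25)
rng = np.random.default_rng(42)
y = 1.8 * x + 3.0 + rng.normal(scale=0.8, size=x.size)

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

# r alone says little about model adequacy; examine the residuals instead.
r = np.corrcoef(x, y)[0, 1]
shapiro_stat, shapiro_p = stats.shapiro(residuals)        # normality of residuals
lag1 = np.corrcoef(residuals[:-1], residuals[1:])[0, 1]   # crude independence check

print(f"r = {r:.3f}")
print(f"Shapiro-Wilk p = {shapiro_p:.2f} (normality not rejected if p > 0.05)")
print(f"lag-1 residual autocorrelation = {lag1:.2f} (should be near zero)")
```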

A statistical test of these data gave a correlation coefficient of 0.968 (19), with only 0.834 required for 1% significance. Thus the elimination rates are functions dependent... [Pg.186]
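
The critical value quoted (0.834 at the 1% level) comes from the cited table. More generally, the significance of a correlation coefficient can be assessed from the statistic t = r·√(n − 2)/√(1 − r²) with n − 2 degrees of freedom; in the sketch below the sample size n is an assumption made purely for illustration.

```python
import numpy as np
from scipy import stats

def correlation_p_value(r, n):
    """Two-sided p-value for the null hypothesis of zero correlation,
    using t = r * sqrt(n - 2) / sqrt(1 - r**2) with n - 2 degrees of freedom."""
    t = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
    return 2 * stats.t.sf(abs(t), df=n - 2)

# n = 8 is assumed here for illustration only; it is not given in the excerpt.
print(f"p = {correlation_p_value(0.968, n=8):.5f}")
```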

They include simple statistics (e.g., sums, means, standard deviations, coefficient of variation), error analysis terms (e.g., average error, relative error, standard error of estimate), linear regression analysis, and correlation coefficients. [Pg.169]
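
For the less self-explanatory terms in this list, here is a brief sketch using common definitions (coefficient of variation as 100·s/mean, standard error of estimate with n − 2 in the denominator); the software being described may use slightly different conventions.

```python
import numpy as np

data = np.array([4.1, 3.9, 4.4, 4.0, 4.2, 3.8])

mean = data.mean()
std = data.std(ddof=1)                 # sample standard deviation
cv = 100 * std / mean                  # coefficient of variation, %

# Standard error of estimate for a simple linear fit y = a*x + b
x = np.arange(data.size, dtype=float)
a, b = np.polyfit(x, data, 1)
resid = data - (a * x + b)
see = np.sqrt(np.sum(resid ** 2) / (data.size - 2))

r = np.corrcoef(x, data)[0, 1]
print(f"mean = {mean:.2f}, CV = {cv:.1f}%, SEE = {see:.2f}, r = {r:.2f}")
```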


See other pages where Statistics correlation coefficient is mentioned: [Pg.704]    [Pg.494]    [Pg.150]    [Pg.1684]    [Pg.22]    [Pg.83]    [Pg.490]    [Pg.715]    [Pg.285]    [Pg.319]    [Pg.134]    [Pg.439]    [Pg.92]    [Pg.93]    [Pg.127]    [Pg.45]    [Pg.247]    [Pg.331]    [Pg.187]    [Pg.155]    [Pg.53]    [Pg.54]    [Pg.55]    [Pg.374]    [Pg.57]    [Pg.123]    [Pg.154]    [Pg.155]
See also in source #XX -- [Pg.212]







Coefficient correlation

Descriptive statistics correlation coefficient

Statistical correlation

Statistics correlation
