Big Chemical Encyclopedia


Statistical treatments

This section seeks to evaluate quantitatively the relation between the elastic force and elongation. The calculation requires determining the total entropy of the elastomer network as a function of strain. The procedure is divided into two stages: first, the calculation of the entropy of a single chain, and second, the change in entropy of a network as a function of strain. [Pg.93]

Boltzmann's equation, S = k_B ln Ω, relates the entropy to the number of conformations of the chain, Ω. Considering an element of constant volume dV, the number of conformations available to the chain is proportional to the probability per unit volume, p(x,y,z), multiplied by the size of the volume element, dV, Eq. (1.12); therefore, the entropy of the chain is given by the expression [Pg.93]

The work required to move the end of the chain from a distance r to another distance r + dr is equal to the change in Helmholtz energy and is [Pg.94]

Equation (3.15) provides the force f necessary to keep the chain with an end-to-end distance equal to r  [Pg.95]

Under equilibrium conditions, this force is equalled by the equivalent recovering force. When the external force is removed, the recovering force causes r to decrease spontaneously. From Eq. (3.16) it can be concluded that (1) f is proportional to the temperature, so that as T increases the force needed to keep the chain at a certain value of r increases, and (2) the force is linearly elastic, i.e., proportional to r. [Pg.95]
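The argument of the three excerpts above can be reconstructed as a worked derivation; this is the standard Gaussian-chain sketch, and the symbols n (number of links) and l (link length) are my notation, not necessarily that of the source:

```latex
% Gaussian end-to-end distribution of a freely jointed chain
p(x,y,z) \propto \exp\!\left(-\frac{3 r^{2}}{2 n l^{2}}\right),
\qquad r^{2} = x^{2} + y^{2} + z^{2}

% Boltzmann relation with \Omega \propto p(x,y,z)\,dV
S(r) = C - \frac{3 k_{B} r^{2}}{2 n l^{2}}

% Helmholtz energy at constant T
% (internal energy independent of conformation)
A(r) = U - T S(r) = C' + \frac{3 k_{B} T r^{2}}{2 n l^{2}}

% Retractive force: linear in r and proportional to T,
% the two conclusions drawn in the text
f = \left(\frac{\partial A}{\partial r}\right)_{T}
  = \frac{3 k_{B} T}{n l^{2}}\, r
```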

When the underlying distribution is not known, tools such as histograms, probability curves, piecewise polynomial approximations, and general techniques are available to fit distributions to data. It may be necessary to assume an appropriate distribution in order to obtain the relevant parameters. Any assumptions made should be supported by manufacturer's data or data from the literature on similar items working in similar environments. Experience indicates that some probability distributions are more appropriate in certain situations than others. What follows is a brief overview of their applications in different environments. A more rigorous discussion of the statistics involved is provided in the CPQRA Guidelines. [Pg.230]

Exponential distribution The exponential distribution is the simplest component life distribution. It is suited to model chance failures when the rate at which events occur remains constant over time. It is often suitable for the time between failures for repairable equipment. [Pg.230]

Normal distribution The normal distribution is the best known symmetric distribution, and two parameters completely describe the distribution. It often describes dimensions of parts made by automatic processes, natural and physical phenomena, and equipment that has increasing failure rates with time. [Pg.230]

Lognormal distribution Similar to a normal distribution. However, the logarithms of the values of the random variables are normally distributed. Typical applications are metal fatigue, electrical insulation life, time-to-repair data, continuous process (i.e., chemical processes) failure and repair data. [Pg.230]

Weibull distribution This distribution has been useful in a variety of reliability applications. The Weibull distribution is described by three parameters, and it can assume many shapes depending upon the values of the parameters. It can be used to model decreasing, increasing, and constant failure rates. [Pg.230]
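As a sketch of why these distributions map onto failure behavior, the hazard (instantaneous failure) rate can be written in closed form; the two-parameter Weibull is used here for simplicity (the three-parameter form mentioned above adds a location offset):

```python
import math

def exp_hazard(t, lam):
    """Exponential: h(t) = lam, constant in time (chance failures)."""
    return lam

def weibull_hazard(t, beta, eta):
    """Two-parameter Weibull: h(t) = (beta/eta) * (t/eta)**(beta - 1).
    beta < 1: decreasing rate (early failures); beta == 1: constant
    (reduces to the exponential with lam = 1/eta); beta > 1: increasing
    rate (wear-out)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)**beta): probability of surviving past time t."""
    return math.exp(-((t / eta) ** beta))

# With beta = 1 the Weibull collapses to the exponential case:
print(weibull_reliability(500.0, 1.0, 1000.0))  # exp(-0.5) ~ 0.6065
```

This makes the statements in the text concrete: the exponential model is the beta = 1 special case, and the Weibull shape parameter alone switches between decreasing, constant, and increasing failure rates.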

A rubber-like solid is unique in that its physical properties resemble those of solids, liquids, and gases in various respects. It is solidlike in that it maintains dimensional stability, and its elastic response at small strains (<5%) is essentially Hookean. It behaves like a liquid because its coefficient of thermal expansion and isothermal compressibility are of the same order of magnitude as those of liquids. The implication of this is that the intermolecular forces in an elastomer are similar to those in liquids. It resembles gases in the sense that the stress in a deformed elastomer increases with increasing temperature, much as the pressure in a compressed gas increases with increasing temperature. This gas-like behavior was, in fact, what first provided the hint that rubbery stresses are entropic in origin. [Pg.172]

The molecular model for the ideal gas is a collection of point masses in ceaseless, random, thermal motion, the motion of any two of the point masses being completely uncorrelated with one another. The counterpart to this in the case of the ideal elastomer is a collection of volumeless, long, flexible chains. [Pg.172]

To compare the ideal elastomer with the ideal gas on a more quantitative basis, we note from equation (6-3) that [Pg.173]

In the formulation of the statistical theory of rubber elasticity,5-11 the following simplifying assumptions are made: [Pg.174]

The internal energy of the system is independent of the conformations of the individual chains. [Pg.174]


Orr, W. J. C., 1947. Statistical treatment of polymer solutions at infinite dilution. Trans. Faraday Soc. 43, 12-27. [Pg.2665]

The small statistical sample leaves strong fluctuations on the timescale of the nuclear vibrations, which is a behavior typical of any detailed microscopic dynamics used as data for a statistical treatment to obtain macroscopic quantities. [Pg.247]

It is widely used in experimental chemistry, most commonly in statistical treatment of experimental uncertainty (Young, 1962). For convenience, it is common to make the substitution... [Pg.15]

Young, H. D., 1962. Statistical Treatment of Experimental Data. McGraw-Hill, New York. [Pg.338]

The precision of a result is its reproducibility; the accuracy is its nearness to the truth. A systematic error causes a loss of accuracy, and it may or may not impair the precision depending upon whether the error is constant or variable. Random errors cause a lowering of reproducibility, but by making sufficient observations it is possible to overcome the scatter within limits, so that the accuracy may not necessarily be affected. Statistical treatment can properly be applied only to random errors. [Pg.192]
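The distinction can be illustrated numerically. In the hypothetical measurement below (all values invented for illustration), averaging more observations shrinks the random scatter but leaves the constant systematic bias untouched, exactly as the excerpt states:

```python
import random
import statistics

random.seed(42)
TRUE_VALUE = 10.00   # hypothetical true concentration
BIAS = 0.30          # constant systematic error
SIGMA = 0.50         # standard deviation of the random error

def measure():
    """One simulated observation: truth + systematic + random error."""
    return TRUE_VALUE + BIAS + random.gauss(0.0, SIGMA)

small = [measure() for _ in range(5)]
large = [measure() for _ in range(10000)]

# Scatter (precision) improves with n; the bias (accuracy) does not:
print(f"n=5:     mean={statistics.mean(small):.3f}  sd={statistics.stdev(small):.3f}")
print(f"n=10000: mean={statistics.mean(large):.3f}  sd={statistics.stdev(large):.3f}")
# The large-sample mean converges to TRUE_VALUE + BIAS = 10.30, not 10.00.
```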

The raw data collected during the experiment are then analyzed. Frequently the data must be reduced or transformed to a more readily analyzable form. A statistical treatment of the data is used to evaluate the accuracy and precision of the analysis and to validate the procedure. These results are compared with the criteria established during the design of the experiment, and then the design is reconsidered, additional experimental trials are run, or a solution to the problem is proposed. When a solution is proposed, the results are subject to an external evaluation that may result in a new problem and the beginning of a new analytical cycle. [Pg.6]

Lochmuler, C. Atomic Spectroscopy—Determination of Calcium and Magnesium in Sand with a Statistical Treatment of Measurements, published on the web at http://www.chem.duke.edu/~clochmul/exp4/exp4.html. [Pg.225]

An alternative approach for collaborative testing is to have each analyst perform several replicate determinations on a single, common sample. This approach generates a separate data set for each analyst, requiring a different statistical treatment to arrive at estimates for σ_rand and σ_sys. [Pg.693]
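One common way to separate the two contributions is a one-way analysis of variance over the per-analyst data sets: the pooled within-analyst variance estimates the random component, and the spread of the analyst means, corrected for the within contribution, estimates the systematic (analyst-to-analyst) component. This is a hedged sketch with invented data; the symbols and layout are my assumptions about the intended analysis, not a quotation of the cited treatment:

```python
import statistics

# Hypothetical replicate results (same common sample) from three analysts
data = {
    "analyst A": [4.98, 5.02, 5.00, 5.01],
    "analyst B": [5.10, 5.12, 5.09, 5.11],
    "analyst C": [4.90, 4.92, 4.91, 4.89],
}
n = 4  # replicates per analyst

# Within-analyst (random) variance, pooled over analysts
var_rand = statistics.mean(statistics.variance(v) for v in data.values())

# Variance of the analyst means; subtracting var_rand/n removes the random
# contribution, leaving the systematic analyst-to-analyst part
means = [statistics.mean(v) for v in data.values()]
var_sys = max(statistics.variance(means) - var_rand / n, 0.0)

print(f"s_rand = {var_rand ** 0.5:.4f}")
print(f"s_sys  = {var_sys ** 0.5:.4f}")
```

With these numbers the analyst-to-analyst spread dominates the replicate scatter, the situation collaborative testing is designed to detect.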

APEX: full range of statistical treatments for QSAR (MSI). [Pg.169]

The LIMS software is essentially a database for tracking, reporting, and archiving lab data as well as scheduling and guiding lab activities. Graphical and statistical treatment of data for improved process control (qv) as well as preparation of certificates of analysis (COA) for the customer are some of the other features of a comprehensive LIMS package (30). [Pg.368]

Accuracy and precision of the methods were checked by the added-found method and statistical treatment of the data of determinations (RSD ranged from 0.025 to 0.046). [Pg.394]

Certainly these approaches represent a progress in our understanding of the interfacial properties. All the phenomena taken into account, e.g., the coupling with the metal side, the degree of solvation of ions, etc., play a role in the interfacial structure. However, it appears that the theoretical predictions are very sensitive to the details of the interaction potentials between the various species present at the interface and also to the approximations used in the statistical treatment of the model. In what follows we focus on a small number of basic phenomena which, probably, determine the interfacial properties, and we try to use very transparent approximations to estimate the role of these phenomena. [Pg.805]

It should be noted that the data collection and conversion effort is not trivial; it is company- and plant-specific and requires substantial effort and coordination between intracompany groups. No statistical treatment can make up for inaccurate or incomplete raw data. The keys to valid, high-quality data are thoroughness and quality of personnel training; comprehensive procedures for data collection, reduction, handling, and protection (from raw records to final failure rates); and the ability to audit and trace the origins of finished data. Finally, the system must be structured and the data must be coded so that they can be located within a well-designed failure rate taxonomy. When done properly, valuable and uniquely applicable failure rate data and equipment reliability information can be obtained. [Pg.213]

Some of the solvent assignments seem capricious, such as CCl4 in ARP, CF3COOH in AD, and CHCl3 and C6H5NH2 in MISC. From one point of view, it is a triumph that a purely statistical treatment, devoid of chemical experience or intuition, can generate a scheme more or less in accord with chemical concepts. However, it does not seem to be superior to the simple application of chemical ideas. [Pg.399]

We now consider a type of analysis in which the data (which may consist of solvent properties or of solvent effects on rates, equilibria, and spectra) again are expressed as a linear combination of products as in Eq. (8-81), but now the statistical treatment yields estimates of both a_i and x_i. This method is called principal component analysis or factor analysis. A key difference between multiple linear regression analysis and principal component analysis (in the chemical setting) is that regression analysis adopts chemical models a priori, whereas in factor analysis the chemical significance of the factors emerges (if desired) as a result of the analysis. We will not explore the statistical procedure, but will cite some results. We have already encountered examples in Section 8.2 on the classification of solvents and in the present section in the form of the Swain et al. treatment leading to Eq. (8-74). [Pg.445]
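A minimal numerical sketch of principal component analysis, with invented data and plain singular value decomposition rather than any particular chemometrics package: centering the data matrix and decomposing it yields the factors, their loadings, and the fraction of variance each explains.

```python
import numpy as np

# Hypothetical data matrix: rows = solvents, columns = measured properties
X = np.array([
    [0.46, 37.5, 1.85],
    [0.65, 32.7, 1.70],
    [0.10,  2.3, 0.00],
    [0.76, 78.4, 1.85],
    [0.12,  4.8, 1.15],
])

Xc = X - X.mean(axis=0)             # center each property (column)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = U * s                      # solvent coordinates on the factors
loadings = Vt                       # factor composition in property space
explained = s**2 / np.sum(s**2)     # fraction of variance per factor

print("variance explained:", np.round(explained, 3))
# If the first one or two fractions dominate, most of the solvent-effect
# information is carried by one or two underlying factors.
```

This illustrates the point in the text: no chemical model is assumed in advance, and any chemical meaning of the factors is read off the loadings afterwards.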

Soil burial tests are popular despite the precautions that are needed. It is also important that a sufficient number of specimens are exposed so that statistical treatment of the results may be applied to compensate for some of the inevitable variations in the exposure conditions. Certain precautions originally set out in 1937 are still valid, and are as follows ... [Pg.1077]

It is clear that nonconfigurational factors are of great importance in the formation of solid and liquid metal solutions. Leaving aside the problem of magnetic contributions, the vibrational contributions are not understood in such a way that they may be embodied in a statistical treatment of metallic solutions. It would be helpful to have measurements both of ΔCp and Δα (where α is the thermal expansion coefficient) for the solution process as a function of temperature in order to have an idea of the relative importance of changes in the harmonic and the anharmonic terms in the potential energy of the lattice. [Pg.134]

The development of methods for the kinetic measurement of heterogeneous catalytic reactions has enabled workers to obtain rate data of a great number of reactions [for a review, see (1, )]. The use of a statistical treatment of kinetic data and of computers [cf. (3-7) ] renders it possible to estimate objectively the suitability of kinetic models as well as to determine relatively accurate values of the constants of rate equations. Nevertheless, even these improvements allow the interpretation of kinetic results from the point of view of reaction mechanisms only within certain limits ... [Pg.1]

In our study we first investigated separately the kinetics of the hydrogenation of phenol and of the hydrogenation of cyclohexanone (7), and from twenty-six different equations, using statistical treatment of the data, we found the best equations for the initial reaction rates to be... [Pg.32]
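Fitting a candidate rate equation to initial-rate data can be sketched as follows; the synthetic data and the single Langmuir-type form below are illustrative assumptions, not the equations of the cited hydrogenation study:

```python
import numpy as np

def lh_rate(p, k, K):
    """Langmuir-Hinshelwood-type initial rate: r = k*K*p / (1 + K*p)."""
    return k * K * p / (1.0 + K * p)

# Synthetic "measured" initial rates at several partial pressures
k_true, K_true = 2.0, 0.5
p = np.array([0.2, 0.5, 1.0, 2.0, 4.0, 8.0])
r = lh_rate(p, k_true, K_true)

# Linearization: p/r = 1/(k*K) + p/k, so a straight-line fit of p/r
# against p gives slope = 1/k and intercept = 1/(k*K)
slope, intercept = np.polyfit(p, p / r, 1)
k_fit = 1.0 / slope
K_fit = 1.0 / (intercept * k_fit)

print(f"k = {k_fit:.3f}, K = {K_fit:.3f}")  # recovers k = 2.000, K = 0.500
```

In practice each of the candidate equations would be fitted in this way and the statistical quality of fit (residual variance, parameter significance) compared to select the best one, which is the procedure the excerpt describes.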

The simple statistical treatment of radical polymerization can be traced back to Schultz.27 Texts by Flory28 and Bamford et al.29 are useful references. [Pg.240]

By a statistical treatment of the kinetic data, Scheme (8-10) can be shown to fit the experimental results better than all the other mechanisms considered (Maurer et al., 1979). [Pg.172]

The 15N content was indeed lower when the experiment was performed. This result justified the publication of a preliminary communication (Bergstrom et al., 1974). Later work (Hashida et al., 1978; Szele and Zollinger, 1978a; Maurer et al., 1979) involving sophisticated statistical treatments suggested that, in a weakly nucleophilic solvent such as trifluoroethanol, the phenyl cation is formed in two steps and not in one, as in mechanism B (see Scheme 8-4 in Sec. 8.3), the first intermediate being a tight ion-molecule pair. [Pg.217]

Available kinetic data are seldom of sufficiently high quality to warrant the application of high-precision statistical treatment. This point is made forcefully by Churchill [495], who states: "Our ability and inclination to postulate and construct models appear to exceed our ability and inclination to obtain good rate data. Improvement in rate correlations will come primarily from more and better measurements rather than from improvements in modelling or mathematical procedures." [Pg.83]

The value of the product ij has been variously given by different workers; from statistical treatment of the velocities of the molecules, a value of 0.5 will be taken. [Pg.698]

ABSTRACT The statistical treatment of resonating covalent bonds in metals, previously applied to hypoelectronic metals, is extended to hyperelectronic metals and to metals with two kinds of bonds. The theory leads to half-integral values of the valence for hyperelectronic metallic elements. [Pg.407]

A theory of resonating covalent bonds in metals, developed over the period 1938-1953 (1-3), was recently refined by the formulation of a statistical treatment for hypoelectronic metals (4). We have now extended the statistical treatment to include hyperelectronic metals. This extension has resulted not only in the evaluation of the number of resonance structures for these metals but also in the determination for them of the values of the metallic valence, which have been somewhat uncertain. [Pg.407]

The problem of relationship between the activation parameters-the so called isokinetic relationship or compensation law—is of fundamental importance in structural chemistry, organic or inorganic. However, there are few topics in which so many misunderstandings and controversies have arisen as in connection with this problem. A critical review thus seems appropriate at present, in order to help in clarifying ideas and to draw attention to this treatment of kinetic or equilibrium data. The subject has already been reviewed (1-6), but sufficient attention has not been given to the statistical treatment which represents the heaviest problems. In this review, the statistical problems are given the first place. Theoretical corollaries are also dealt with, but no attempt was made to collect all examples from the literature. It is hoped that most of the important... [Pg.413]

Several doubts about the correctness of the usual statistical treatment were expressed already in the older literature (31), and later, attention was called to large experimental errors (142) in ΔH and ΔS and their mutual dependence (143-145). The possibility of an apparent correlation due only to experimental error also was recognized and discussed (1, 2, 4, 6, 115, 116, 119, 146). However, the full danger of an improper statistical treatment was shown only by this reviewer (147) and by Petersen (148). The first correct statistical treatment of a special case followed (149) and provoked a brisk discussion in which Malawski (150, 151), Leffler (152, 153), Palm (3, 154, 155) and others (156-161) took part. Recently, the necessary formulas for a statistical treatment in common cases have been derived (162-164). The heart of the problem lies not in experimental errors, but in the a priori dependence of the correlated quantities, ΔH and ΔS. It is to be stressed in advance that in most cases, the correct statistical treatment has not invalidated the existence of an approximate isokinetic relationship; however, the slopes and especially the correlation coefficients reported previously are almost always wrong. [Pg.419]
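The a priori dependence can be demonstrated with a short simulation (illustrative numbers, not data from the cited studies): since ΔS is computed as (ΔH − ΔG)/T rather than measured independently, any experimental error in ΔH reappears in ΔS, so even a series with essentially constant ΔG shows a strong apparent ΔH-ΔS correlation.

```python
import random

random.seed(7)
T = 300.0  # kelvin

# Hypothetical series: ΔG nearly constant across the compounds, with a
# deliberately large measurement error placed on ΔH (values in kJ/mol)
dG = [50.0 + random.gauss(0.0, 1.0) for _ in range(50)]
dH = [g + random.gauss(0.0, 4.0) for g in dG]     # "measured" ΔH
dS = [(h - g) / T for h, g in zip(dH, dG)]        # derived, not measured

def corr(x, y):
    """Pearson correlation coefficient."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

r = corr(dH, dS)
print(f"apparent dH-dS correlation: {r:.3f}")
# Close to 1 purely because the same error enters both quantities.
```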

The natural and correct form of the isokinetic relationship is eq. (13) or (13a). The plot, ΔH versus ΔG, has slope β/(β − T), from which β is easily obtained. If a statistical treatment is needed, the common regression analysis can usually be recommended, with ΔG (or log K) as the independent and ΔH as the dependent variable, since errors in the former can be neglected. Then the overall fit is estimated by means of the correlation coefficient, and the standard deviation from the regression line reveals whether the correlation is fulfilled within the experimental errors. [Pg.453]
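This recipe can be sketched numerically with invented data: regress ΔH on ΔG, then invert the isokinetic slope relation s = β/(β − T) to get β = sT/(s − 1). The series below is constructed to obey the relationship exactly.

```python
import numpy as np

T = 300.0          # experimental temperature, K
beta_true = 400.0  # assumed isokinetic temperature, K

# Synthetic series obeying the isokinetic relationship exactly:
# the slope of ΔH versus ΔG is β/(β - T)  (= 4.0 for these values)
dG = np.array([48.0, 50.0, 52.0, 54.0, 56.0])   # kJ/mol
s_true = beta_true / (beta_true - T)
dH = 10.0 + s_true * dG

# Regression of ΔH on ΔG, as the excerpt recommends
slope, intercept = np.polyfit(dG, dH, 1)
beta = slope * T / (slope - 1.0)
print(f"slope = {slope:.3f}, beta = {beta:.1f} K")  # beta = 400.0 K
```

With real data the correlation coefficient and the standard deviation about the regression line would then be inspected, as the text describes, before trusting the extracted β.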

Practically all values of β within the experimental interval claimed in the literature (1-5, 115-119, 153) have been shown to be artifacts (148, 149, 163) resulting from improper statistical treatment (see Sec. IV). Petersen thus believed (148) that actually no such value had been reported, and the opinion was offered that the isokinetic temperature probably is not accessible experimentally (149, 188). This view was supported by the existence of negative... [Pg.456]

