Another consideration when using the approach is the assumption that stress and strength are statistically independent; in practical applications, however, this is usually the case (Disney et al., 1968). The random variables in the design are assumed to be independent, linear and near-Normal for the variance equation to be used effectively. High correlation among the random variables, or the use of non-Normal distributions in the stress governing function, are often sources of non-linearity, and transformation methods should be considered. [Pg.191]

Before a probabilistic model can be developed, the variables involved must be determined. It is assumed that the variables all follow the Normal distribution and that they are statistically independent, i.e. not correlated in any way. The scatter of the pre-load, F, using an air tool with a clutch is approximately 30% of the mean, which gives the coefficient of variation, Cv = 0.1, assuming 3σ covers this range, therefore... [Pg.206]
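The arithmetic behind the quoted value can be sketched as follows; this is a minimal check, with the 30% scatter and 3σ coverage taken from the passage above (the function name is ours, not the source's):

```python
def coefficient_of_variation(scatter_fraction, sigma_coverage=3.0):
    """Cv = sigma/mu when scatter_fraction * mu corresponds to sigma_coverage * sigma."""
    # 3*sigma covers 30% of the mean  =>  sigma = 0.30*mu / 3  =>  Cv = sigma/mu = 0.1
    return scatter_fraction / sigma_coverage

cv = coefficient_of_variation(0.30)   # approximately 0.1
```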

Evans, R. A. 1975. Statistical Independence and Common-Mode Failures, IEEE Trans. Rel., R-24, p. 289. [Pg.478]

While static Monte Carlo methods generate a sequence of statistically independent configurations, dynamic MC methods are always based on some stochastic Markov process, where subsequent configurations X′ of the system are generated from the previous configuration X (X → X′ → X″ → ...) with some transition probability W(X → X′). Since to a large extent the choice of the basic move X → X′ is arbitrary, various methods differ in the choice of the basic unit of motion. Also, the choice of transition probability W(X → X′) is not unique; the only requirement is that the principle... [Pg.561]
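A minimal sketch of such a dynamic MC Markov chain, using the Metropolis choice of W(X → X′) (one common choice satisfying detailed balance, not the only one); the target distribution, step size, and seed are illustrative assumptions, not from the source:

```python
import math
import random

def metropolis_chain(n_steps, step=1.0, seed=0):
    """Dynamic MC: generate X -> X' -> X'' -> ... via a stochastic Markov process.

    Target: standard normal density (illustrative).  The Metropolis transition
    probability W(X -> X') = min(1, p(X')/p(X)) satisfies detailed balance.
    """
    rng = random.Random(seed)
    x = 0.0
    chain = []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)      # basic move X -> X' (an arbitrary choice)
        log_ratio = -0.5 * (x_new**2 - x**2)      # log p(X')/p(X) for a standard normal
        if rng.random() < math.exp(min(0.0, log_ratio)):
            x = x_new                             # accept with probability min(1, ratio)
        chain.append(x)                           # successive states are correlated,
    return chain                                  # unlike static (independent) sampling

samples = metropolis_chain(100_000)
```

Note that consecutive entries of `chain` are statistically dependent; only well-separated entries decorrelate, which is exactly the contrast with static MC drawn above.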

Like all other methods for analyzing censored failure data, the hazard plotting method is also based on a certain assumption that must be satisfied if we are going to rely on the results. The assumption is that if the unfailed units were run to failure, their failure times would be statistically independent of their censoring times. In other words, there is no relationship or correlation between the censoring time of a unit and the failure time. For example... [Pg.1049]
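A minimal sketch of the cumulative-hazard calculation that underlies hazard plotting (Nelson's plotting-position method), valid only under the independence assumption just stated; the data set is hypothetical:

```python
def cumulative_hazard(data):
    """Hazard plotting positions for censored failure data.

    `data` is a list of (time, failed) pairs.  Each failure at reverse rank k
    (k units still at risk) contributes a hazard increment 1/k; a censored
    unit consumes a rank but adds no increment.
    """
    ordered = sorted(data)                     # ascending in time
    n = len(ordered)
    points, cum = [], 0.0
    for i, (t, failed) in enumerate(ordered):
        k = n - i                              # reverse rank = units still at risk
        if failed:
            cum += 1.0 / k
            points.append((t, cum))            # (failure time, cumulative hazard)
    return points

# Hypothetical data: three failures (True), two censored (False) units.
pts = cumulative_hazard([(10, True), (20, False), (30, True), (40, True), (50, False)])
```

Plotting these points on hazard paper (or log-transformed axes) and checking for a straight line is the graphical step the text goes on to describe.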

Although it is obviously impossible to enumerate all possible configurations for infinite lattices, so long as the values of far separated sites are statistically independent, the average entropy per site can nonetheless be estimated by a limiting procedure. To this end, we first generalize the definitions for the spatial set and spatial measure entropies given above to their respective block-entropy forms. [Pg.216]

If T → ∞ with B fixed, the contribution by the length-B horizontal segment of the outer lined area in figure 4.15 can be effectively ignored. Noting that the entropy is largest when the 2r vertical columns are statistically independent, we find that... [Pg.223]

Second Derivation of the Boltzmann Equation.—The derivation of the Boltzmann equation given in the first sections of this chapter suffers from the obvious defect that it is in no way connected with the fundamental law of statistical mechanics, i.e., Liouville's equation. As discussed in Section 12.6 of The Mathematics of Physics and Chemistry, 2nd Ed., the behavior of all systems of particles should be compatible with this equation, and, thus, one should be able to derive the Boltzmann equation from it. This has been avoided in the previous derivation by implicitly making statistical assumptions about the behavior of colliding particles: that the number of collisions between particles of velocities v1 and v2 is taken proportional to f(v1)f(v2) implies that there has been no previous relation between the particles (statistical independence) before collision. As noted previously, in a... [Pg.41]

When this relationship holds, X(t + τ) and Y(t + τ′) are said to be statistically independent. Statistical independence will be discussed in detail in Section 3.10. [Pg.142]

Conditional Distribution Functions and Statistical Independence.—The definition of a conditional distribution function is motivated by the following considerations. Suppose that we have been observing a time function X and that we want to obtain a quanti-... [Pg.148]

The discussion at the beginning of this section makes it natural for us to say that the event An in 𝒜n is statistically independent of the event Bm in ℬm if... [Pg.153]

In the light of the preceding discussion, it is tempting to extend the notion of statistical independence by calling a group of three or more events statistically independent if... [Pg.153]

In other words, the fact that a group of n events is statistically independent according to definition (3-176) does not necessarily imply... [Pg.153]

The same motivation that leads us to our definition of statistically independent events also leads us to call a set of n random variables φ1, ..., φn statistically independent if, and only if... [Pg.154]

One important consequence of this definition is that if

Perhaps the most important property of statistically independent random variables is embodied in the following, easily verified, formula that is valid when φ1, ..., φn are statistically independent. [Pg.154]

Equation (3-179) states the quite remarkable result that the expectation of a product of statistically independent random variables is equal to the product of their individual expectations. [Pg.154]

The converse of this statement is not true in general: random variables satisfying (3-179) need not be statistically independent. [Pg.154]
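Both statements can be illustrated numerically; the distributions and seed below are illustrative choices, not from the source. The pair (U, U²) is a standard counterexample: the product-of-expectations identity holds, yet one variable completely determines the other.

```python
import random
import statistics as st

rng = random.Random(42)
N = 200_000

# Independent pair: the expectation of the product equals the product of the
# expectations (Eq. (3-179)).
x = [rng.gauss(1.0, 1.0) for _ in range(N)]
y = [rng.gauss(2.0, 1.0) for _ in range(N)]
e_xy = st.fmean(a * b for a, b in zip(x, y))       # near E[X]E[Y] = 1 * 2 = 2

# Converse fails: U uniform on (-1, 1) and V = U**2 satisfy
# E[UV] = E[U**3] = 0 = E[U]E[V], yet V is completely determined by U.
u = [rng.uniform(-1.0, 1.0) for _ in range(N)]
v = [a * a for a in u]
e_uv = st.fmean(a * b for a, b in zip(u, v))           # near 0, so (3-179) holds
e_u2v = st.fmean(a * a * b for a, b in zip(u, v))      # E[U**2 V] = E[U**4] = 1/5,
e_u2_ev = st.fmean(a * a for a in u) * st.fmean(v)     # but E[U**2]E[V] = 1/9 != 1/5
```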

Sums of Independent Random Variables.—Sums of statistically independent random variables play a very important role in the theory of random processes. The reason for this is twofold: sums of statistically independent random variables turn out to have some rather remarkable mathematical properties; moreover, many physical quantities, such as thermal noise voltages or measurement fluctuations, can usefully be thought of as sums of a large number of small, presumably independent quantities. Accordingly, this section will be devoted to a brief discussion of some of the more important properties of sums of independent random variables. [Pg.155]

Let φ1, φ2, ... denote an (infinite) family of statistically independent random variables and define... [Pg.155]

This equation is valid even if the random variables are not statistically independent. [Pg.155]

The last equation establishes the important result that the variance of a sum of statistically independent random variables is the sum of the variances of the summands. [Pg.156]
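A quick numerical check of the additivity of variances for independent summands; the three distributions and their parameters are illustrative assumptions:

```python
import random
import statistics as st

rng = random.Random(7)
TRIALS = 100_000

# Three independent summands with variances 1, 4, and 1/12 respectively
# (normal, normal, and uniform(0, 1)).
sums = [rng.gauss(0.0, 1.0) + rng.gauss(5.0, 2.0) + rng.uniform(0.0, 1.0)
        for _ in range(TRIALS)]
var_sum = st.pvariance(sums)   # near 1 + 4 + 1/12, about 5.083
```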

Another important result states that the characteristic function of a sum of statistically independent random variables is the product of the characteristic functions of the individual summands. The reader should compare this statement with the deceptively similar sounding one made on page 154, and carefully note the difference between the two. The proof of this statement is a simple calculation... [Pg.156]
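The difference the text emphasizes is that here the characteristic function of the *sum* factors, whereas on page 154 it was the expectation of a *product* of the variables themselves. An empirical check, with illustrative distributions and seed:

```python
import cmath
import random

rng = random.Random(3)
N = 100_000
x = [rng.uniform(-1.0, 1.0) for _ in range(N)]
y = [rng.expovariate(1.0) for _ in range(N)]

def cf(samples, w):
    """Empirical characteristic function E[exp(i*w*X)]."""
    return sum(cmath.exp(1j * w * s) for s in samples) / len(samples)

w = 0.7
lhs = cf([a + b for a, b in zip(x, y)], w)  # characteristic function of the SUM
rhs = cf(x, w) * cf(y, w)                   # PRODUCT of the individual cfs
```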

One consequence of Eq. (3-185) is the important result that the sum of a family of statistically independent, gaussianly distributed random variables is again gaussianly distributed. To show this, let... [Pg.156]

Our next result concerns the central limit theorem, which places in evidence the remarkable behavior of the distribution function of the sum when n is a large number. We shall now state and sketch the proof of a version of the central limit theorem that is pertinent to sums of identically distributed [p0i(x) = p01(x), i = 1, 2, ...], statistically independent random variables. To simplify the statement of the theorem, we shall introduce the normalized sum s_n defined by... [Pg.157]

The Central Limit Theorem.—If φ1, φ2, ... are identically distributed, statistically independent random variables having finite mean and variance, then... [Pg.157]

The central limit theorem thus states the remarkable fact that the distribution function of the normalized sum of identically distributed, statistically independent random variables approaches the gaussian distribution function as the number of summands approaches infinity—... [Pg.157]
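The theorem can be sketched numerically with uniform(0, 1) summands (an illustrative choice); the normalized sum of even thirty of them is already close to gaussian:

```python
import math
import random

rng = random.Random(1)

def normalized_sum(n):
    """s_n = (phi_1 + ... + phi_n - n*mu) / (sigma * sqrt(n)), uniform(0,1) summands."""
    mu, sigma = 0.5, math.sqrt(1.0 / 12.0)   # mean and std of one uniform(0,1) summand
    total = sum(rng.random() for _ in range(n))
    return (total - n * mu) / (sigma * math.sqrt(n))

draws = [normalized_sum(30) for _ in range(50_000)]
within_one_sigma = sum(abs(d) < 1.0 for d in draws) / len(draws)
# For a standard gaussian, P(|s| < 1) is about 0.6827.
```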

Finally, an infinite set of random vectors is defined to be statistically independent if all finite subfamilies are statistically independent. Given an infinite family of identically distributed, statistically independent random vectors having finite means and covariances, we define their normalized sum to be the vector (s_n1, ..., s_nk) where... [Pg.160]

We conclude this section by deriving an important property of jointly gaussian random variables, namely, the fact that a necessary and sufficient condition for a group of jointly gaussian random variables φ1, ..., φn to be statistically independent is that E[φjφk] = E[φj]E[φk] for j ≠ k. Stated in other words, linearly independent (uncorrelated) gaussian random variables are statistically independent. This statement is not necessarily true for non-gaussian random variables. [Pg.161]

The fact that linear independence is a necessary condition for statistical independence is obvious. The sufficiency of the condition can be established by noting that the covariance matrix... [Pg.161]
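The caveat about non-gaussian variables has a classic counterexample: Y = S·X, with X standard normal and S an independent random sign. X and Y are each normally distributed and uncorrelated, but they are not *jointly* gaussian, and they are clearly dependent (|Y| = |X|). A numerical sketch, with illustrative seed and sample size:

```python
import random
import statistics as st

rng = random.Random(11)
N = 200_000

x = [rng.gauss(0.0, 1.0) for _ in range(N)]
y = [rng.choice((-1.0, 1.0)) * xi for xi in x]  # Y = S*X: normal, but not jointly gaussian with X

e_xy = st.fmean(a * b for a, b in zip(x, y))            # E[XY] = E[S]E[X**2] = 0: uncorrelated
e_x2y2 = st.fmean((a * b) ** 2 for a, b in zip(x, y))   # E[X**2 Y**2] = E[X**4] = 3 ...
e_x2_e_y2 = st.fmean(a * a for a in x) * st.fmean(b * b for b in y)  # ... but this is 1
```

The gap between the last two quantities shows X and Y are dependent despite being uncorrelated, so the sufficiency argument really does need joint gaussianity.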

On the other hand, Eq. (3-233) states that A is the sum of two statistically independent, Poisson distributed random variables A1 and A2 with parameters n(t2 − t1) and n(t3 − t2) respectively. Consequently, A must be Poisson distributed with parameter n(t2 − t1) + n(t3 − t2) = n(t3 − t1), which checks our direct calculation. The fact that the most general consistency condition of the type just considered is also met follows in a similar manner from the properties of sums of independent, Poisson distributed random variables. [Pg.167]
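The closure of the Poisson family under independent sums can be checked by simulation; the parameter values and sampler below are illustrative (Knuth's multiplicative method, not anything from the source):

```python
import math
import random
import statistics as st

rng = random.Random(5)
N = 100_000
lam1, lam2 = 1.5, 2.5   # stand-ins for n*(t2 - t1) and n*(t3 - t2)

def poisson(lam):
    """Knuth's method: count uniform draws until their product falls below exp(-lam)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

sums = [poisson(lam1) + poisson(lam2) for _ in range(N)]
# A Poisson variable has mean == variance == its parameter; here lam1 + lam2 = 4.
mean_sum = st.fmean(sums)
var_sum = st.pvariance(sums)
```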

In other words, if we assume that the counting function N(t) has statistically independent increments (Eq. (3-237)), and has the property that the probability of a single jump occurring in a small interval of length h is approximately nh but the probability of more than one jump is zero to within terms of order h (Eq. (3-238)), then it can be shown that its probability density functions must be given by Eq. (3-231). It is the existence of theorems of this type that accounts for the great... [Pg.168]
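A counting function with these properties can be simulated with exponential interarrival times, and the independent-increments assumption checked on disjoint intervals; rate, interval endpoints, and seed are illustrative:

```python
import random
import statistics as st

rng = random.Random(9)
rate = 2.0            # plays the role of n in the text
t1, t2 = 1.0, 2.5
N_PATHS = 50_000

def increments(rate, t1, t2):
    """Counts of jumps of N(t) in (0, t1] and (t1, t2], one path per call."""
    t = rng.expovariate(rate)      # exponential interarrival times
    n1 = n2 = 0
    while t <= t2:
        if t <= t1:
            n1 += 1
        else:
            n2 += 1
        t += rng.expovariate(rate)
    return n1, n2

pairs = [increments(rate, t1, t2) for _ in range(N_PATHS)]
n1s = [a for a, _ in pairs]
n2s = [b for _, b in pairs]
mean1 = st.fmean(n1s)                                      # near rate * t1 = 2.0
mean2 = st.fmean(n2s)                                      # near rate * (t2 - t1) = 3.0
cov12 = st.fmean(a * b for a, b in pairs) - mean1 * mean2  # near 0: independent increments
```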


© 2019 chempedia.info